The Power of a Question: A Case Study of Two Organizational Knowledge Capture Systems
NASA Technical Reports Server (NTRS)
Cooper, Lynn P.
2003-01-01
This document is a presentation on organizational knowledge capture systems delivered at the HICSS-36 conference, held January 6-9, 2003. An exploratory case study of two knowledge resources is offered. Two organizational knowledge capture systems are then briefly described: knowledge transfer from practitioners and the use of questions to represent knowledge. Finally, the creation of a database of peer review questions is suggested as a method of promoting organizational discussion, knowledge representation, and exchange.
Beyond knowledge capture: creating useful work-centric systems
NASA Technical Reports Server (NTRS)
Cooper, L. P.; Majchrzak, A.
2001-01-01
Once knowledge has been successfully captured, the challenge becomes creating an effective way to use it. Two high knowledge content systems developed at the Jet Propulsion Laboratory are presented as examples of work-centric systems, where the primary value to the user is in the content.
Study of Design Knowledge Capture (DKC) schemes implemented in magnetic bearing applications
NASA Technical Reports Server (NTRS)
1990-01-01
A design knowledge capture (DKC) scheme was implemented using frame-based techniques. The objective of such a system is to capture not only the knowledge which describes a design, but also that which explains how the design decisions were reached. These knowledge types were labelled definitive and explanatory, respectively. Examination of the design process helped determine what knowledge to retain and at what stage that knowledge is used. A discussion of frames resulted in the recognition of their value to knowledge representation and organization. The FORMS frame system was used as a basis for further development, and for examples using magnetic bearing design. The specific contributions made by this research include: determination that frame-based systems provide a useful methodology for management and application of design knowledge; definition of specific user interface requirements (a window-based browser); specification of syntax for DKC commands; and demonstration of the feasibility of DKC by application to existing designs. It was determined that design knowledge capture could become an extremely valuable engineering tool for complicated, long-life systems, but that further work was needed, particularly the development of a graphic, window-based interface.
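The definitive/explanatory split described above can be sketched in a few lines. This is an illustrative toy only, assuming dictionary-based frames with inheritance; the frame names and slots are hypothetical and are not taken from the FORMS system the abstract describes.

```python
# A minimal frame-based store for design knowledge, separating
# "definitive" knowledge (what the design is) from "explanatory"
# knowledge (why a decision was made). Illustrative sketch only.

class Frame:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent          # frames can inherit slots
        self.definitive = {}          # facts describing the design
        self.explanatory = {}         # rationale behind design decisions

    def get(self, slot):
        """Look up a definitive slot, falling back to the parent frame."""
        if slot in self.definitive:
            return self.definitive[slot]
        if self.parent is not None:
            return self.parent.get(slot)
        return None

# Hypothetical magnetic-bearing example, in the spirit of the study above.
bearing = Frame("magnetic_bearing")
bearing.definitive["suspension"] = "active magnetic"

thrust = Frame("thrust_bearing", parent=bearing)
thrust.definitive["axial_load_N"] = 500
thrust.explanatory["axial_load_N"] = "sized for worst-case launch loads"

print(thrust.get("suspension"))   # inherited from the parent frame
```

The point of the sketch is that the rationale travels with the design fact in the same frame, so a later engineer querying `axial_load_N` can also retrieve why that value was chosen.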
A knowledge-based system design/information tool
NASA Technical Reports Server (NTRS)
Allen, James G.; Sikora, Scott E.
1990-01-01
The objective of this effort was to develop a Knowledge Capture System (KCS) for the Integrated Test Facility (ITF) at the Dryden Flight Research Facility (DFRF). The DFRF is a NASA Ames Research Center (ARC) facility. This system was used to capture the design and implementation information for NASA's high angle-of-attack research vehicle (HARV), a modified F/A-18A. In particular, the KCS was used to capture specific characteristics of the design of the HARV fly-by-wire (FBW) flight control system (FCS). The KCS utilizes artificial intelligence (AI) knowledge-based system (KBS) technology. The KCS enables the user to capture the following characteristics of automated systems: the system design; the hardware (H/W) design and implementation; the software (S/W) design and implementation; and the utilities (electrical and hydraulic) design and implementation. A generic version of the KCS was developed which can be used to capture the design information for any automated system. The deliverable items for this project consist of the prototype generic KCS and an application, which captures selected design characteristics of the HARV FCS.
Development of a general-purpose, integrated knowledge capture and delivery system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, A.G.; Freer, E.B.
1991-01-01
KATIE (Knowledge-Based Assistant for Troubleshooting Industrial Equipment) was first conceived as a solution for maintenance problems. In the area of process control, maintenance technicians have become responsible for increasingly complicated equipment and an overwhelming amount of associated information. Sophisticated distributed control systems have proven to be such a drastic change for technicians that they are forced to rely on the engineer for troubleshooting guidance. Because it is difficult for a knowledgeable engineer to be readily available for troubleshooting, maintenance personnel wish to capture the information the engineer provides. The solution has two stages. First, a specific complicated system was chosen as a test case, and an effort was made to gather all available system information in some form. Second, a method of capturing and delivering this collection of information was developed. Several features were desired for this knowledge capture/delivery system (KATIE). Creation of the knowledge base needed to be independent of the delivery system. The delivery path needed to be as simple as possible for the technician, while the capture, or authoring, system could provide very sophisticated features. It was decided that KATIE should be as general as possible, not internalizing specifics about the first implementation. The knowledge bases created needed to be completely separate from KATIE and needed to have a modular structure so that each type of information (rules, procedures, manuals, symptoms) could be encapsulated individually.
Interactive Business Development, Capturing Business Knowledge and Practice: A Case Study
ERIC Educational Resources Information Center
McKelvie, Gregor; Dotsika, Fefie; Patrick, Keith
2007-01-01
Purpose: The purpose of this paper is to follow the planning and development of MapaWiki, a Knowledge Management System for Mapa, an independent research company that specialises in competitor benchmarking. Starting with the standard requirements to capture, store and share information and knowledge, a system was sought that would allow growth and…
Case-Based Capture and Reuse of Aerospace Design Rationale
NASA Technical Reports Server (NTRS)
Leake, David B.
2001-01-01
The goal of this project was to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project combined case-based reasoning (CBR) and concept maps (CMaps) to develop methods for capturing, organizing, and interactively accessing records of experiences encapsulating the methods and rationale underlying expert aerospace design, in order to bring the captured knowledge to bear to support future reasoning. The project's results contribute both principles and methods for effective design-aiding systems that aid capture and access of useful design knowledge. The project has been guided by the tenets that design-aiding systems must: (1) Leverage a designer's knowledge, rather than attempting to replace it; (2) Be able to reflect different designers' differing conceptualizations of the design task, and to clarify those conceptualizations to others; (3) Include capabilities to capture information both by interactive knowledge modeling and during normal use; and (4) Integrate into normal designer tasks as naturally and unobtrusively as possible.
U.S. Spacesuit Knowledge Capture
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen
2010-01-01
The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and the human beings who use them, spacesuit technology and usage lend themselves rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of individuals and groups working in the field. The National Aeronautics and Space Administration (NASA), along with other organizations and individuals, has been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. Recently, the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives in which current and retired specialists in the field are videotaped presenting technical material specifically for the education and preservation of knowledge. With video archiving, all these avenues of learning can be brought to life, with the real experts presenting their wealth of knowledge on screen for future learners. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS), and the spacesuit community are now working together to provide a closed-loop spacesuit knowledge capture system that includes specific attention to spacesuit system artifacts as well. A NASM report has recently been created that allows the cross-referencing of history to artifacts and artifacts to history, including spacesuit manufacturing details with current condition and location. NASA has examined spacesuits in the NASM collection for evidence of wear during their operational life. NASA's formal spacesuit knowledge capture efforts now make use of both the NASM spacesuit preservation collection and this report to enhance its efforts to educate NASA personnel and contribute to spacesuit history. Be it archiving of human knowledge or archiving of the actual legacy spacesuit hardware with its rich history, joining the history of spacesuit system artifacts with that of their development and use during past programs will provide a wealth of knowledge that will greatly enhance the chances of success for future and more ambitious spacesuit system programs.
NASA Astrophysics Data System (ADS)
Nieten, Joseph L.; Burke, Roger
1993-03-01
The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data are captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the Shuttle Mission Simulator (SMS) can be used as black box simulations by intelligent computer-aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
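The "generate rules from expert-classified data" idea above can be illustrated with the classic 1R scheme: for each attribute, emit the majority class per attribute value, then keep the single attribute whose rules misclassify the fewest examples. This is a toy stand-in for the SDB's actual (unspecified) learning technique, and the process-monitoring data below are invented.

```python
# Toy inductive rule generation (1R): not the SDB's algorithm, just an
# illustration of deriving rules from examples classified by an expert.
from collections import Counter

def one_r(examples, attributes, label="class"):
    best = None
    for attr in attributes:
        # Tally the class counts seen for each value of this attribute.
        by_value = {}
        for ex in examples:
            by_value.setdefault(ex[attr], Counter())[ex[label]] += 1
        # One rule per value: predict the majority class for that value.
        rules = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(ex[label] != rules[ex[attr]] for ex in examples)
        if best is None or errors < best[2]:
            best = (attr, rules, errors)
    return best  # (attribute, {value: predicted class}, error count)

# Hypothetical expert-classified observations of a monitored system.
data = [
    {"pressure": "high", "temp": "hot",  "class": "alarm"},
    {"pressure": "high", "temp": "cold", "class": "alarm"},
    {"pressure": "low",  "temp": "hot",  "class": "ok"},
    {"pressure": "low",  "temp": "cold", "class": "ok"},
]
attr, rules, errors = one_r(data, ["pressure", "temp"])
print(attr, rules, errors)   # "pressure" classifies these examples perfectly
```

The resulting `{value: class}` table is exactly the kind of rule base a monitoring or advisory system can apply to new observations.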
Hubble Space Telescope Design Engineering Knowledgebase (HSTDEK)
NASA Technical Reports Server (NTRS)
Johannes, James D.; Everetts, Clark
1989-01-01
The research covered here pays specific attention to the development of tools to assist knowledge engineers in acquiring knowledge, and to assist other technical, engineering, and management personnel in automatically performing knowledge capture as part of their everyday work without adding any extra work to what they already do. Requirements for data products, the knowledge base, and methods for mapping knowledge in the documents onto the knowledge representations are discussed, as are some of the difficulties of capturing in the knowledge base the structure of the design process itself, along with a model of the system designed. The capture of knowledge describing the interactions of different components is also discussed briefly.
A model to capture and manage tacit knowledge using a multiagent system
NASA Astrophysics Data System (ADS)
Paolino, Lilyam; Paggi, Horacio; Alonso, Fernando; López, Genoveva
2014-10-01
This article presents a model to capture and register business tacit knowledge from different sources, using an expert multiagent system that enables the entry of incidences and captures the tacit knowledge that could resolve them. This knowledge and its sources are evaluated through trust algorithms, and the best of them are registered in the database. Through its intelligent software agents, the system interacts with the administrator, the users, the knowledge sources, and any communities of practice that might exist in the business. Both the sources and the knowledge are evaluated continuously, before registration and afterwards, to decide whether their original weighting should be retained or modified. If better knowledge becomes available, the new knowledge is registered in place of the old. This work is part of an ongoing investigation into knowledge management methodologies for managing tacit business knowledge, aimed at facilitating business competitiveness and supporting innovation and learning.
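The continuous source-evaluation idea can be sketched as follows. Everything here is an illustrative assumption, not the paper's actual algorithm: the exponential trust update, the replace-only-if-more-trusted policy, and all names are hypothetical.

```python
# Hypothetical sketch: knowledge sources carry trust weights updated from
# feedback, and a captured fix is replaced only by a more trusted candidate.

def update_trust(trust, feedback_ok, rate=0.2):
    """Nudge a [0, 1] trust weight toward 1 on success, toward 0 on failure."""
    target = 1.0 if feedback_ok else 0.0
    return trust + rate * (target - trust)

class TacitStore:
    def __init__(self):
        self.entries = {}  # incidence -> (fix, trust of its source)

    def register(self, incidence, fix, source_trust):
        current = self.entries.get(incidence)
        # Keep the existing fix unless the new source is more trusted.
        if current is None or source_trust > current[1]:
            self.entries[incidence] = (fix, source_trust)

store = TacitStore()
store.register("pump overheats", "clean intake filter", 0.6)
store.register("pump overheats", "replace seal", 0.4)   # ignored: less trusted
print(store.entries["pump overheats"][0])               # clean intake filter
```

The weighting then evolves: `update_trust(0.5, True)` moves a source's trust to 0.6, so repeated good feedback lets its contributions eventually displace older entries.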
US Spacesuit Knowledge Capture
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen
2011-01-01
The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and the human beings who use them, spacesuit technology and usage lend themselves rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), along with other organizations and individuals, has been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently, the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives in which current and retired specialists in the field are videotaped presenting technical material specifically for the education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life, with the real experts presenting their wealth of knowledge on screen for future learners. The scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS), and the spacesuit community are now working together to provide a comprehensive closed-loop spacesuit knowledge capture system which includes
TARGET: Rapid Capture of Process Knowledge
NASA Technical Reports Server (NTRS)
Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.
1993-01-01
TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, designed to support this type of knowledge capture undertaking. This paper describes the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers, and managers are discussed, including the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generating production rules for incorporation in and development of a CLIPS-based expert system is elaborated. TARGET also permits experts to visually describe procedural tasks, providing a common medium for knowledge refinement by the expert community and the knowledge engineer and making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET.
A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.
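The step from a captured procedure to production rules can be sketched mechanically: each step fires when its predecessor has completed. The CLIPS-like rule template below is a plausible illustration, not TARGET's actual output format, and the procedure names are invented.

```python
# Illustrative only: turn an ordered procedure into CLIPS-style defrules,
# each step firing on completion of the previous one. Not TARGET's real
# output format.

def rules_from_procedure(name, steps):
    rules = []
    for i, step in enumerate(steps):
        condition = "(start)" if i == 0 else f"(completed {steps[i - 1]})"
        rules.append(
            f"(defrule {name}-{i}\n"
            f"   {condition}\n"
            f"   =>\n"
            f"   (assert (completed {step})))"
        )
    return rules

# Hypothetical procedural task captured from an expert.
generated = rules_from_procedure(
    "power-up", ["check-breakers", "enable-bus", "verify-voltage"])
for rule in generated:
    print(rule)
```

A chain like this makes the procedural ordering explicit in the rule base, which is what lets an expert-system shell replay or monitor the task.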
Knowledge Capture and Management for Space Flight Systems
NASA Technical Reports Server (NTRS)
Goodman, John L.
2005-01-01
The incorporation of knowledge capture and knowledge management strategies early in the development phase of an exploration program is necessary for safe and successful missions of human and robotic exploration vehicles over the life of a program. Following the transition from the development to the flight phase, loss of the underlying theory and rationale governing design and requirements occurs through a number of mechanisms. This degrades the quality of engineering work, resulting in increased life cycle costs and risk to mission success and safety of flight. Due to budget constraints, concerned personnel in legacy programs often have to improvise methods for knowledge capture and management using existing, but often sub-optimal, information technology and archival resources. Application of advanced information technology to perform knowledge capture and management would be most effective if program-wide requirements are defined at the beginning of a program.
Techniques for capturing expert knowledge - An expert systems/hypertext approach
NASA Technical Reports Server (NTRS)
Lafferty, Larry; Taylor, Greg; Schumann, Robin; Evans, Randy; Koller, Albert M., Jr.
1990-01-01
The knowledge-acquisition strategy developed for the Explosive Hazards Classification (EHC) Expert System is described, in which expert systems and hypertext are combined, and broad applications are proposed. The EHC expert system is based on rapid prototyping in which primary knowledge acquisition from experts is not emphasized; the explosive hazards technical bulletin, technical guidance, and minimal interviewing are used to develop the knowledge-based system. Hypertext is used to capture the technical information with respect to four issues: procedural, materials, test, and classification. The hypertext display allows the integration of multiple knowledge representations such as clarifications or opinions, and thereby allows the performance of a broad range of tasks on a single machine. Among other recommendations, it is suggested that the integration of hypertext and expert systems makes the resulting synergistic system highly efficient.
Making Sense of Rocket Science - Building NASA's Knowledge Management Program
NASA Technical Reports Server (NTRS)
Holm, Jeanne
2002-01-01
The National Aeronautics and Space Administration (NASA) has launched a range of KM activities-from deploying intelligent "know-bots" across millions of electronic sources to ensuring tacit knowledge is transferred across generations. The strategy and implementation focuses on managing NASA's wealth of explicit knowledge, enabling remote collaboration for international teams, and enhancing capture of the key knowledge of the workforce. An in-depth view of the work being done at the Jet Propulsion Laboratory (JPL) shows the integration of academic studies and practical applications to architect, develop, and deploy KM systems in the areas of document management, electronic archives, information lifecycles, authoring environments, enterprise information portals, search engines, experts directories, collaborative tools, and in-process decision capture. These systems, together, comprise JPL's architecture to capture, organize, store, and distribute key learnings for the U.S. exploration of space.
NASA Technical Reports Server (NTRS)
Nieten, Joseph; Burke, Roger
1993-01-01
Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
On the acquisition and representation of procedural knowledge
NASA Technical Reports Server (NTRS)
Saito, T.; Ortiz, C.; Loftin, R. B.
1992-01-01
Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, and the focus then turns to knowledge acquisition for procedural tasks, with special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge, TARGET (Task Analysis and Rule Generation Tool), is described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulties, and costs of developing knowledge-based systems for the performance of procedural tasks.
1989-02-01
…which capture the knowledge of such experts. These Expert Systems, or Knowledge-Based Systems, differ from the usual computer programming techniques… Their applications in the fields of structural design and welding are reviewed. Expert Systems, or KBES, are computer programs using AI… not procedurally constructed as conventional computer programs usually are; the knowledge base of such systems is executable, unlike databases…
A Survey of Knowledge Management Research & Development at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Clancy, Daniel (Technical Monitor)
2002-01-01
This chapter catalogs knowledge management research and development activities at NASA Ames Research Center as of April 2002. A general categorization scheme for knowledge management systems is first introduced. This categorization scheme divides knowledge management capabilities into five broad categories: knowledge capture, knowledge preservation, knowledge augmentation, knowledge dissemination, and knowledge infrastructure. Each of nearly 30 knowledge management systems developed at Ames is then classified according to this system. Finally, a capsule description of each system is presented along with information on deployment status, funding sources, contact information, and both published and internet-based references.
Information Technology Management Strategies to Implement Knowledge Management Systems
ERIC Educational Resources Information Center
McGee, Mary Jane Christy
2017-01-01
More than 38% of the U.S. public workforce will likely retire by 2030, which may result in a labor shortage. Business leaders may adopt strategies to mitigate knowledge loss within their organizations by capturing knowledge in a knowledge management system (KMS). The purpose of this single case study was to explore strategies that information…
1992-05-01
…methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering… and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be… evolve toward an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key…
MSFC Propulsion Systems Department Knowledge Management Project
NASA Technical Reports Server (NTRS)
Caraccioli, Paul A.
2007-01-01
This slide presentation reviews the Knowledge Management (KM) project of the Propulsion Systems Department at Marshall Space Flight Center. KM is needed to support knowledge capture and preservation and to foster an information-sharing culture. The presentation includes the strategic plan for the KM initiative, the system requirements, the technology description, the user interface and custom features, and a search demonstration.
ERIC Educational Resources Information Center
Zeng, Qingtian; Zhao, Zhongying; Liang, Yongquan
2009-01-01
User's knowledge requirement acquisition and analysis are very important for a personalized or user-adaptive learning system. Two approaches to capture user's knowledge requirement about course content within an e-learning system are proposed and implemented in this paper. The first approach is based on the historical data accumulated by an…
1995-09-01
…vital processes of a business. Process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems… knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to… an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high-quality systems…
The elements of design knowledge capture
NASA Technical Reports Server (NTRS)
Freeman, Michael S.
1988-01-01
This paper will present the basic constituents of a design knowledge capture effort. This will include a discussion of the types of knowledge to be captured in such an effort and the difference between design knowledge capture and more traditional knowledge base construction. These differences include both knowledge base structure and knowledge acquisition approach. The motivation for establishing a design knowledge capture effort as an integral part of major NASA programs will be outlined, along with the current NASA position on that subject. Finally the approach taken in design knowledge capture for Space Station will be contrasted with that used in the HSTDEK project.
Capturing flight system test engineering expertise: Lessons learned
NASA Technical Reports Server (NTRS)
Woerner, Irene Wong
1991-01-01
Within a few years, JPL will be challenged by the most active mission set in history. Concurrently, flight systems are increasingly more complex. Presently, the knowledge to conduct integration and test of spacecraft and large instruments is held by a few key people, each with many years of experience. JPL is in danger of losing a significant amount of this critical expertise, through retirement, during a period when demand for this expertise is rapidly increasing. The most critical issue at hand is to collect and retain this expertise and develop tools that would ensure the ability to successfully perform the integration and test of future spacecraft and large instruments. The proposed solution was to capture and codify a subset of existing knowledge, and to utilize this captured expertise in knowledge-based systems. First-year results and activities planned for the second year of this ongoing effort are described. Topics discussed include lessons learned in knowledge acquisition and elicitation techniques, life-cycle paradigms, and rapid prototyping of a knowledge-based advisor (Spacecraft Test Assistant) and a hypermedia browser (Test Engineering Browser). The prototype Spacecraft Test Assistant supports a subset of integration and test activities for flight systems. Test Engineering Browser is a hypermedia tool that allows users easy perusal of spacecraft test topics. A knowledge acquisition tool called ConceptFinder, which was developed to search through large volumes of data for related concepts, is also described; it has been modified to semi-automate the process of creating hypertext links.
Space Shuttle Guidance, Navigation, and Rendezvous Knowledge Capture Reports. Revision 1
NASA Technical Reports Server (NTRS)
Goodman, John L.
2011-01-01
This document is a catalog and reader's guide to lessons learned, experience, and technical history reports, as well as compilation volumes prepared by United Space Alliance personnel for the NASA/Johnson Space Center (JSC) Flight Dynamics Division. It is intended to make it easier for future generations of engineers to locate knowledge capture documentation from the Shuttle Program. The first chapter covers observations on documentation quality and research challenges encountered during the Space Shuttle and Orion programs. The second chapter covers the knowledge capture approach used to create many of the reports covered in this document. These chapters are intended to provide future flight programs with insight that could be used to formulate knowledge capture and management strategies. The following chapters contain descriptions of each knowledge capture report. The majority of the reports concern the Space Shuttle; three were written in support of the Orion Program. Most of the reports were written from 2001 to 2011. Lessons learned reports concern primarily the shuttle Global Positioning System (GPS) upgrade and the knowledge capture process. Experience reports on navigation and rendezvous provide examples of how challenges were overcome and how best practices were identified and applied. Some reports are of a more technical history nature, covering navigation and rendezvous; they provide an overview of mission activities and the evolution of operations concepts and trajectory design. The lessons learned, experience, and history reports would be considered secondary sources by historians and archivists.
Yu, Kun; Mitch, William A; Dai, Ning
2017-10-17
Amine-based absorption is the primary contender for postcombustion CO2 capture from fossil fuel-fired power plants. However, significant concerns have arisen regarding the formation and emission of toxic nitrosamine and nitramine byproducts from amine-based systems. This paper reviews the current knowledge regarding these byproducts in CO2 capture systems. In the absorber, flue gas NOx drives nitrosamine and nitramine formation after its dissolution into the amine solvent. The reaction mechanisms are reviewed based on CO2 capture literature as well as biological and atmospheric chemistry studies. In the desorber, nitrosamines are formed under high temperatures by amines reacting with nitrite (a hydrolysis product of NOx), but they can also thermally decompose following pseudo-first-order kinetics. The effects of amine structure, primarily amine order, on nitrosamine formation and the corresponding mechanisms are discussed. Washwater units, although intended to control emissions from the absorber, can contribute to additional nitrosamine formation when accumulated amines react with residual NOx. Nitramines are much less studied than nitrosamines in CO2 capture systems. Mitigation strategies based on the reaction mechanisms in each unit of the CO2 capture systems are reviewed. Lastly, we highlight research needs in clarifying reaction mechanisms, developing analytical methods for both liquid and gas phases, and integrating different units to quantitatively predict the accumulation and emission of nitrosamines and nitramines.
Knowledge represented using RDF semantic network in the concept of semantic web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lukasova, A., E-mail: alena.lukasova@osu.cz; Vajgl, M., E-mail: marek.vajgl@osu.cz; Zacek, M., E-mail: martin.zacek@osu.cz
The RDF(S) model has been declared the basic model for capturing knowledge on the semantic web. It provides a common and flexible way to decompose composed knowledge into elementary statements, which can be represented by RDF triples or by RDF graph vectors. From the logical point of view, elements of knowledge can be expressed using at most binary predicates, which can be converted to RDF triples or graph vectors. However, the model is not able to capture implicit knowledge representable by logical formulas. This contribution shows how existing approaches (semantic networks and clausal form logic) can be combined with RDF to obtain an RDF-compatible system with the ability to represent implicit knowledge and to perform inference over a knowledge base.
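The decomposition of a composed statement into elementary RDF-style triples can be pictured with a minimal plain-Python sketch. The URIs and statements below are invented for illustration (a real system would use an RDF library and proper namespaces):

```python
# A composed statement such as "RDF(S), a data model, is the basis of
# the semantic web proposed by the W3C" decomposes into elementary
# statements, each with an at-most-binary predicate, i.e. one triple.
triples = set()

def add(subject, predicate, obj):
    """Record one elementary statement as an RDF-style triple."""
    triples.add((subject, predicate, obj))

add("ex:RDFS", "rdf:type", "ex:DataModel")
add("ex:RDFS", "ex:basisOf", "ex:SemanticWeb")
add("ex:SemanticWeb", "ex:proposedBy", "ex:W3C")

# A triple can equally be read as a graph vector: an edge labelled
# `predicate` from node `subject` to node `obj`.
def neighbours(node):
    return {(p, o) for (s, p, o) in triples if s == node}

print(neighbours("ex:RDFS"))
```

What this flat store cannot do, and what the contribution above addresses, is represent implicit knowledge such as "every data model proposed by the W3C is a standard"; that requires rules (clausal form logic) layered on top of the triples.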
The Spelling Sensitivity Score: Noting Developmental Changes in Spelling Knowledge
ERIC Educational Resources Information Center
Masterson, Julie J.; Apel, Kenn
2010-01-01
Spelling is a language skill supported by several linguistic knowledge sources, including phonemic, orthographic, and morphological knowledge. Typically, however, spelling assessment procedures do not capture the development and use of these linguistic knowledge sources. The purpose of this article is to describe a new assessment system, the…
Software support environment design knowledge capture
NASA Technical Reports Server (NTRS)
Dollman, Tom
1990-01-01
The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.
ERIC Educational Resources Information Center
Flor, Alexander Gonzalez
2013-01-01
The paper is based on the challenges encountered by the researcher while conducting a study titled "Design, Development and Testing of an Indigenous Knowledge Management System Using Mobile Device Video Capture and Web 2.0 Protocols." During the conduct of the study the researcher observed a marked reluctance from organized indigenous…
U.S. Spacesuit Knowledge Capture Status and Initiatives
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen
2011-01-01
The National Aeronautics and Space Administration (NASA), other organizations, and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration, via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives in which current and retired specialists are videotaped, either presenting technical scope specifically for education and the preservation of knowledge or being interviewed to archive their significance to NASA's history. With video archiving, all these avenues of learning are brought to life, with the real experts presenting their wealth of knowledge on screen for future learners. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; aspects of program management; and personal interviews. These archives of spacesuit legacy reflect its rich history and will provide a wealth of knowledge that will greatly enhance the chances of success for future, more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts are reviewed and a status is provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge capture program. A detailed itemization of the actual archives is provided, along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian are discussed.
U.S. Spacesuit Knowledge Capture Status and Initiatives
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen
2012-01-01
The National Aeronautics and Space Administration (NASA), other organizations, and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration, via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives in which current and retired specialists are videotaped, either presenting technical scope specifically for education and the preservation of knowledge or being interviewed to archive their significance to NASA's history. With video archiving, all these avenues of learning are brought to life, with the real experts presenting their wealth of knowledge on screen for future learners. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; aspects of program management; and personal interviews. These archives of spacesuit legacy reflect its rich history and will provide a wealth of knowledge that will greatly enhance the chances of success for future, more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts are reviewed and a status is provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge capture program. A detailed itemization of the actual archives is provided, along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian are discussed.
Knowledge-based requirements analysis for automating software development
NASA Technical Reports Server (NTRS)
Markosian, Lawrence Z.
1988-01-01
We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
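The buffer-generation step can be caricatured with a toy sketch of transformational synthesis: a declarative spec selects and parameterizes an implementation template. The spec format and code below are invented for illustration and are far simpler than the prototype described:

```python
from collections import deque

def synthesize_buffer(spec):
    """Derive a buffer implementation from a declarative requirement.

    `spec` is a hypothetical requirements record, e.g.
    {"capacity": 4, "overwrite": True} meaning a ring buffer that
    drops the oldest item, versus one that rejects writes when full.
    """
    capacity = spec["capacity"]

    class Buffer:
        def __init__(self):
            # Transformational choice: deque(maxlen=...) implements
            # the overwrite-oldest policy; a plain deque plus a guard
            # implements the reject-when-full policy.
            self._q = deque(maxlen=capacity) if spec["overwrite"] else deque()

        def put(self, item):
            if not spec["overwrite"] and len(self._q) >= capacity:
                raise OverflowError("buffer full")
            self._q.append(item)

        def get(self):
            return self._q.popleft()

    return Buffer

Ring = synthesize_buffer({"capacity": 2, "overwrite": True})
b = Ring()
for x in (1, 2, 3):
    b.put(x)
print(b.get())  # oldest surviving item -> 2
```

The point of the paradigm is that the human supplies only the spec (and, once, the domain knowledge encoded in the transformation); the detailed implementation is derived mechanically.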
Conservation of design knowledge [of large complex spaceborne systems]
NASA Technical Reports Server (NTRS)
Sivard, Cecilia; Zweben, Monte; Cannon, David; Lakin, Fred; Leifer, Larry
1989-01-01
This paper presents an approach for acquiring knowledge about a design during the design process. The objective is to increase the efficiency of the lifecycle management of a space-borne system by providing operational models of the system's structure and behavior, as well as the design rationale, to human and automated operators. A design knowledge acquisition system is under development that compares how two alternative design versions meet the system requirements as a means for automatically capturing rationale for design changes.
Design of customer knowledge management system for Aglaonema Nursery in South Tangerang, Indonesia
NASA Astrophysics Data System (ADS)
Sugiarto, D.; Mardianto, I.; Dewayana, TS; Khadafi, M.
2017-12-01
The purpose of this paper is to describe the design of a customer knowledge management system to support customer relationship management activities for an aglaonema nursery in South Tangerang, Indonesia. The steps were knowledge identification (knowledge about customers, knowledge from customers, knowledge for customers), knowledge capture, codification, analysis of system requirements, and creation of use case and activity diagrams. The results showed that some key knowledge concerned supporting customers in plant care (know-how) and the types of aglaonema, including their prices (know-what). This knowledge for customers was then codified and shared on a knowledge portal website integrated with social media. Knowledge about customers concerned the customers and their behavior in purchasing aglaonema. Knowledge from customers comprised feedback, favorites, and customer experience. Codified knowledge was placed and shared using a content management system based on WordPress.
Engineering design knowledge recycling in near-real-time
NASA Technical Reports Server (NTRS)
Leifer, Larry; Baya, Vinod; Toye, George; Baudin, Catherine; Underwood, Jody Gevins
1994-01-01
It is hypothesized that the capture and reuse of machine-readable design records is cost beneficial. This informal engineering notebook design knowledge can be used to model the artifact and the design process. Design rationale is, in part, preserved and available for examination. Redesign cycle time is significantly reduced (Baya et al., 1992). These factors contribute to making it less costly to capture and reuse knowledge than to recreate comparable knowledge (current practice). To test the hypothesis, we focused on validation of the concept and tools in two 'real design' projects this past year: (1) a short (8-month) turnaround project for NASA life science bioreactor researchers, done by a team of three mechanical engineering graduate students at Stanford University (in a class, ME210abc 'Mechatronic Systems Design and Methodology', taught by one of the authors, Leifer); and (2) a long-range (8- to 20-year) international consortium project for NASA's Space Science program (STEP: satellite test of the equivalence principle). Design knowledge capture was supported this year by assigning the use of a Team-Design PowerBook. Design records were cataloged in near-real time. These records were used to qualitatively model the artifact design as it evolved. Dedal, an 'intelligent librarian' developed at NASA-ARC, was used to navigate and retrieve captured knowledge for reuse.
A Design Rationale Capture Tool to Support Design Verification and Re-use
NASA Technical Reports Server (NTRS)
Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.
2012-01-01
A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.
Theory and ontology for sharing temporal knowledge
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah
1996-01-01
Using current technology, the sharing or re-use of knowledge bases is very difficult, if not impossible. ARPA has correctly recognized the problem and funded a knowledge sharing initiative. One of the outcomes of this project is a formal language called Knowledge Interchange Format (KIF) for representing knowledge that can be translated into other languages. Capturing and representing design knowledge, and reasoning with it, have become very important for NASA, a pioneer of innovative design of unique products. For upgrading an existing design to meet changing technology, needs, or requirements, it is essential to understand the design rationale, design choices, options, and other relevant information associated with the design. Capturing such information and presenting it in the appropriate form are part of NASA's ongoing Design Knowledge Capture project. The behavior of an object, and various other aspects related to time, are captured by the appropriate temporal knowledge. The captured design knowledge will be represented in such a way that the various groups at NASA interested in different aspects of the design cycle can access and use it effectively. To facilitate knowledge sharing among these groups, one has to develop a well-defined ontology. An ontology is a specification of a conceptualization. In the literature, several specific domains have been studied and well-defined ontologies developed for them. However, very little work has been done on representing temporal knowledge to facilitate sharing. During the ASEE summer program, I investigated several temporal models and proposed a theory of time that is flexible enough to accommodate time elements such as points and intervals, and that is capable of handling qualitative and quantitative temporal constraints. I also proposed a primitive temporal ontology from which other relevant temporal ontologies can be built.
I have investigated various issues of sharing knowledge and have proposed a formal framework for modeling the concept of knowledge sharing. This work may be implemented and tested in the software environment supplied by Knowledge Based Systems, Inc.
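A small fragment of the kind of qualitative temporal knowledge discussed above can be sketched as follows. This toy models only a few Allen-style relations between intervals and is not the proposed theory itself, which also covers points and quantitative constraints:

```python
# Qualitative relations between time intervals, represented here as
# (start, end) pairs with start < end. Only a fragment of the full
# relation set is shown, for illustration.
def relation(a, b):
    if a[1] < b[0]:
        return "before"          # a ends strictly before b starts
    if a[1] == b[0]:
        return "meets"           # a ends exactly where b starts
    if a[0] < b[0] < a[1] < b[1]:
        return "overlaps"        # a starts first, they share time
    if a == b:
        return "equal"
    return "other"               # remaining relations omitted here

print(relation((1, 3), (3, 5)))  # -> meets
```

A shared temporal ontology would fix the meaning of such relations (and of points vs. intervals) so that different groups' knowledge bases can exchange statements like "test T meets review R" unambiguously.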
Knowledge-based nursing diagnosis
NASA Astrophysics Data System (ADS)
Roy, Claudette; Hay, D. Robert
1991-03-01
Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnosis under the time constraints of modern nursing can benefit from computer assistance. A knowledge-based engineering approach was developed to address these problems. A number of the problems addressed during system design to make the system practical extended beyond the capture of knowledge. The issues involved in implementing a professional knowledge base in a clinical setting are discussed, including system functions, structure, interfaces, the health care environment, and terminology and taxonomy. An integrated system concept from assessment through intervention and evaluation is outlined.
Marketing practitioner’s tacit knowledge acquisition using Repertory Grid Technique (RGT)
NASA Astrophysics Data System (ADS)
Azmi, Afdhal; Adriman, Ramzi
2018-05-01
The tacit knowledge of expert marketing practitioners is an excellent and priceless resource. It brings their experience, skills, ideas, belief systems, insights, and speculation into management decision-making. This expertise consists of individual intuitive judgments and personal shortcuts for completing work efficiently. The tacit knowledge of expert marketing practitioners offers some of the best solutions to problems in marketing strategy, environmental analysis, product management, and partner relationships. This paper proposes a method for acquiring tacit knowledge from marketing practitioners using the Repertory Grid Technique (RGT). The RGT is implemented as a software application for tacit knowledge acquisition, providing a systematic approach to capturing and eliciting constructs from an individual. The results show that an understanding of RGT enabled the TKE and MPE to obtain good results in capturing and acquiring the tacit knowledge of expert marketing practitioners.
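A repertory grid itself is easy to picture as a small matrix of elements rated against bipolar constructs. The sketch below uses invented marketing examples (not data from the paper) and one simple similarity analysis of the kind grid tools typically offer:

```python
# Minimal repertory-grid sketch: elements (marketing situations) rated
# 1-5 against bipolar constructs elicited from an expert. All data are
# illustrative assumptions.
elements = ["Launch campaign", "Price cut", "Loyalty program"]
constructs = [("short-term", "long-term"), ("low risk", "high risk")]

# ratings[c][e]: position of element e on construct c's 1-5 scale
ratings = [
    [2, 1, 5],   # short-term ............ long-term
    [3, 4, 2],   # low risk .............. high risk
]

# One common analysis step: find the pair of elements the expert
# implicitly judges most similar (smallest rating distance).
def distance(i, j):
    return sum(abs(row[i] - row[j]) for row in ratings)

pairs = [(distance(i, j), elements[i], elements[j])
         for i in range(len(elements)) for j in range(i + 1, len(elements))]
print(min(pairs))  # -> (2, 'Launch campaign', 'Price cut')
```

The grid makes tacit judgments explicit: the expert never states "a launch campaign and a price cut are alike", but the ratings reveal it, which is the essence of RGT-based knowledge capture.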
A case of malignant hyperthermia captured by an anesthesia information management system.
Maile, Michael D; Patel, Rajesh A; Blum, James M; Tremper, Kevin K
2011-04-01
Many cases of malignant hyperthermia triggered by volatile anesthetic agents have been described. However, to our knowledge, there has not been a report describing the precise changes in physiologic data of a human suffering from this process. Here we describe a case of malignant hyperthermia in which monitoring information was frequently and accurately captured by an anesthesia information management system.
Assessment of the NASA Space Shuttle Program's Problem Reporting and Corrective Action System
NASA Technical Reports Server (NTRS)
Korsmeyer, D. J.; Schreiner, J. A.; Norvig, Peter (Technical Monitor)
2001-01-01
This paper documents the general findings and recommendations of the Design for Safety Program's study of the Space Shuttle Program's (SSP) Problem Reporting and Corrective Action (PRACA) System. The goals of this study were to evaluate and quantify the technical aspects of the SSP's PRACA systems, and to recommend enhancements addressing specific deficiencies in preparation for future system upgrades. The study determined that the extant SSP PRACA systems accomplish project-level support through the use of a large pool of domain experts and a variety of distributed formal and informal database systems. This operational model is vulnerable to staff turnover and to the loss of the vast corporate knowledge that is not currently captured by the PRACA system. A need for a program-level PRACA system providing improved insight, unification, knowledge capture, and collaborative tools was defined in this study.
The Environmental Control and Life Support System (ECLSS) advanced automation project
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.; Carnes, Ray
1990-01-01
The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system, an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars that will be necessary to achieve this goal. From this analysis, prototype software is being developed and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and life-cycle tools. Automated knowledge acquisition, engineering, verification, and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.
ERIC Educational Resources Information Center
Cress, Ulrike; Held, Christoph; Kimmerle, Joachim
2013-01-01
Tag clouds generated in social tagging systems can capture the collective knowledge of communities. Using as a basis spreading activation theories, information foraging theory, and the co-evolution model of cognitive and social systems, we present here a model for an "extended information scent," which proposes that both collective and individual…
Failure of communication and capture: The perils of temporary unipolar pacing system.
Sahinoglu, Efe; Wool, Thomas J; Wool, Kenneth J
2015-06-01
We present the case of a patient with pacemaker dependence secondary to complete heart block who developed loss of capture of her temporary pacemaker. The patient developed torsades de pointes and then ventricular fibrillation, requiring CPR and external cardioversion. After the patient was stabilized, it was noticed that the loss of capture corresponded with nursing care, when the pulse generator was lifted off the patient's chest wall, and that the patient's temporary pacing system had been programmed to unipolar mode without the knowledge of the attending cardiologist. This case highlights the importance of communication in ensuring that all caregivers are aware of the mode of a temporary pacing system.
U.S. Spacesuit Knowledge Capture Series Catalog
NASA Technical Reports Server (NTRS)
Bitterly, Rose; Oliva, Vladenka
2012-01-01
The National Aeronautics and Space Administration (NASA) and other organizations have been performing U.S. Spacesuit Knowledge Capture (USSKC) since the beginning of space exploration through published reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. The close physical interaction between spacesuit systems and human beings makes them among the most personally evocative pieces of space hardware. Consequently, spacesuit systems have required nearly constant engineering refinements to do their jobs without impinging on human activity. Since 2008, spacesuit knowledge capture has occurred through video recording, engaging both current and former specialists presenting technical scope specifically to educate individuals and preserve knowledge. These archives of spacesuit legacy reflect its rich history and will provide knowledge that will enhance the chances for the success of future and more ambitious spacesuit system programs. The scope and topics of USSKC have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle Programs; the process of hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. USSKC activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive way to organize and archive intra-agency information related to the development of spacesuit systems. These video recordings are currently being reviewed for public release using NASA export control processes. 
After a decision is made for either public or non-public release (internal NASA only), the videos and presentations will be available through the NASA Johnson Space Center Engineering Directorate (EA) Engineering Academy, the NASA Technical Reports Server (NTRS), the NASA Aeronautics & Space Database (NA&SD), or NASA YouTube. Event availability is duly noted in this catalog.
Structure and properties of visible-light absorbing homodisperse nanoparticle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benedict, Jason
Broadly, the scientific progress from this award focused on two main areas: developing time-resolved X-ray diffraction methods, and the synthesis and characterization of molecular systems relevant to solar energy harvesting. Knowledge of photo-induced non-equilibrium states is central to our understanding of processes involved in solar-energy capture. More specifically, knowledge of the geometry changes on excitation, and their relation to lifetimes and variation with adsorption of chromophores on the substrates, is of importance for the design of molecular devices used in light capture.
Automated knowledge base development from CAD/CAE databases
NASA Technical Reports Server (NTRS)
Wright, R. Glenn; Blanchard, Mary
1988-01-01
Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.
Designing Albaha Internet of Farming Architecture
NASA Astrophysics Data System (ADS)
Alahmadi, A.
2017-04-01
Up to now, most farmers in Albaha, Saudi Arabia still practice traditional methods, which are not optimized in terms of water usage, quality of product, etc. At the same time, ICT is becoming a key driver of innovation in farming. In this project, we propose a smart Internet-of-Farming system to assist farmers in Albaha in optimizing their farm productivity by providing accurate, timely information on when to harvest, fertilize, water, and perform other activities related to farming/agriculture technology. The proposed system utilizes a wireless sensor cloud to remotely capture important data such as temperature, humidity, and soil condition (moisture, water level), which are then sent to storage servers in the Albaha University cloud. An adaptive knowledge engine processes the captured data into knowledge, and the farmers can retrieve that knowledge using their smartphones via the Internet.
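The "adaptive knowledge engine" step can be sketched as a simple rule layer over captured sensor readings. The thresholds, field names, and rules below are illustrative assumptions, not details from the paper:

```python
# Hedged sketch: turn one captured sensor reading into farmer-facing
# advice. A real engine would adapt rules per crop, season, and field.
def advise(reading):
    advice = []
    if reading["soil_moisture"] < 30:    # percent; illustrative threshold
        advice.append("irrigate")
    if reading["temperature"] > 38:      # Celsius; illustrative threshold
        advice.append("provide shade / delay fertilizing")
    if not advice:
        advice.append("no action needed")
    return advice

# A reading as it might arrive from the sensor cloud (hypothetical schema)
sample = {"farm": "albaha-01", "temperature": 41, "soil_moisture": 22}
print(advise(sample))  # -> ['irrigate', 'provide shade / delay fertilizing']
```

The knowledge-for-farmer output would then be pushed to the smartphone app described above, closing the loop from sensor capture to actionable knowledge.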
Managing clinical failure: a complex adaptive system perspective.
Matthews, Jean I; Thomas, Paul T
2007-01-01
The purpose of this article is to explore the knowledge capture process at the clinical level. It aims to identify factors that enable or constrain learning. The study applies complex adaptive system thinking principles to reconcile learning within the NHS. The paper uses a qualitative exploratory study with an interpretative methodological stance, set in a secondary care NHS Trust. Semi-structured interviews were conducted with healthcare practitioners and managers involved in both strategic and operational risk management processes. A network structure is revealed that exhibits the communication and interdependent working practices needed to support knowledge capture and adaptive learning. Collaborative multidisciplinary communities, whose values reflect local priorities and promote open dialogue and reflection, are featured. The main concern is that the characteristics of bureaucracy (rational-legal authority, a rule-based culture, hierarchical lines of communication, and a centralised governance focus) are hindering clinical learning by generating barriers. Locally emergent collaborative processes are a key strategic resource for capturing knowledge, potentially fostering an environment that could learn from failure and translate lessons between contexts. What must be addressed is that reporting mechanisms serve not only the governance objectives but also supplement learning by highlighting potential lessons in context. Managers must nurture a collaborative infrastructure, using networks in a co-evolutionary manner. Their role is not to direct and design processes but to influence, support, and create effective knowledge capture. Although the study investigated only one site, the findings and conclusions, including the risk of not enabling a learning environment at the clinical level, may well translate to other trusts.
NASA Technical Reports Server (NTRS)
Johnson, Teresa A.
2006-01-01
Knowledge Management is a proactive pursuit for the future success of any large organization faced with the imminent possibility that its senior managers/engineers, with their gained experience and lessons learned, plan to retire in the near term. Safety and Mission Assurance (S&MA) is proactively pursuing unique mechanisms to ensure that knowledge is retained and lessons learned are captured and documented. Knowledge capture events, activities, and management help to provide a gateway between future retirees and our next generation of managers/engineers. S&MA hosted two knowledge capture events during 2005, featuring three of its retiring fellows (Axel Larsen, Dave Whittle, and Gary Johnson). The first knowledge capture event, on February 24, 2005, focused on two Safety and Mission Assurance safety panels (the Space Shuttle System Safety Review Panel (SSRP) and the Payload Safety Review Panel (PSRP)); the latter event, on December 15, 2005, featured lessons learned during Apollo, Skylab, and the Space Shuttle that could be applicable to the newly created Crew Exploration Vehicle (CEV)/Constellation development program. Gemini, Apollo, Skylab, and the Space Shuttle promised and delivered exciting human advances in space and benefits of space in people's everyday lives on Earth. Johnson Space Center's Safety & Mission Assurance team's work over the last 20 years has been mostly focused on operations; we are now beginning the Exploration development program. S&MA will promote an atmosphere of knowledge sharing in its formal and informal cultures and work processes, and reward the open dissemination and sharing of information; we are asking, "Why embrace relearning the lessons learned in the past?" In the Exploration program the focus will be on Design, Development, Test, and Evaluation (DDT&E); therefore, it is critical to understand the lessons from these past programs during the DDT&E phase.
Protecting and Promoting Indigenous Knowledge: Environmental Adult Education and Organic Agriculture
ERIC Educational Resources Information Center
Sumner, Jennifer
2008-01-01
Given today's pressing environmental issues, environmental adult educators can help us learn to live more sustainably. One of the models for more sustainable ways of life is organic agriculture, based in a knowledge system that works with nature, not against it. In order to understand this knowledge, we need to frame it in a way that captures all…
Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report
1995-06-01
technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge ...resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that
USDA-ARS?s Scientific Manuscript database
Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium...
Automation of the Environmental Control and Life Support System
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.; Carnes, J. Ray
1990-01-01
The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective is capturing ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: 1) analyze and document the baselined ECLS system; 2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline; and 3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary in achieving minimal manpower support for ECLSS operations. In addition, Phase I included development of an advanced software life cycle and of the testing tools that will be used in developing the software, in preparation for Phases II and III, the development and integration phases, respectively. Automated knowledge acquisition, engineering, and verification can capture ECLSS development knowledge for future use, support development of more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.
Design knowledge capture for the space station
NASA Technical Reports Server (NTRS)
Crouse, K. R.; Wechsler, D. B.
1987-01-01
The benefits of design knowledge availability are identifiable and pervasive. The implementation of design knowledge capture and storage using current technology increases the probability of success, while providing for a degree of access compatibility with future applications. The space station design definition should be expanded to include design knowledge, and that design knowledge should be captured. A critical timing relationship exists between the space station development program and the implementation of this project.
NASA Astrophysics Data System (ADS)
Wright, Willie E.
2003-05-01
As Military Medical Information Assurance organizations face off with modern pressures to downsize and outsource, they battle with losing knowledgeable people who leave and take with them what they know. This knowledge is increasingly being recognized as an important resource, and organizations are now taking steps to manage it. In addition, as the pressures for globalization (Castells, 1998) increase, collaboration and cooperation are becoming more distributed and international. Knowledge sharing in a distributed international environment is becoming an essential part of Knowledge Management, and it is a major shortfall in the current approach to capturing and sharing knowledge in Military Medical Information Assurance. This paper addresses this challenge by exploring the Risk Information Management Resource (RIMR) as a tool for sharing knowledge using the concept of Communities of Practice. RIMR is based on a framework of sharing and using knowledge, realized through three major components: people, process, and technology. The people aspect enables remote collaboration, supports communities of practice, and rewards and recognizes knowledge sharing while encouraging storytelling. The process aspect enhances knowledge capture and manages information. The technology aspect enhances system integration and data mining, utilizes intelligent agents, and exploits expert systems. These components, coupled with the supporting activities of education and training, technology infrastructure, and information security, enable effective information assurance collaboration.
EDNA: Expert fault digraph analysis using CLIPS
NASA Technical Reports Server (NTRS)
Dixit, Vishweshwar V.
1990-01-01
Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate, and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find. Available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques. The tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is a viable approach to blend the advantages of both representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge. An expert system, to capture the engineering knowledge, is essential. We propose an approach here, namely expert network analysis, which combines the digraph representation and tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge. Mixed analysis, with some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With this combined approach, it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.
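The digraph-to-tree transformation this abstract alludes to rests on making the cyclic model acyclic so that fast tree-style algorithms can run. A minimal sketch of one standard way to do this is to condense each directed cycle (strongly connected component) into a single node; the fault graph and node names below are invented for illustration and are not from the EDNA tool:

```python
from collections import defaultdict

def strongly_connected_components(graph):
    """Tarjan's algorithm: return the SCCs of a digraph as a list of sets."""
    index, lowlink, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def visit(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                visit(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:          # v is the root of an SCC
            scc = set()
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.add(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in graph:
        if v not in index:
            visit(v)
    return sccs

def condense(graph):
    """Collapse each cycle (SCC) into one node, yielding an acyclic digraph."""
    comp_of = {v: frozenset(s) for s in strongly_connected_components(graph) for v in s}
    dag = defaultdict(set)
    for v, succs in graph.items():
        for w in succs:
            if comp_of[v] != comp_of[w]:
                dag[comp_of[v]].add(comp_of[w])
    return dict(dag)

# Hypothetical fault digraph with a feedback cycle between B and C
faults = {"A": ["B"], "B": ["C"], "C": ["B", "D"], "D": []}
dag = condense(faults)
```

After condensation, the node {B, C} stands for a feedback loop that must be analyzed as a unit, and the remaining acyclic structure is amenable to tree-style analysis.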
Competent Systems: Effective, Efficient, Deliverable.
ERIC Educational Resources Information Center
Abramson, Bruce
Recent developments in artificial intelligence and decision analysis suggest reassessing the approaches commonly taken to the design of knowledge-based systems. Competent systems are based on models known as influence diagrams, which graphically capture a domain's basic objects and their interrelationships. Among the benefits offered by influence…
AI in medicine on its way from knowledge-intensive to data-intensive systems.
Horn, W
2001-08-01
The last 20 years of research and development in the field of artificial intelligence in medicine (AIM) show a path from knowledge-intensive systems, which try to capture the essential knowledge of experts in a knowledge-based system, to the data-intensive systems available today. Nowadays, enormous amounts of information are accessible electronically, and large datasets are collected by continuously monitoring the physiological parameters of patients. Knowledge-based systems are needed to make use of all these available data and to help us cope with the information explosion. In addition, temporal data analysis and intelligent information visualization can help us get a summarized view of the change over time of clinical parameters. Integrating AIM modules into the daily-routine software environment of our care providers gives us a great chance for maintaining and improving quality of care.
Consistent visualizations of changing knowledge
Tipney, Hannah J.; Schuyler, Ronald P.; Hunter, Lawrence
2009-01-01
Networks are increasingly used in biology to represent complex data in uncomplicated symbolic form. However, as biological knowledge is continually evolving, so must the networks representing this knowledge. Capturing and presenting this type of knowledge change over time is particularly challenging due to the intimate manner in which researchers customize the networks they come into contact with. The effective visualization of this knowledge is important as it creates insight into complex systems and stimulates hypothesis generation and biological discovery. Here we highlight how the retention of user customizations and the collection and visualization of knowledge-associated provenance support effective and productive network exploration. We also present an extension of the Hanalyzer system, ReOrient, which supports network exploration and analysis in the presence of knowledge change. PMID:21347184
NASA Technical Reports Server (NTRS)
Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)
2001-01-01
This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles might have been prevented or lessened through the use of better knowledge management and information management techniques.
User Acceptance of Mobile Knowledge Management Learning System: Design and Analysis
ERIC Educational Resources Information Center
Chen, Hong-Ren; Huang, Hui-Ling
2010-01-01
Thanks to advanced developments in wireless technology, learners can now utilize digital learning websites at anytime and anywhere. Mobile learning captures more and more attention in the wave of digital learning. Evolving use of knowledge management plays an important role to enhance problem solving skills. Recently, innovative approaches for…
A Working Framework for Enabling International Science Data System Interoperability
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.
2016-07-01
For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
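One concrete way to read the abstract's statement that knowledge-base contents are "translated and written to files in suitable formats to configure system software and services ... [and] validate input" is as ontology-derived validation rules. The sketch below uses an invented attribute dictionary; the attribute names and constraints are hypothetical, not taken from any actual discipline model:

```python
# Hypothetical slice of a discipline ontology: each attribute carries the
# constraints that stewards defined at the common/discipline level.
ONTOLOGY = {
    "target_name":  {"type": str,   "required": True},
    "exposure_sec": {"type": float, "required": True, "min": 0.0},
    "filter_id":    {"type": str,   "required": False},
}

def validate(record, ontology=ONTOLOGY):
    """Check one metadata record against ontology-derived rules."""
    errors = []
    for attr, rule in ontology.items():
        if attr not in record:
            if rule.get("required"):
                errors.append(f"missing required attribute: {attr}")
            continue
        value = record[attr]
        if not isinstance(value, rule["type"]):
            errors.append(f"{attr}: expected {rule['type'].__name__}")
        elif "min" in rule and value < rule["min"]:
            errors.append(f"{attr}: below minimum {rule['min']}")
    return errors

# A record violating one ontology constraint
errs = validate({"target_name": "Mars", "exposure_sec": -1.0})
```

Because the rules live in the ontology rather than in the validator, stewards at each governance level can change constraints without touching the system software, which is the multi-level governance property the framework claims.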
FUTURE APPLICATIONS OF EXPERT SYSTEMS FOR THE EVALUATION OF ENERGY RESOURCES.
Miller, Betty M.
1988-01-01
The loss of professional experience and expertise in the domain of the earth sciences may prove to be one of the most serious outcomes of the boom-and-bust cyclic nature of the volatile energy and mining industries. Promising new applications of powerful computer systems, known as 'expert systems' or 'knowledge-based systems', are predicted for use in the earth sciences. These systems have the potential capability to capture and preserve the invaluable knowledge bases essential to the evaluation of US energy and mineral resources.
FUTURE APPLICATIONS OF EXPERT SYSTEMS FOR THE EVALUATION OF ENERGY RESOURCES.
Miller, B.M.
1987-01-01
The loss of professional experience and expertise in the domain of the earth sciences may prove to be one of the most serious outcomes of the boom-and-bust cyclic nature of the volatile energy and mining industries. Promising new applications of powerful computer systems, known as 'expert systems' or 'knowledge-based systems', are predicted for use in the earth sciences. These systems have the potential capability to capture and preserve the invaluable knowledge bases essential to the evaluation of the Nation's energy and mineral resources.
Virtual Tissues and Developmental Systems Biology (book chapter)
Virtual tissue (VT) models provide an in silico environment to simulate cross-scale properties in specific tissues or organs based on knowledge of the underlying biological networks. These integrative models capture the fundamental interactions in a biological system and enable ...
Knowledge Acquisition and Management for the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Nemani, R. R.
2013-12-01
NASA Earth Exchange (NEX) is a data, computing, and knowledge collaboratory that houses NASA satellite, climate, and ancillary data, where a focused community can come together to share modeling and analysis codes, scientific results, knowledge, and expertise on a centralized platform with access to large supercomputing resources. As more and more projects are executed on NEX, we are increasingly focusing on capturing the knowledge of NEX users and providing mechanisms for sharing it with the community in order to facilitate reuse and accelerate research. There are many possible knowledge contributions to NEX: a wiki entry on the NEX portal contributed by a developer, information extracted from a publication in an automated way, or a workflow captured during code execution on the supercomputing platform. The goal of the NEX knowledge platform is to capture and organize this information and make it easily accessible to the NEX community and beyond. The knowledge acquisition process consists of three main facets: data and metadata, workflows and processes, and web-based information. Once the knowledge is acquired, it is processed in a number of ways, ranging from custom metadata parsers to entity extraction using natural language processing techniques. The processed information is linked with existing taxonomies and aligned with an internal ontology (which heavily reuses a number of external ontologies). This forms a knowledge graph that can then be used to improve users' search query results as well as provide additional analytics capabilities to the NEX system. Such a knowledge graph will be an important building block in creating a dynamic knowledge base for the NEX community where knowledge is both generated and easily shared.
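The pipeline this abstract describes, extracted entities linked into a graph that then improves search, can be illustrated with a toy triple store. The dataset names and predicates below are hypothetical, not actual NEX identifiers:

```python
from collections import defaultdict

class KnowledgeGraph:
    """Minimal subject-predicate-object store with a one-hop query."""
    def __init__(self):
        self.triples = set()
        self.by_subject = defaultdict(set)

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))
        self.by_subject[subject].add((predicate, obj))

    def related(self, subject, predicate):
        """Objects reachable from `subject` via `predicate`."""
        return {o for p, o in self.by_subject[subject] if p == predicate}

kg = KnowledgeGraph()
# Hypothetical facts harvested from metadata, wiki pages, and workflows
kg.add("MODIS-NDVI", "measures", "vegetation_index")
kg.add("Landsat-NDVI", "measures", "vegetation_index")
kg.add("MODIS-NDVI", "usedByWorkflow", "drought-monitor-v2")

# Search improvement: find other datasets measuring the same quantity
quantity = kg.related("MODIS-NDVI", "measures")
siblings = {s for (s, p, o) in kg.triples
            if p == "measures" and o in quantity and s != "MODIS-NDVI"}
```

A production system would back this with an ontology and a real triple store, but the principle is the same: once contributions from wikis, publications, and workflows land as linked facts, "related dataset" queries fall out of simple graph traversal.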
Space Telecommunications Radio System (STRS) Application Repository Design and Analysis
NASA Technical Reports Server (NTRS)
Handler, Louis M.
2013-01-01
The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document describes the submission of artifacts to the STRS application repository, informs potential users of those artifacts, and helps the systems engineer understand the requirements, concepts, and approach behind the repository. The STRS application repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project so that when the project ends, the knowledge is retained. The document describes the transmission of technology from mission to mission, capturing lessons learned that are used for continuous improvement across projects and supporting NASA Procedural Requirements (NPRs) for performing software engineering projects and NASA's release process.
Methods, Knowledge Support, and Experimental Tools for Modeling
2006-10-01
open source software entities: the PostgreSQL relational database management system (http://www.postgres.org), the Apache web server (http...past. The revision control system allows the program to capture disagreements, and allows users to explore the history of such disagreements by
Knowledge Preservation for Design of Rocket Systems
NASA Technical Reports Server (NTRS)
Moreman, Douglas
2002-01-01
An engineer at NASA Lewis RC presented a challenge to us at Southern University. Our response to that challenge, stated circa 1993, has evolved into the Knowledge Preservation Project which is here reported. The stated problem was to capture some of the knowledge of retiring NASA engineers and make it useful to younger engineers via computers. We evolved that initial challenge to this: design a system of tools such that, with this system, people might efficiently capture and make available, via commonplace computers, deep knowledge of retiring NASA engineers. In the process of proving some of the concepts of this system, we would (and did) capture knowledge from some specific engineers and, so, meet the original challenge along the way to meeting the new. Some of the specific knowledge acquired, particularly that on the RL-10 engine, was directly relevant to the design of rocket engines. We considered and rejected some of the techniques popular in the days we began, specifically "expert systems" and "oral histories". We judged that these old methods had too high a cost per sentence preserved; that cost could be measured in hours of labor of a "knowledge professional". We did spend, particularly in the grant preceding this one, some time creating a couple of "concept maps", one of the latest ideas of the day, but judged this also to be costly in the time of a specially trained knowledge professional. We reasoned that the cost in specialized labor could be lowered if less time were spent being selective about sentences from the engineers and in crafting replacements for those sentences. The trade-off would seem to be that our set of sentences would be less dense in information, but we found a computer-based way around this seeming defect.
Our plan, details of which we have been carrying out, was to find methods of extracting information from experts which would be capable of gaining cooperation, and interest, of senior engineers and using their time in a way they would find worthy (and, so, they would give more of their time and recruit time of other engineers as well). We studied these four ways of creating text: 1) the old way, via interviews and discussions - one of our team working with one expert, 2) a group-discussion led by one of the experts themselves and on a topic which inspires interaction of the experts, 3) a spoken dissertation by one expert practiced in giving talks, 4) expropriating, and modifying for our system, some existing reports (such as "oral histories" from the Smithsonian Institution).
Extending TOPS: Ontology-driven Anomaly Detection and Analysis System
NASA Astrophysics Data System (ADS)
Votava, P.; Nemani, R. R.; Michaelis, A.
2010-12-01
Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health, and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models, and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities, and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies, which are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early in the processing, either through failure to verify them from other sources or by matching them directly with other observable events, without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using a Sesame server and is accessible through both a Java API and web services using the SeRQL and SPARQL query languages. Inference is provided using the OWLIM component integrated with Sesame.
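The early-elimination step the abstract describes, dropping anomalies that no compatible second data source confirms, can be sketched as a simple filter over ontology query results. The compatibility table and dataset names below are invented for illustration and are not drawn from the TOPS ontology:

```python
# Hypothetical compatibility relations that an ontology query would return
COMPATIBLE = {
    "MODIS-LST": {"AIRS-LST", "Station-Temp"},
    "MODIS-NDVI": {"Landsat-NDVI"},
}

def verify_anomalies(anomalies, observed):
    """Keep an anomaly only if a compatible dataset shows an anomaly in
    the same region; unverified anomalies are dropped early."""
    verified = []
    for dataset, region in anomalies:
        partners = COMPATIBLE.get(dataset, set())
        if any((p, region) in observed for p in partners):
            verified.append((dataset, region))
    return verified

# Candidate anomalies, and anomalies independently seen in other datasets
anomalies = [("MODIS-LST", "CA"), ("MODIS-NDVI", "TX")]
observed = {("AIRS-LST", "CA")}
kept = verify_anomalies(anomalies, observed)
```

In the real system the `COMPATIBLE` lookup would be a SPARQL query against the Sesame store rather than a hard-coded dictionary, but the scaling argument is the same: anomalies with no corroborating source never reach the expensive analysis stage.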
Learning about Ecological Systems by Constructing Qualitative Models with DynaLearn
ERIC Educational Resources Information Center
Leiba, Moshe; Zuzovsky, Ruth; Mioduser, David; Benayahu, Yehuda; Nachmias, Rafi
2012-01-01
A qualitative model of a system is an abstraction that captures ordinal knowledge and predicts the set of qualitatively possible behaviours of the system, given a qualitative description of its structure and initial state. This paper examines an innovative approach to science education using an interactive learning environment that supports…
The importance of knowledge-based technology.
Cipriano, Pamela F
2012-01-01
Nurse executives are responsible for a workforce that can provide safer and more efficient care in a complex sociotechnical environment. National quality priorities rely on technologies to provide data collection, share information, and leverage analytic capabilities to interpret findings and inform approaches to care that will achieve better outcomes. As a key steward for quality, the nurse executive exercises leadership to provide the infrastructure to build and manage nursing knowledge and instill accountability for following evidence-based practices. These actions contribute to a learning health system where new knowledge is captured as a by-product of care delivery enabled by knowledge-based electronic systems. The learning health system also relies on rigorous scientific evidence embedded into practice at the point of care. The nurse executive optimizes use of knowledge-based technologies, integrated throughout the organization, that have the capacity to help transform health care.
Factors shaping the evolution of electronic documentation systems
NASA Technical Reports Server (NTRS)
Dede, Christopher J.; Sullivan, Tim R.; Scace, Jacque R.
1990-01-01
The main goal is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System is emerging when the problem is focused on how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information intensive environments.
Adaptive versus Learner Control in a Multiple Intelligence Learning Environment
ERIC Educational Resources Information Center
Kelly, Declan
2008-01-01
Within the field of technology enhanced learning, adaptive educational systems offer an advanced form of learning environment that attempts to meet the needs of different students. Such systems capture and represent, for each student, various characteristics such as knowledge and traits in an individual learner model. Subsequently, using the…
A diagnostic expert system for aircraft generator control unit (GCU)
NASA Astrophysics Data System (ADS)
Ho, Ting-Long; Bayles, Robert A.; Havlicsek, Bruce L.
The modular VSCF (variable-speed constant-frequency) generator families are described as using standard modules to reduce maintenance cost and to improve the product's testability. A general diagnostic expert system shell that guides troubleshooting of modules or line replaceable units (LRUs) is introduced, and an application of the diagnostic system to a particular LRU, the generator control unit (GCU), is reported. The approach to building the diagnostic expert system is first to capture a general diagnostic strategy in an expert system shell. This shell can be easily applied to different devices or LRUs by writing rules that capture only the additional device-specific diagnostic information from expert repair personnel. The diagnostic system has the necessary knowledge embedded in its programs and exhibits the expertise to troubleshoot the GCU.
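The two-layer design described here, a generic troubleshooting shell plus device-specific rules written from expert knowledge, can be sketched as a tiny forward-chaining loop. The symptoms and rules below are invented, not taken from the actual GCU application:

```python
def diagnose(facts, rules):
    """Generic shell: forward-chain until no rule adds a new fact.
    Each rule is (set_of_conditions, conclusion)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and conditions <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Device-specific layer: hypothetical GCU-style rules from repair experts
GCU_RULES = [
    ({"no_output_voltage", "breaker_closed"}, "suspect_regulator_module"),
    ({"suspect_regulator_module", "regulator_tests_ok"}, "suspect_control_board"),
]

result = diagnose({"no_output_voltage", "breaker_closed"}, GCU_RULES)
```

The shell (`diagnose`) never changes; retargeting it to another LRU means writing a new rule list, which is the reuse property the abstract claims for the CLIPS-style approach.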
The Study on Collaborative Manufacturing Platform Based on Agent
NASA Astrophysics Data System (ADS)
Zhang, Xiao-yan; Qu, Zheng-geng
To address the trend toward knowledge-intensive collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities that agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval, and reuse of manufacturing knowledge, the generalized knowledge repository, based on an ontology library, enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.
Evaluation of a Gait Assessment Module Using 3D Motion Capture Technology
Baskwill, Amanda J.; Belli, Patricia; Kelleher, Leila
2017-01-01
Background Gait analysis is the study of human locomotion. In massage therapy, this observation is part of an assessment process that informs treatment planning. Massage therapy students must apply the theory of gait assessment to simulated patients. At Humber College, the gait assessment module traditionally consists of a textbook reading and a three-hour, in-class session in which students perform gait assessment on each other. In 2015, Humber College acquired a three-dimensional motion capture system. Purpose The purpose was to evaluate the use of 3D motion capture in a gait assessment module compared to the traditional gait assessment module. Participants Semester 2 massage therapy students who were enrolled in Massage Theory 2 (n = 38). Research Design Quasi-experimental, wait-list comparison study. Intervention The intervention group participated in an in-class session with a Qualisys motion capture system. Main Outcome Measure(s) The outcomes included knowledge and application of gait assessment theory as measured by quizzes, and students’ satisfaction as measured through a questionnaire. Results There were no statistically significant differences in baseline and post-module knowledge between both groups (pre-module: p = .46; post-module: p = .63). There was also no difference between groups on the final application question (p = .13). The intervention group enjoyed the in-class session because they could visualize the content, whereas the comparison group enjoyed the interactivity of the session. The intervention group recommended adding the assessment of gait on their classmates to their experience. Both groups noted more time was needed for the gait assessment module. Conclusions Based on the results of this study, it is recommended that the gait assessment module combine both the traditional in-class session and the 3D motion capture system. PMID:28293329
How do we Remain Us in a Time of Change: Culture and Knowledge Management at NASA
NASA Technical Reports Server (NTRS)
Linde, Charlotte
2003-01-01
This viewgraph representation presents an overview of findings of a NASA agency-wide Knowledge Management Team considering culture and knowledge management issues at the agency. Specific issues identified by the team include: (1) NASA must move from being a knowledge hoarding culture to a knowledge sharing culture; (2) NASA must move from being center focused to being Agency focused; (3) NASA must capture the knowledge of a departing workforce. Topics considered include: what must NASA know to remain NASA, what were previous forms of knowledge reproduction and how has technological innovations changed these systems, and what changes in funding and relationships between contractors and NASA affected knowledge reproduction.
Hilimire, Matthew R; Corballis, Paul M
2014-01-01
Objects compete for representation in our limited capacity visual system. We examined how this competition is influenced by top-down knowledge using event-related potentials. Competition was manipulated by presenting visual search arrays in which the target or distractor was the only color singleton compared to displays in which both singletons were presented. Experiments 1 and 2 manipulated whether the observer knew the color of the target in advance. Experiment 3 ruled out low-level sensory explanations. Results show that, under conditions of competition, the distractor does not elicit an N2pc when the target color is known. However, the N2pc elicited by the target is reduced in the presence of a distractor. These findings suggest that top-down knowledge can prevent the capture of attention by distracting information, but this prior knowledge does not eliminate the competitive influence of the distractor on the target. Copyright © 2013 Society for Psychophysiological Research.
Artificial intelligence techniques for scheduling Space Shuttle missions
NASA Technical Reports Server (NTRS)
Henke, Andrea L.; Stottler, Richard H.
1994-01-01
Planning and scheduling of NASA Space Shuttle missions is a complex, labor-intensive process requiring the expertise of experienced mission planners. We have developed a planning and scheduling system using combinations of artificial intelligence knowledge representations and planning techniques to capture mission planning knowledge and automate the multi-mission planning process. Our integrated object oriented and rule-based approach reduces planning time by orders of magnitude and provides planners with the flexibility to easily modify planning knowledge and constraints without requiring programming expertise.
Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.
1991-01-01
An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form, and in the actual application of this method, can be expected to cause an evolution in the method language. A language is described for the representation of process- and object-state-centered system descriptions. IDEF3 is a scenario-driven process flow modeling methodology created specifically for these types of descriptive activities.
Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities
NASA Astrophysics Data System (ADS)
Perjanik, Nicholas Steven
As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one-time interviews with a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.
Knowledge Value Creation Characteristics of Virtual Teams: A Case Study in the Construction Sector
NASA Astrophysics Data System (ADS)
Vorakulpipat, Chalee; Rezgui, Yacine
Any knowledge environment aimed at virtual teams should promote identification, access, capture and retrieval of relevant knowledge anytime/anywhere, while nurturing the social activities that underpin the knowledge sharing and creation process. In fact, socio-cultural issues play a critical role in the successful implementation of Knowledge Management (KM), and constitute a milestone towards value creation. The findings indicate that Knowledge Management Systems (KMS) promote value creation when they embed and nurture the social conditions that bind and bond team members together. Furthermore, technology assets, human networks, social capital, intellectual capital, and change management are identified as essential ingredients that have the potential to ensure effective knowledge value creation.
Linking Earth Observations and Models to Societal Information Needs: The Case of Coastal Flooding
NASA Astrophysics Data System (ADS)
Buzzanga, B. A.; Plag, H. P.
2016-12-01
Coastal flooding is expected to increase in many areas due to sea level rise (SLR). Many societal applications such as emergency planning and designing public services depend on information on how the flooding spectrum may change as a result of SLR. To identify the societal information needs, a conceptual model is needed that identifies the key stakeholders, applications, and information and observation needs. In the context of the development of the Global Earth Observation System of Systems (GEOSS), which is implemented by the Group on Earth Observations (GEO), the Socio-Economic and Environmental Information Needs Knowledge Base (SEE-IN KB) is developed as part of the GEOSS Knowledge Base. A core function of the SEE-IN KB is to facilitate the linkage of societal information needs to observations, models, information and knowledge. To achieve this, the SEE-IN KB collects information on objects such as user types, observational requirements, societal goals, models, and datasets. Comprehensive information concerning the interconnections between instances of these objects is used to capture the connectivity and to establish a conceptual model as a network of networks. The captured connectivity can be used in searches to allow users to discover products and services for their information needs, and providers to search for users and applications benefiting from their products. It also allows users to answer "What if?" questions and supports knowledge creation. We have used the SEE-IN KB to develop a conceptual model capturing the stakeholders in coastal flooding and their information needs, and to link these elements to objects. We show how the knowledge base enables the transition of scientific data to usable information by connecting individuals such as city managers to flood maps. Within the knowledge base, these same users can request information that improves their ability to make specific planning decisions.
These needs are linked to entities within research institutions that have the capabilities to meet them. Further, current research such as that investigating precipitation-induced flooding under different SLR scenarios is linked to the users who benefit from the knowledge, effectively creating a bi-directional channel between science and society that increases knowledge and improves foresight.
Reproducibility and Knowledge Capture Architecture for the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Nemani, R. R.
2015-12-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a platform for analysis, experiments and production of data on the order of petabytes in volume, it has been increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process or chain of processes, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. For example, the NEX Landsat pipeline is deployed to process hundreds of thousands of Landsat scenes in a non-linear production workflow with many-to-many mappings of files between 40 separate processing stages where over 100 million processes get executed. At this scale it is almost impossible to easily examine the entire provenance of each file, let alone easily reproduce it. We have developed an initial solution for the NEX system - a transparent knowledge capture and reproducibility architecture that does not require any special code instrumentation or other actions on the user's part. Users can automatically capture their work through a transparent provenance tracking system and the information can subsequently be queried and/or converted into workflows. The provenance information is streamed to a MongoDB document store and a subset is converted to an RDF format and inserted into our triple-store. The triple-store already contains semantic information about other aspects of the NEX system and adding provenance enhances the ability to relate workflows and data to users, locations, projects and other NEX concepts that can be queried in a standard way.
The provenance system has the ability to track data throughout NEX and across any number of executions, and can recreate and re-execute the entire history and reproduce the results. The information can also be used to automatically create individual workflow components and full workflows that can be visually examined, modified, executed and extended by researchers. This provides a key component for accelerating research through knowledge capture and scientific reproducibility on NEX.
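The capture-and-query pattern described above can be sketched in miniature. The abstract streams provenance records to MongoDB and converts a subset to RDF for a triple store; the sketch below stands in an in-memory list for the document store and plain tuples for RDF triples, and every process and file name is invented for illustration.

```python
import time

# Stand-in document store (the real system streams records to MongoDB);
# each record links an output file to the process and inputs that made it.
prov_store = []

def capture(process, inputs, output):
    """Record one processing step as a provenance document."""
    prov_store.append({
        "process": process,
        "inputs": list(inputs),
        "output": output,
        "time": time.time(),
    })
    return output

def to_triples(records):
    """Convert provenance documents to RDF-style
    (subject, predicate, object) triples for a triple store."""
    triples = []
    for r in records:
        triples.append((r["output"], "prov:wasGeneratedBy", r["process"]))
        for inp in r["inputs"]:
            triples.append((r["output"], "prov:wasDerivedFrom", inp))
    return triples

def lineage(artifact, records):
    """Walk provenance backwards to recover the derivation history of a file."""
    history = []
    frontier = [artifact]
    while frontier:
        target = frontier.pop()
        for r in records:
            if r["output"] == target:
                history.append(r)
                frontier.extend(r["inputs"])
    return history

capture("toa_correction", ["scene_raw.tif"], "scene_toa.tif")
capture("cloud_mask", ["scene_toa.tif"], "scene_masked.tif")
steps = lineage("scene_masked.tif", prov_store)
# lineage recovers both processing steps back to scene_raw.tif
```

Replaying `steps` in reverse order is what makes reproduction possible: every output carries enough information to re-run the process chain that created it.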
Good, Benjamin M; Loguercio, Salvatore; Griffith, Obi L; Nanis, Max; Wu, Chunlei; Su, Andrew I
2014-07-29
Molecular signatures for predicting breast cancer prognosis could greatly improve care through personalization of treatment. Computational analyses of genome-wide expression datasets have identified such signatures, but these signatures leave much to be desired in terms of accuracy, reproducibility, and biological interpretability. Methods that take advantage of structured prior knowledge (eg, protein interaction networks) show promise in helping to define better signatures, but most knowledge remains unstructured. Crowdsourcing via scientific discovery games is an emerging methodology that has the potential to tap into human intelligence at scales and in modes unheard of before. The main objective of this study was to test the hypothesis that knowledge linking expression patterns of specific genes to breast cancer outcomes could be captured from players of an open, Web-based game. We envisioned capturing knowledge both from the player's prior experience and from their ability to interpret text related to candidate genes presented to them in the context of the game. We developed and evaluated an online game called The Cure that captured information from players regarding genes for use as predictors of breast cancer survival. Information gathered from game play was aggregated using a voting approach, and used to create rankings of genes. The top genes from these rankings were evaluated using annotation enrichment analysis, comparison to prior predictor gene sets, and by using them to train and test machine learning systems for predicting 10 year survival. Between its launch in September 2012 and September 2013, The Cure attracted more than 1000 registered players, who collectively played nearly 10,000 games. Gene sets assembled through aggregation of the collected data showed significant enrichment for genes known to be related to key concepts such as cancer, disease progression, and recurrence. 
In terms of the predictive accuracy of models trained using this information, these gene sets provided comparable performance to gene sets generated using other methods, including those used in commercial tests. The Cure is available on the Internet. The principal contribution of this work is to show that crowdsourcing games can be developed as a means to address problems involving domain knowledge. While most prior work on scientific discovery games and crowdsourcing in general takes as a premise that contributors have little or no expertise, here we demonstrated a crowdsourcing system that succeeded in capturing expert knowledge.
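The voting approach used to aggregate game play into gene rankings can be sketched as simple vote counting over per-game selections. The gene symbols below are illustrative only, not data from the study.

```python
from collections import Counter

def rank_genes(game_plays):
    """Aggregate per-game gene selections into a single ranking.

    Each play is the set of genes a player chose as prognostic; a simple
    voting scheme (one vote per appearance) orders genes by support.
    """
    votes = Counter()
    for selected_genes in game_plays:
        votes.update(selected_genes)
    return [gene for gene, _ in votes.most_common()]

plays = [
    {"BRCA1", "TP53"},
    {"TP53", "ESR1"},
    {"TP53", "BRCA1", "MKI67"},
]
ranking = rank_genes(plays)
# TP53 appears in all three plays, so it tops the ranking
```

The top of such a ranking is the gene set that would then be handed to enrichment analysis and survival-model training, as the study describes.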
NASA Technical Reports Server (NTRS)
Caraccioli, Paul; Varnedoe, Tom; Smith, Randy; McCarter, Mike; Wilson, Barry; Porter, Richard
2006-01-01
NASA Marshall Space Flight Center's Propulsion Systems Department (PSD) is four months into a fifteen month Knowledge Management (KM) initiative to support enhanced engineering decision making and analyses, faster resolution of anomalies (near-term) and effective, efficient knowledge infused engineering processes, reduced knowledge attrition, and reduced anomaly occurrences (long-term). The near-term objective of this initiative is developing a KM Pilot project, within the context of a 3-5 year KM strategy, to introduce and evaluate the use of KM within PSD. An internal NASA/MSFC PSD KM team was established early in project formulation to maintain a practitioner, user-centric focus throughout the conceptual development, planning and deployment of KM technologies and capabilities within the PSD. The PSD internal team is supported by the University of Alabama's Aging Infrastructure Systems Center of Excellence (AISCE), Intergraph Corporation, and The Knowledge Institute. The principal product of the initial four month effort has been strategic planning of PSD KM implementation by first determining the "as is" state of KM capabilities and developing, planning and documenting the roadmap to achieve the desired "to be" state. Activities undertaken to support the planning phase have included data gathering: cultural surveys, group work-sessions, interviews, documentation review, and independent research. Assessments and analyses have been performed, including industry benchmarking, related local and Agency initiatives, specific tools and techniques used, and strategies for leveraging existing resources, people and technology to achieve common KM goals. Key findings captured in the PSD KM Strategic Plan include the system vision, purpose, stakeholders, prioritized strategic objectives mapped to the top ten practitioner needs, and analysis of current resource usage.
Opportunities identified from research, analyses, cultural/KM surveys and practitioner interviews include: executive and senior management sponsorship, KM awareness, promotion and training, cultural change management, process improvement, and leveraging existing resources and new innovative technologies to align with other NASA KM initiatives (convergence: the big picture). To enable results-based incremental implementation and future growth of the KM initiative, key performance measures have been identified, including stakeholder value, system utility, learning and growth (knowledge capture, sharing, reduced anomaly recurrence), cultural change, process improvement and return-on-investment. The next steps for the initial implementation spiral (focused on SSME Turbomachinery) have been identified, largely based on the organization and compilation of summary level engineering process models, data capture matrices, functional models and conceptual-level systems architecture. Key elements include detailed KM requirements definition; KM technology architecture assessment, evaluation and selection; deployable KM Pilot design, development, implementation and evaluation; and justifying full implementation (estimated Return-on-Investment). Features identified for the notional system architecture include the knowledge presentation layer (and its components), knowledge network layer (and its components), knowledge storage layer (and its components), User Interface and capabilities. This paper provides a snapshot of the progress to date, the near term planning for deploying the KM pilot project, and a forward look at results-based growth of KM capabilities within the MSFC PSD.
Sketching for Knowledge Capture: A Progress Report
2002-01-16
Keywords: … understanding, qualitative modeling, knowledge acquisition, analogy, diagrammatic reasoning, spatial reasoning. Excerpt: "… the main limits of sKEA's expressivity are (a) the predicate vocabulary in its knowledge base and (b) how natural it is to express a piece of information …" (Kenneth D. Forbus, Qualitative Reasoning Group, Northwestern University).
Marshall Space Flight Center Propulsion Systems Department (PSD) KM Initiative
NASA Technical Reports Server (NTRS)
Caraccioli, Paul; Varnadoe, Tom; McCarter, Mike
2006-01-01
NASA Marshall Space Flight Center's Propulsion Systems Department (PSD) is four months into a fifteen month Knowledge Management (KM) initiative to support enhanced engineering decision making and analyses, faster resolution of anomalies (near-term) and effective, efficient knowledge infused engineering processes, reduced knowledge attrition, and reduced anomaly occurrences (long-term). The near-term objective of this initiative is developing a KM Pilot project, within the context of a 3-5 year KM strategy, to introduce and evaluate the use of KM within PSD. An internal NASA/MSFC PSD KM team was established early in project formulation to maintain a practitioner, user-centric focus throughout the conceptual development, planning and deployment of KM technologies and capabilities within the PSD. The PSD internal team is supported by the University of Alabama's Aging Infrastructure Systems Center of Excellence (AISCE), Intergraph Corporation, and The Knowledge Institute. The principal product of the initial four month effort has been strategic planning of PSD KM implementation by first determining the "as is" state of KM capabilities and developing, planning and documenting the roadmap to achieve the desired "to be" state. Activities undertaken to support the planning phase have included data gathering: cultural surveys, group work-sessions, interviews, documentation review, and independent research. Assessments and analyses have been performed, including industry benchmarking, related local and Agency initiatives, specific tools and techniques used, and strategies for leveraging existing resources, people and technology to achieve common KM goals. Key findings captured in the PSD KM Strategic Plan include the system vision, purpose, stakeholders, prioritized strategic objectives mapped to the top ten practitioner needs, and analysis of current resource usage.
Opportunities identified from research, analyses, cultural/KM surveys and practitioner interviews include: executive and senior management sponsorship, KM awareness, promotion and training, cultural change management, process improvement, and leveraging existing resources and new innovative technologies to align with other NASA KM initiatives (convergence: the big picture). To enable results-based incremental implementation and future growth of the KM initiative, key performance measures have been identified, including stakeholder value, system utility, learning and growth (knowledge capture, sharing, reduced anomaly recurrence), cultural change, process improvement and return-on-investment. The next steps for the initial implementation spiral (focused on SSME Turbomachinery) have been identified, largely based on the organization and compilation of summary level engineering process models, data capture matrices, functional models and conceptual-level systems architecture. Key elements include detailed KM requirements definition; KM technology architecture assessment, evaluation and selection; deployable KM Pilot design, development, implementation and evaluation; and justifying full implementation (estimated Return-on-Investment). Features identified for the notional system architecture include the knowledge presentation layer (and its components), knowledge network layer (and its components), knowledge storage layer (and its components), User Interface and capabilities. This paper provides a snapshot of the progress to date, the near term planning for deploying the KM pilot project, and a forward look at results-based growth of KM capabilities within the MSFC PSD.
ISLE: Intelligent Selection of Loop Electronics. A CLIPS/C++/INGRES integrated application
NASA Technical Reports Server (NTRS)
Fischer, Lynn; Cary, Judson; Currie, Andrew
1990-01-01
The Intelligent Selection of Loop Electronics (ISLE) system is an integrated knowledge-based system that is used to configure, evaluate, and rank possible network carrier equipment known as Digital Loop Carrier (DLC), which will be used to meet the demands of forecasted telephone services. Determining the best carrier systems and carrier architectures, while minimizing the cost, meeting corporate policies and addressing area service demands, has become a formidable task. Network planners and engineers use the ISLE system to assist them in this task of selecting and configuring the appropriate loop electronics equipment for future telephone services. The ISLE application is an integrated system consisting of a knowledge base implemented in CLIPS (a planner application), C++, and an object database created from existing INGRES database information. The embeddability, performance, and portability of CLIPS provided us with a tool with which to capture, clarify, and refine corporate knowledge and distribute this knowledge within a larger functional system to network planners and engineers throughout U S WEST.
Reasoning with case histories of process knowledge for efficient process development
NASA Technical Reports Server (NTRS)
Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.
1988-01-01
The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.
Semantic Maps Capturing Organization Knowledge in e-Learning
NASA Astrophysics Data System (ADS)
Mavridis, Androklis; Koumpis, Adamantios; Demetriadis, Stavros N.
E-learning shows much promise in accessibility and opportunity to learn, due to its asynchronous nature and its ability to transmit knowledge quickly and effectively. However, without a universal standard for online learning and teaching, many systems are proclaimed "e-learning-compliant" while offering nothing more than automated services for delivering courses online, providing no additional enhancement to reusability and learner personalization. Hence, the focus is not on providing reusable and learner-centered content, but on developing the technology aspects of e-learning. This trend has made it crucial to find a more refined definition of what constitutes knowledge in the e-learning context. We propose an e-learning system architecture that makes use of a knowledge model to facilitate continuous dialogue and inquiry-based knowledge learning, exploiting the full benefits of the semantic web as a medium capable of supplying the web with formalized knowledge.
ERIC Educational Resources Information Center
Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo
2014-01-01
The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…
Kabachinski, Jeff
2010-01-01
Knowledge can range from complex, accumulated expertise (tacit knowledge) to structured explicit content like service procedures. For most of us, knowledge management should only be one of many collaborative means to an end, not the end in itself (unless you are the corporate knowledge management director or chief knowledge officer). For that reason, KM is important only to the extent that it improves an organization's capability and capacity to deal with, and develop in, the four dimensions of capturing, codifying, storing, and using knowledge. Knowledge that is more or less explicit can be embedded in procedures or represented in documents and databases and transferred with reasonable accuracy. Tacit knowledge transfer generally requires extensive personal contact. Take for example troubleshooting circuits. While troubleshooting can be procedural to an extent, it is still somewhat of an art that pulls from experience and training. This is the kind of tacit knowledge where partnerships, mentoring, or an apprenticeship, are most effective. The most successful organizations are those where knowledge management is part of everyone's job. Tacit, complex knowledge that is developed and internalized over a long period of time is almost impossible to reproduce in a document, database, or expert system. Even before the days of "core competencies", the learning organization, expert systems, and strategy focus, good managers valued the experience and know-how of employees. Today, many are recognizing that what is needed is more than a casual approach to corporate knowledge if they are to succeed. In addition, the aging population of the baby boomers may require means to capture their experience and knowledge before they leave the workforce. There is little doubt that knowledge is one of any organization's most important resources, or that knowledge workers' roles will grow in importance in the years ahead. 
Why would an organization believe that knowledge and knowledge workers are important, yet not advocate active management of knowledge itself? Taking advantage of already accumulated corporate intellectual property is by far the most low-cost way to increase capability and competitive stature. These are all good reasons why it might pay to take a look at your KM usage.
Temporal and contextual knowledge in model-based expert systems
NASA Technical Reports Server (NTRS)
Toth-Fejel, Tihamer; Heher, Dennis
1987-01-01
A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.
Engineered Resilient Systems: Knowledge Capture and Transfer
2014-08-29
NASA System Safety Framework and Concepts for Implementation
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon
2012-01-01
This report was developed from the National Aeronautics and Space Administration (NASA) Human Exploration and Operations Mission Directorate (HEOMD) Risk Management team's knowledge capture forums. It provides a point-in-time, cumulative summary of actionable key lessons learned in safety framework and concepts.
A knowledge-based patient assessment system: conceptual and technical design.
Reilly, C. A.; Zielstorff, R. D.; Fox, R. L.; O'Connell, E. M.; Carroll, D. L.; Conley, K. A.; Fitzgerald, P.; Eng, T. K.; Martin, A.; Zidik, C. M.; Segal, M.
2000-01-01
This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring. PMID:11079970
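The tools-based approach described above keeps decision-support rules easy to modify without changing application code; one way to sketch that is to hold rules as data evaluated against a structured assessment. The rule names, field names, and thresholds below are invented for illustration, not the system's actual rules.

```python
# Rules live in a data structure, so clinicians or analysts can revise
# them without touching the evaluation code (illustrative sketch only).
RULES = [
    ("fall_risk", lambda a: a.get("mobility_score", 5) <= 2),
    ("education_need", lambda a: not a.get("understands_medications", True)),
]

def evaluate(assessment):
    """Return the names of every rule the structured assessment triggers."""
    return [name for name, test in RULES if test(assessment)]

assessment = {"mobility_score": 1, "understands_medications": False}
alerts = evaluate(assessment)
# this assessment triggers both fall_risk and education_need
```

Because every assessment is structured data rather than free text, the same records that drive alerts can also be aggregated for the quality-monitoring reports the abstract mentions.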
Corporate knowledge repository: Adopting academic LMS into corporate environment
NASA Astrophysics Data System (ADS)
Bakar, Muhamad Shahbani Abu; Jalil, Dzulkafli
2017-10-01
The growth of the knowledge economy has made human capital the vital asset of business organizations in the 21st century. Arguably, due to its white-collar nature, knowledge-based industry is more favorable than traditional manufacturing business. However, over-dependence on human capital can also be a major challenge, as workers inevitably leave the company or retire. This situation can create a knowledge gap that may affect the business continuity of the enterprise. Knowledge retention in the corporate environment has been the subject of much research interest. A Learning Management System (LMS) refers to a system that provides the delivery, assessment and management tools for an organization to handle its knowledge repository. Drawing on a proven LMS implemented in an academic environment, this paper proposes an LMS model that enables peer-to-peer knowledge capture and sharing in the knowledge-based organization. Cloud Enterprise Resource Planning (ERP), referring to an ERP solution in the internet cloud environment, was chosen as the knowledge domain. The complexity of the Cloud ERP business and its knowledge makes it very vulnerable to the knowledge retention problem. This paper discusses how the company's essential knowledge can be retained using an LMS derived from the academic environment in the corporate model.
Lindberg, Arley
2012-01-01
Federal welfare reform, local service collaborations, and the evolution of statewide information systems inspired agency interest in evidence-informed practice and knowledge sharing systems. Four agency leaders, including the Director, Deputy Director, Director of Planning and Evaluation, and Staff Development Program Manager championed the development of a learning organization based on knowledge management throughout the agency. Internal department restructuring helped to strengthen the Planning and Evaluation, Staff Development, and Personnel units, which have become central to supporting knowledge sharing activities. The Four Pillars of Knowledge framework was designed to capture agency directions in relationship to future knowledge management goals. Featuring People, Practice, Technology and Budget, the framework links the agency's services, mission and goals to the process of becoming a learning organization. Built through an iterative process, the framework was created by observing existing activities in each department rather than being designed from the top down. Knowledge management can help the department to fulfill its mission despite reduced resources. Copyright © Taylor & Francis Group, LLC
Data Model Management for Space Information Systems
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, Chris
2006-01-01
The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool. 
We will also describe the current effort to provide interoperability with the European Space Agency (ESA)/Planetary Science Archive (PSA) which is critically dependent on a common data model.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2015
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2016-01-01
The NASA U.S. Spacesuit Knowledge Capture (SKC) Program continues to capture, share, and archive significant spacesuit-related knowledge with engineers and other technical staff and invested entities. Since its 2007 inception, the SKC Program has hosted and recorded more than 75 events. By the end of Fiscal Year (FY) 2015, 40 of these were processed and uploaded to a publicly accessible NASA Web site where viewers can expand their knowledge about the spacesuit's evolution, known capabilities and limitations, and lessons learned. Sharing this knowledge with entities beyond NASA can not only increase people's understanding of the technical effort and importance involved in designing a spacesuit, but also expand interest in and support for this valuable program, which ensures that significant knowledge is retained and accessible. This paper discusses the FY 2015 SKC events, the release and accessibility of the approved events, and the program's future plans.
Factors Shaping the Evolution of Electronic Documentation Systems. Research Activity No. IM.4.
ERIC Educational Resources Information Center
Dede, C. J.; And Others
The first of 10 sections in this report focuses on factors that will affect the evolution of Space Station Project (SSP) documentation systems. The goal of this project is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge about the space station which…
The Construction of Knowledge through Social Interaction via Computer-Mediated Communication
ERIC Educational Resources Information Center
Saritas, Tuncay
2008-01-01
With the advance in information and communication technologies, computer-mediated communication--more specifically computer conferencing systems (CCS)--has captured the interest of educators as an ideal tool to create a learning environment featuring active, participative, and reflective learning. Educators are increasingly adapting the features…
Intelligent systems for human resources.
Kline, K B
1988-11-01
An intelligent system contains knowledge about some domain; it has sophisticated decision-making processes and the ability to explain its actions. The most important aspect of an intelligent system is its ability to interact effectively with humans to teach or assist complex information processing. Two kinds of intelligent systems are Intelligent Tutoring Systems (ITSs) and expert systems. An ITS provides instruction to a student much as a human tutor does, capturing individual performance and tutoring to deficiencies. These systems consist of an expert module, which contains the knowledge or material to be taught; a student module, which represents what the student does and does not know about the domain; and an instructional (teaching) module, which selects the specific knowledge to teach and the instructional strategy, and provides assistance to the student to address deficiencies. Expert systems contain an expert's knowledge about some domain and perform specialized tasks or aid a novice in performing them. The most important part of an expert system is the knowledge base, which contains all the specialized and technical knowledge an expert possesses. For an expert system to interact effectively with humans, it must be able to explain its actions. Use of intelligent systems can have a profound effect on human resources: ITSs can provide better training by tutoring on an individual basis, and expert systems can make better use of human resources through job aiding and performing complex tasks. With increasing training requirements and pressure to "do more with less," intelligent systems can have a positive effect on human resources.
Global Dynamic Exposure and the OpenBuildingMap - Communicating Risk and Involving Communities
NASA Astrophysics Data System (ADS)
Schorlemmer, Danijel; Beutin, Thomas; Hirata, Naoshi; Hao, Ken; Wyss, Max; Cotton, Fabrice; Prehn, Karsten
2017-04-01
Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find a balance between resolution and coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for this task. More than 3.5 billion geographical nodes, more than 200 million building footprints (growing by 100,000 per day), and a plethora of information about schools, hospitals, and other critical facilities allow us to exploit this dataset for risk-related computations. We are combining the strengths of crowd-sourced data collection with the knowledge of experts in extracting the most information from these data. Besides relying on the very active OpenStreetMap community and the Humanitarian OpenStreetMap Team, which are collecting building information at high pace, we are providing a tailored building capture tool for mobile devices. This tool facilitates simple and fast capture of building properties for OpenStreetMap by any person or interested community. With our OpenBuildingMap system, we are harvesting this dataset by processing every building in near-realtime. We are collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. The expert knowledge is needed to translate the simple building properties as captured by OpenStreetMap users into vulnerability and exposure indicators and subsequently into building classifications as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM) and the European Macroseismic Scale (EMS98).
With this approach, we increase the resolution of existing exposure models from aggregated exposure information to building-by-building vulnerability. We report on our method, on the software development for the mobile application and the server-side analysis system, and on the OpenBuildingMap (www.openbuildingmap.org), our global Tile Map Service focusing on building properties. The free/open framework we provide can be used on commodity hardware for local to regional exposure capturing, for stakeholders in disaster management and mitigation for communicating risk, and for communities to understand their risk.
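The translation of simple crowd-sourced building properties into vulnerability classes described above can be sketched as a small rule table. The tag keys are real OSM keys, but the class assignments below are illustrative assumptions, not the actual GEM/EMS-98 expert rule set.

```python
# Hypothetical sketch: map a dict of OpenStreetMap tags to a coarse
# vulnerability label. The thresholds and material groupings are invented
# for illustration only.

def classify_building(tags):
    """Return a rough vulnerability label for one building's OSM tags."""
    material = tags.get("building:material", "unknown")
    levels = int(tags.get("building:levels", 1))

    # Unreinforced masonry is generally treated as the most vulnerable class.
    if material in ("adobe", "stone", "brick"):
        base = "high"
    elif material in ("reinforced_concrete", "steel"):
        base = "low"
    else:
        base = "medium"

    # Assume taller buildings of the same material are somewhat more vulnerable.
    if base == "low" and levels > 8:
        base = "medium"
    return base

buildings = [
    {"building:material": "brick", "building:levels": "2"},
    {"building:material": "steel", "building:levels": "12"},
    {},  # no tags captured yet
]
labels = [classify_building(b) for b in buildings]
```

A real system would feed such per-building labels into aggregated exposure models rather than use them directly.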
Loguercio, Salvatore; Griffith, Obi L; Nanis, Max; Wu, Chunlei; Su, Andrew I
2014-01-01
Background: Molecular signatures for predicting breast cancer prognosis could greatly improve care through personalization of treatment. Computational analyses of genome-wide expression datasets have identified such signatures, but these signatures leave much to be desired in terms of accuracy, reproducibility, and biological interpretability. Methods that take advantage of structured prior knowledge (e.g., protein interaction networks) show promise in helping to define better signatures, but most knowledge remains unstructured. Crowdsourcing via scientific discovery games is an emerging methodology that has the potential to tap into human intelligence at scales and in modes unheard of before. Objective: The main objective of this study was to test the hypothesis that knowledge linking expression patterns of specific genes to breast cancer outcomes could be captured from players of an open, Web-based game. We envisioned capturing knowledge both from the players' prior experience and from their ability to interpret text related to candidate genes presented to them in the context of the game. Methods: We developed and evaluated an online game called The Cure that captured information from players regarding genes for use as predictors of breast cancer survival. Information gathered from game play was aggregated using a voting approach and used to create rankings of genes. The top genes from these rankings were evaluated using annotation enrichment analysis, comparison to prior predictor gene sets, and by using them to train and test machine learning systems for predicting 10-year survival. Results: Between its launch in September 2012 and September 2013, The Cure attracted more than 1000 registered players, who collectively played nearly 10,000 games. Gene sets assembled through aggregation of the collected data showed significant enrichment for genes known to be related to key concepts such as cancer, disease progression, and recurrence.
In terms of the predictive accuracy of models trained using this information, these gene sets provided performance comparable to gene sets generated using other methods, including those used in commercial tests. The Cure is available on the Internet. Conclusions: The principal contribution of this work is to show that crowdsourcing games can be developed as a means to address problems involving domain knowledge. While most prior work on scientific discovery games and crowdsourcing in general takes as a premise that contributors have little or no expertise, here we demonstrate a crowdsourcing system that succeeded in capturing expert knowledge. PMID:25654473
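The voting-style aggregation the abstract describes can be sketched in a few lines: each game play contributes votes for candidate genes, and genes are ranked by total votes. The gene names and plays below are made-up illustrative data, not results from The Cure.

```python
# Minimal sketch of vote aggregation across game plays.
from collections import Counter

plays = [
    ["BRCA1", "TP53"],           # genes one player selected in one game
    ["TP53", "ESR1"],
    ["TP53", "BRCA1", "MKI67"],
]

votes = Counter()
for selected in plays:
    votes.update(selected)       # one vote per gene per play

# Rank genes by total votes (ties keep first-encountered order).
ranking = [gene for gene, _ in votes.most_common()]
```

The resulting ranking would then feed downstream steps such as enrichment analysis or training a survival classifier.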
A novel hand-type detection technique with fingerprint sensor
NASA Astrophysics Data System (ADS)
Abe, Narishige; Shinzaki, Takashi
2013-05-01
In large-scale biometric authentication systems such as US-VISIT (USA), a 10-fingerprint scanner that simultaneously captures four fingerprints is used. In traditional systems, specific hand-types (left or right) are indicated, but it is difficult to detect the hand-type due to hand rotation and the opening and closing of the fingers. In this paper, we evaluated features extracted from hand images, captured by a general-purpose optical scanner, that are considered effective for detecting hand-type. Furthermore, we extended this knowledge to real fingerprint images and evaluated the accuracy with which it detects hand-type. We obtained an accuracy of about 80% with only three fingers (index, middle, and ring).
Scotland's Knowledge Network: translating knowledge into action to improve quality of care.
Wales, A; Graham, S; Rooney, K; Crawford, A
2012-11-01
The Knowledge Network (www.knowledge.scot.nhs.uk) is Scotland's online knowledge service for health and social care. It is designed to support practitioners to apply knowledge in frontline delivery of care, helping to translate knowledge into better health-care outcomes through safe, effective, person-centred care. The Knowledge Network helps to combine the worlds of evidence-based practice and quality improvement by providing access to knowledge about the effectiveness of clinical interventions ('know-what') and knowledge about how to implement this knowledge to support individual patients in working health-care environments ('know-how'). An 'evidence and guidance' search enables clinicians to quickly access quality-assured evidence and best practice, while point of care and mobile solutions provide knowledge in actionable formats to embed in clinical workflow. This research-based knowledge is complemented by social networking services and improvement tools which support the capture and exchange of knowledge from experience, facilitating practice change and systems improvement. In these cases, the Knowledge Network supports key components of the knowledge-to-action cycle--acquiring, creating, sharing and disseminating knowledge to improve performance and innovate. It provides a vehicle for implementing the recommendations of the national Knowledge into Action review, which outlines a new national approach to embedding knowledge in frontline practice and systems improvement.
Liu, Hu-Chen; Liu, Long; Lin, Qing-Lian; Liu, Nan
2013-06-01
The two most important issues of expert systems are the acquisition of domain experts' professional knowledge and the representation and reasoning of the knowledge rules that have been identified. First, during expert knowledge acquisition, the domain expert panel, because of its cross-functional and multidisciplinary nature, often demonstrates differing experience and knowledge and produces different types of knowledge information, such as complete and incomplete, precise and imprecise, and known and unknown. Second, as a promising tool for knowledge representation and reasoning, fuzzy Petri nets (FPNs) still suffer from a couple of deficiencies. The parameters in current FPN models cannot accurately represent increasingly complex knowledge-based systems, and the rules in most existing knowledge inference frameworks cannot be adjusted dynamically as propositions vary, the way human cognition and thinking can. In this paper, we present a knowledge acquisition and representation approach using the fuzzy evidential reasoning approach and dynamic adaptive FPNs to solve the problems mentioned above. As illustrated by a numerical example, the proposed approach captures experts' diverse experience well, enhances the knowledge representation power, and reasons over rule-based knowledge more intelligently.
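A single fuzzy-Petri-net rule firing of the kind discussed above can be sketched in one function: the truth degree of the conclusion is the minimum of the antecedent degrees scaled by the rule's certainty factor (the common min/product style). The rule and numeric values here are hypothetical.

```python
# Illustrative FPN-style rule firing: IF p1 AND p2 ... THEN q.
# Conjunction is taken as the minimum of the antecedent truth degrees,
# scaled by the rule's certainty factor.

def fire_rule(antecedent_degrees, certainty_factor):
    """Return the truth degree assigned to the conclusion place."""
    return min(antecedent_degrees) * certainty_factor

# Hypothetical rule: "IF temperature is high (0.8) AND pressure is
# rising (0.6) THEN raise alarm", with certainty factor 0.9.
alarm_degree = fire_rule([0.8, 0.6], certainty_factor=0.9)
```

The paper's dynamic adaptive FPNs go further by letting such parameters adjust as propositions vary; this sketch shows only the basic firing step.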
Toward an integrated knowledge environment to support modern oncology.
Blake, Patrick M; Decker, David A; Glennon, Timothy M; Liang, Yong Michael; Losko, Sascha; Navin, Nicholas; Suh, K Stephen
2011-01-01
Around the world, teams of researchers continue to develop a wide range of systems to capture, store, and analyze data including treatment, patient outcomes, tumor registries, next-generation sequencing, single-nucleotide polymorphism, copy number, gene expression, drug chemistry, drug safety, and toxicity. Scientists mine, curate, and manually annotate growing mountains of data to produce high-quality databases, while clinical information is aggregated in distant systems. Databases are currently scattered, and relationships between variables coded in disparate datasets are frequently invisible. The challenge is to evolve oncology informatics from a "systems" orientation of standalone platforms and silos into "integrated knowledge environments" that connect "knowable" research data with patient clinical information. The aim of this article is to review progress toward an integrated knowledge environment to support modern oncology, with a focus on supporting scientific discovery and improving cancer care.
Demiris, G; Thompson, H
2011-01-01
As health care systems face limited resources and workforce shortages to address the complex needs of older adult populations, innovative approaches utilizing information technology can support aging. Smart Home and Ambient Assisted Living (SHAAL) systems utilize advanced and ubiquitous technologies, including sensors and other devices integrated into the residential infrastructure or worn on the body, to capture data describing activities of daily living and health-related events. This paper highlights how data from SHAAL systems can lead to information and knowledge that ultimately improve clinical outcomes and quality of life for older adults, as well as the quality of health care services. We conducted a review of personal health record applications specifically for older adults and of approaches to using information to improve elder care. We present a framework that showcases how data captured from SHAAL systems can be processed to provide meaningful information that becomes part of a personal health record. Synthesis and visualization of information resulting from SHAAL systems can lead to knowledge and support education, delivery of tailored interventions, and, if needed, transitions in care. Such actions can involve multiple stakeholders as part of shared decision making. SHAAL systems have the potential to support aging and improve quality of life and decision making for older adults and their families. The framework presented in this paper demonstrates how emphasis needs to be placed on extracting meaningful information from new innovative systems that will support decision making. The challenge for informatics designers and researchers is to facilitate an evolution of SHAAL systems beyond demonstration projects to actual interventions that will improve health care for older adults.
An approach to design knowledge capture for the space station
NASA Technical Reports Server (NTRS)
Wechsler, D. B.; Crouse, K. R.
1986-01-01
The design of NASA's space station has begun. During the design cycle, and after activation of the space station, the recurring need will exist to access not only designs but also deeper knowledge about them, which is only hinted at in the design definition. Areas benefiting from this knowledge include training, fault management, and onboard automation. NASA's Artificial Intelligence Office at Johnson Space Center and The MITRE Corporation have conceptualized an approach for the capture and storage of design knowledge.
An Approach To Design Knowledge Capture For The Space Station
NASA Astrophysics Data System (ADS)
Wechsler, D. B.; Crouse, K. R.
1987-02-01
Design of NASA's Space Station has begun. During the design cycle, and after activation of the Space Station, the recurring need will exist to access not only designs but also deeper knowledge about them, which is only hinted at in the design definition. Areas benefiting from this knowledge include training, fault management, and onboard automation. NASA's Artificial Intelligence Office at Johnson Space Center and The MITRE Corporation have conceptualized an approach for the capture and storage of design knowledge.
The nutrition advisor expert system
NASA Technical Reports Server (NTRS)
Huse, Scott M.; Shyne, Scott S.
1991-01-01
The Nutrition Advisor Expert System (NAES) is an expert system written in the C Language Integrated Production System (CLIPS). NAES provides expert knowledge and guidance into the complex world of nutrition management by capturing the knowledge of an expert and placing it at the user's fingertips. Specifically, NAES enables the user to: (1) obtain precise nutrition information for food items; (2) perform nutritional analysis of meals, flagging deficiencies based upon the U.S. Recommended Daily Allowances; (3) predict possible ailments based upon observed nutritional deficiency trends; (4) obtain a top-ten listing of food items for a given nutrient; and (5) conveniently upgrade the database. An explanation facility for the ailment prediction feature is also provided to document the reasoning process.
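The kind of rule NAES encodes for step (2), meal analysis against the Recommended Daily Allowances, can be sketched as follows. NAES itself is written in CLIPS; this Python sketch only illustrates the idea, and the RDA figures and food values are illustrative, not authoritative.

```python
# Toy meal analysis: total each nutrient over a meal and flag anything
# below its recommended daily allowance. Values are invented examples.

RDA = {"protein_g": 50, "vitamin_c_mg": 60, "iron_mg": 18}

meal = [
    {"protein_g": 20, "vitamin_c_mg": 5, "iron_mg": 4},   # e.g. a sandwich
    {"protein_g": 10, "vitamin_c_mg": 70, "iron_mg": 2},  # e.g. fruit salad
]

totals = {n: sum(item.get(n, 0) for item in meal) for n in RDA}
deficiencies = [n for n, required in RDA.items() if totals[n] < required]
```

An explanation facility, as in NAES, would additionally report which rule fired and which totals fell short.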
Approximate reasoning using terminological models
NASA Technical Reports Server (NTRS)
Yen, John; Vaidya, Nitin
1992-01-01
Term Subsumption Systems (TSS) form a knowledge-representation scheme in AI that can express the defining characteristics of concepts through a formal language that has a well-defined semantics, and incorporate a reasoning mechanism that can deduce whether one concept subsumes another. However, TSSs have very limited ability to deal with the issue of uncertainty in knowledge bases. The objective of this research is to address issues in combining approximate reasoning with term subsumption systems. To do this, we have extended an existing AI architecture (CLASP) that is built on top of a term subsumption system (LOOM). First, the assertional component of LOOM has been extended for asserting and representing uncertain propositions. Second, we have extended the pattern matcher of CLASP for plausible rule-based inferences. Third, an approximate reasoning model has been added to facilitate various kinds of approximate reasoning. Finally, the issue of inconsistency in truth values due to inheritance is addressed using justification of those values. This architecture enhances the reasoning capabilities of expert systems by providing support for reasoning under uncertainty using knowledge captured in a TSS. Also, as definitional knowledge is explicit and separate from the heuristic knowledge used for plausible inferences, the maintainability of expert systems could be improved.
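The combination of subsumption reasoning with uncertain assertions can be illustrated with a deliberately simplified model: concepts defined as sets of required features, where a general concept subsumes a specific one if its features are a subset of the specific one's, and an uncertain instance assertion whose certainty propagates to subsuming concepts. This is not the LOOM/CLASP mechanism, just a toy sketch with invented concept definitions.

```python
# Simplified structural subsumption over feature-set concept definitions.

concepts = {
    "vehicle": {"has_wheels"},
    "car": {"has_wheels", "has_engine"},
}

def subsumes(general, specific):
    """general subsumes specific if every required feature of general
    is also required by specific."""
    return concepts[general] <= concepts[specific]

# An uncertain assertion: individual, asserted concept, certainty degree.
assertion = ("my_k5", "car", 0.9)

# The certainty is inherited by every concept that subsumes "car".
inherited = [(c, assertion[2]) for c in concepts if subsumes(c, assertion[1])]
```

Real TSSs also handle role restrictions and detect inconsistent inherited truth values, which this sketch omits.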
Common IED exploitation target set ontology
NASA Astrophysics Data System (ADS)
Russomanno, David J.; Qualls, Joseph; Wowczuk, Zenovy; Franken, Paul; Robinson, William
2010-04-01
The Common IED Exploitation Target Set (CIEDETS) ontology provides a comprehensive semantic data model for capturing knowledge about sensors, platforms, missions, environments, and other aspects of systems under test. The ontology also includes representative IEDs, modeled as explosives, camouflage, concealment objects, and other background objects, which together comprise an overall threat scene. The ontology is represented using the Web Ontology Language and the SPARQL Protocol and RDF Query Language, which ensures portability of the acquired knowledge base across applications. The resulting knowledge base is a component of the CIEDETS application, which is intended to support the end-user sensor test and evaluation community. CIEDETS associates a system under test with a subset of cataloged threats based on the probability that the system will detect the threat. The associations between systems under test, threats, and the detection probabilities are established based on a hybrid reasoning strategy, which applies a combination of heuristics and simplified modeling techniques. Besides supporting the CIEDETS application, which is focused on efficient and consistent system testing, the ontology can be leveraged in a myriad of other applications, including serving as a knowledge source for mission planning tools.
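The triple-based knowledge base such an ontology yields can be pictured with a tiny in-memory analogue: subject-predicate-object triples and a pattern query standing in for SPARQL. The class and property names below are invented placeholders, not the actual CIEDETS vocabulary.

```python
# A toy RDF-style triple store with a SPARQL-like pattern match.

triples = {
    ("sensor_A", "rdf:type", "GPRSensor"),
    ("sensor_A", "mountedOn", "platform_1"),
    ("threat_7", "rdf:type", "BuriedIED"),
    ("sensor_A", "detects", "threat_7"),
}

def query(subject=None, predicate=None, obj=None):
    """Return triples matching every non-None field of the pattern."""
    return [
        t for t in triples
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]

# Which system-under-test / threat associations exist?
detected = query(predicate="detects")
```

A production system would store such triples in an OWL ontology and attach detection probabilities to the associations rather than plain triples.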
NASA Technical Reports Server (NTRS)
Campbell, William J.
1985-01-01
Intelligent data management is the concept of interfacing a user to a database management system with a value-added service that allows a full range of data management operations at a high level of abstraction using written human language. The development of such a system will be based on expert systems and related artificial intelligence technologies, and will allow the capture of procedural and relational knowledge about data management operations and the support of a user with such knowledge in an on-line, interactive manner. Such a system will have the following capabilities: (1) the ability to construct a model of the user's view of the database, based on the query syntax; (2) the ability to transform English queries and commands into database instructions and processes; (3) the ability to use heuristic knowledge to rapidly prune the data space in search processes; and (4) the ability to use an on-line explanation system to allow the user to understand what the system is doing and why it is doing it. Additional information is given in outline form.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNutt, T.
Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models for individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to the radiobiology models that have driven intensity modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how clinical information can be captured, structured, and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize the knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved care for our patients. Learning Objectives: Understand the role of standard terminologies, ontologies, and data organization in oncology. Understand methods to capture clinical toxicity and outcomes in a clinical setting. Understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta and Toshiba for some of the work presented.
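The cumulative dose-volume histogram mentioned above is a standard plan-evaluation input: for each dose level, the fraction of a structure's voxels receiving at least that dose. A minimal sketch, with invented dose values:

```python
# Cumulative DVH: fraction of voxels receiving >= each dose level.

def cumulative_dvh(voxel_doses, dose_levels):
    """Return one volume fraction per requested dose level."""
    n = len(voxel_doses)
    return [sum(d >= level for d in voxel_doses) / n for level in dose_levels]

doses = [10.0, 20.0, 30.0, 40.0]           # Gy, one value per voxel
dvh = cumulative_dvh(doses, [0.0, 25.0, 45.0])
```

Knowledge-based planning systems compare such curves (or the full 3D dose distribution) against predictions learned from prior patients.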
An, Gary C
2010-01-01
The greatest challenge facing the biomedical research community is the effective translation of basic mechanistic knowledge into clinically effective therapeutics. This challenge is most evident in attempts to understand and modulate "systems" processes/disorders, such as sepsis, cancer, and wound healing. Formulating an investigatory strategy for these issues requires the recognition that these are dynamic processes. Representation of the dynamic behavior of biological systems can aid in the investigation of complex pathophysiological processes by augmenting existing discovery procedures by integrating disparate information sources and knowledge. This approach is termed Translational Systems Biology. Focusing on the development of computational models capturing the behavior of mechanistic hypotheses provides a tool that bridges gaps in the understanding of a disease process by visualizing "thought experiments" to fill those gaps. Agent-based modeling is a computational method particularly well suited to the translation of mechanistic knowledge into a computational framework. Utilizing agent-based models as a means of dynamic hypothesis representation will be a vital means of describing, communicating, and integrating community-wide knowledge. The transparent representation of hypotheses in this dynamic fashion can form the basis of "knowledge ecologies," where selection between competing hypotheses will apply an evolutionary paradigm to the development of community knowledge.
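Agent-based modeling, the method the abstract advocates for dynamic hypothesis representation, can be pictured with a deliberately tiny example: agents each hold a state, and a local rule fires per step based on their neighbors. The states and rule below are purely illustrative, not a model of any real pathophysiological process.

```python
# Toy agent-based model on a line of cells: an inactive cell becomes
# "activated" (1) if any immediate neighbor is already activated.

def step(states):
    """One synchronous update of all agents."""
    new = list(states)
    for i, s in enumerate(states):
        neighbors = states[max(0, i - 1):i] + states[i + 1:i + 2]
        if s == 0 and 1 in neighbors:
            new[i] = 1
    return new

cells = [0, 0, 1, 0, 0]   # one activated agent in the middle
for _ in range(2):        # activation spreads one cell per step
    cells = step(cells)
```

Real translational models replace this rule with mechanistic hypotheses (e.g. cytokine-driven activation), making the hypothesis itself executable and comparable against alternatives.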
NASA Technical Reports Server (NTRS)
Lee, Mun Wai
2015-01-01
Crew exercise is important during long-duration space flight not only for maintaining health and fitness but also for preventing adverse health problems, such as losses in muscle strength and bone density. Monitoring crew exercise via motion capture and kinematic analysis aids understanding of the effects of microgravity on exercise and helps ensure that exercise prescriptions are effective. Intelligent Automation, Inc., has developed ESPRIT to monitor exercise activities, detect body markers, extract image features, and recover three-dimensional (3D) kinematic body poses. The system relies on prior knowledge and modeling of the human body and on advanced statistical inference techniques to achieve robust and accurate motion capture. In Phase I, the company demonstrated motion capture of several exercises, including walking, curling, and dead lifting. Phase II efforts focused on enhancing algorithms and delivering an ESPRIT prototype for testing and demonstration.
Knowledge Framework Implementation with Multiple Architectures - 13090
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhyay, H.; Lagos, L.; Quintero, W.
2013-07-01
Multiple kinds of knowledge management systems are operational in public and private enterprises, and in large and small organizations with a variety of business models, which makes the design, implementation, and operation of integrated knowledge systems very difficult. In recent years, there have been sweeping advances in information technology, leading to the development of sophisticated frameworks and architectures. These platforms can be used to develop integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses the knowledge framework and architecture that can be used for system development and its application to a real-life need of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG), and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time from the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)
The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.
Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro
2013-01-01
Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
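The hypergraph idea above can be illustrated with a minimal sketch (this is a generic toy structure, not the authors' BioIntelligence implementation, and all identifiers are invented): a hyperedge links an arbitrary set of entities at once, so a multi-lateral clinical relationship such as patient, variant, and drug is a single edge rather than several pairwise ones.

```python
from collections import defaultdict

class Hypergraph:
    """Minimal hypergraph: each hyperedge links an arbitrary set of nodes."""
    def __init__(self):
        self.edges = {}                      # edge id -> (label, frozenset of nodes)
        self.incidence = defaultdict(set)    # node -> set of edge ids

    def add_edge(self, edge_id, label, nodes):
        self.edges[edge_id] = (label, frozenset(nodes))
        for n in nodes:
            self.incidence[n].add(edge_id)

    def neighbors(self, node):
        """All nodes co-occurring with `node` in at least one hyperedge."""
        linked = set()
        for e in self.incidence[node]:
            linked |= self.edges[e][1]
        linked.discard(node)
        return linked

# A multi-lateral relationship that a binary graph cannot express as one edge:
hg = Hypergraph()
hg.add_edge("e1", "observed_response",
            {"patient:042", "variant:BRAF_V600E", "drug:vemurafenib"})
print(sorted(hg.neighbors("patient:042")))
```

A query language over such a structure would traverse hyperedges rather than binary links, which is what makes multi-scalar relationships natural to express.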
Introduction of knowledge bases in patient's data management system: role of the user interface.
Chambrin, M C; Ravaux, P; Jaborska, A; Beugnet, C; Lestavel, P; Chopin, C; Boniface, M
1995-02-01
As the number of signals and data to be handled in the intensive care unit grows, it is necessary to design more powerful computing systems that integrate and summarize all this information. The manual input of data, e.g. clinical signs and drug prescriptions, and the synthetic representation of these data require an ever more sophisticated user interface. The introduction of knowledge bases into data management makes it possible to design contextual interfaces. The objective of this paper is to show the importance of the design of the user interface in the daily use of a clinical information system. We then describe a methodology that uses man-machine interaction to capture the clinician's knowledge during clinical practice. The different steps are the audit of the user's actions, the elaboration of statistical models allowing the definition of new knowledge, and the validation that is performed before complete integration. Part of this knowledge can be used to improve the user interface. Finally, we describe the implementation of these concepts on a UNIX platform using the OSF/Motif graphical interface.
Physiological ramifications for loggerhead turtles captured in pelagic longlines
Williard, Amanda; Parga, Mariluz; Sagarminaga, Ricardo; Swimmer, Yonat
2015-01-01
Bycatch of endangered loggerhead turtles in longline fisheries results in high rates of post-release mortality that may negatively impact populations. The factors contributing to post-release mortality have not been well studied, but traumatic injuries and physiological disturbances experienced as a result of capture are thought to play a role. The goal of our study was to gauge the physiological status of loggerhead turtles immediately upon removal from longline gear in order to refine our understanding of the impacts of capture and the potential for post-release mortality. We analysed blood samples collected from longline- and hand-captured loggerhead turtles, and discovered that capture in longline gear results in blood loss, induction of the systemic stress response, and a moderate increase in lactate. The method by which turtles are landed and released, particularly if released with the hook or line still attached, may exacerbate stress and lead to chronic injuries, sublethal effects or delayed mortality. Our study is the first, to the best of our knowledge, to document the physiological impacts of capture in longline gear, and our findings underscore the importance of best-practice gear removal to promote post-release survival in longline-captured turtles. PMID:26490415
The Knowledge Program: an Innovative, Comprehensive Electronic Data Capture System and Warehouse
Katzan, Irene; Speck, Micheal; Dopler, Chris; Urchek, John; Bielawski, Kay; Dunphy, Cheryl; Jehi, Lara; Bae, Charles; Parchman, Alandra
2011-01-01
Data contained in the electronic health record (EHR) present a tremendous opportunity to improve quality of care and enhance research capabilities. However, the EHR is not structured to provide data for such purposes: most clinical information is entered as free text and content varies substantially between providers. Discrete information on patients’ functional status is typically not collected. Data extraction tools are often unavailable. We have developed the Knowledge Program (KP), a comprehensive initiative to improve the collection of discrete clinical information into the EHR and the retrievability of data for use in research, quality, and patient care. A distinct feature of the KP is the systematic collection of patient-reported outcomes, which are captured discretely, allowing more refined analyses of care outcomes. The KP capitalizes on features of the Epic EHR and utilizes an external IT infrastructure distinct from Epic for enhanced functionality. Here, we describe the development and implementation of the KP. PMID:22195124
Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan
2013-10-01
To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
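As a rough illustration of the expert knowledge map idea (a toy sketch, not the authors' EKM model; the experts, topics, and counts are all invented), literature-based annotations can be aggregated into a weighted expert-to-topic graph from which hot topics and per-expert knowledge biographies fall out directly:

```python
from collections import Counter, defaultdict

# Each literature-based annotation links one expert (author) to the topics
# annotated in one of their papers.
annotations = [
    ("Dr. Li",   2011, ["acupuncture", "stroke"]),
    ("Dr. Li",   2012, ["acupuncture", "migraine"]),
    ("Dr. Chen", 2012, ["herbal formula", "stroke"]),
    ("Dr. Chen", 2013, ["stroke", "rehabilitation"]),
]

expert_graph = defaultdict(Counter)   # expert -> topic -> annotation weight
topic_counts = Counter()              # organization-wide topic frequencies

for expert, year, topics in annotations:
    for topic in topics:
        expert_graph[expert][topic] += 1
        topic_counts[topic] += 1

# "Hot topics": the most frequently annotated topics across the organization.
print(topic_counts.most_common(2))
# A per-expert "knowledge biography": topics ordered by engagement.
print(expert_graph["Dr. Li"].most_common())
```

Trend detection would additionally bucket the counts by year, which the annotation tuples above already carry.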
Planning representation for automated exploratory data analysis
NASA Astrophysics Data System (ADS)
St. Amant, Robert; Cohen, Paul R.
1994-03-01
Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.
Knowledge management in the engineering design environment
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
2006-01-01
The Aerospace and Defense industry is experiencing an increasing loss of knowledge through workforce reductions associated with business consolidation and retirement of senior personnel. Significant effort is being placed on process definition as part of ISO certification and, more recently, CMMI certification. The process knowledge in these efforts represents the simplest of engineering knowledge and many organizations are trying to get senior engineers to write more significant guidelines, best practices and design manuals. A new generation of design software, known as Product Lifecycle Management systems, has many mechanisms for capturing and deploying a wider variety of engineering knowledge than simple process definitions. These hold the promise of significant improvements through reuse of prior designs, codification of practices in workflows, and placement of detailed how-tos at the point of application.
Using Modern Technologies to Capture and Share Indigenous Astronomical Knowledge
NASA Astrophysics Data System (ADS)
Nakata, Martin; Hamacher, Duane W.; Warren, John; Byrne, Alex; Pagnucco, Maurice; Harley, Ross; Venugopal, Srikumar; Thorpe, Kirsten; Neville, Richard; Bolt, Reuben
2014-06-01
Indigenous Knowledge is important for Indigenous communities across the globe and for the advancement of our general scientific knowledge. In particular, Indigenous astronomical knowledge integrates many aspects of Indigenous Knowledge, including seasonal calendars, navigation, food economics, law, ceremony, and social structure. Capturing, managing, and disseminating this knowledge in the digital environment poses a number of challenges, which we aim to address using a collaborative project emerging between experts in the higher education, library, archive and industry sectors. Using Microsoft's WorldWide Telescope and Rich Interactive Narratives technologies, we propose to develop software, media design, and archival management solutions to allow Indigenous communities to share their astronomical knowledge with the world on their terms and in a culturally sensitive manner.
Effective domain-dependent reuse in medical knowledge bases.
Dojat, M; Pachet, F
1995-12-01
Knowledge reuse is now a critical issue for most developers of medical knowledge-based systems. As a rule, reuse is addressed from an ambitious, knowledge-engineering perspective that focuses on reusable general purpose knowledge modules, concepts, and methods. However, such a general goal fails to take into account the specific aspects of medical practice. From the point of view of the knowledge engineer, whose goal is to capture the specific features and intricacies of a given domain, this approach addresses the wrong level of generality. In this paper, we adopt a more pragmatic viewpoint, introducing the less ambitious goal of "domain-dependent limited reuse" and suggesting effective means of achieving it in practice. In a knowledge representation framework combining objects and production rules, we propose three mechanisms emerging from the combination of object-oriented programming and rule-based programming. We show that these mechanisms contribute to achieving limited reuse and to introducing useful limited variations in medical expertise.
Atmospheric inverse modeling via sparse reconstruction
NASA Astrophysics Data System (ADS)
Hase, Nils; Miller, Scot M.; Maaß, Peter; Notholt, Justus; Palm, Mathias; Warneke, Thorsten
2017-10-01
Many applications in atmospheric science involve ill-posed inverse problems. A crucial component of many inverse problems is the proper formulation of a priori knowledge about the unknown parameters. In most cases, this knowledge is expressed as a Gaussian prior. This formulation often performs well at capturing smoothed, large-scale processes but is often ill equipped to capture localized structures like large point sources or localized hot spots. Over the last decade, scientists from a diverse array of applied mathematics and engineering fields have developed sparse reconstruction techniques to identify localized structures. In this study, we present a new regularization approach for ill-posed inverse problems in atmospheric science. It is based on Tikhonov regularization with sparsity constraint and allows bounds on the parameters. We enforce sparsity using a dictionary representation system. We analyze its performance in an atmospheric inverse modeling scenario by estimating anthropogenic US methane (CH4) emissions from simulated atmospheric measurements. Different measures indicate that our sparse reconstruction approach is better able to capture large point sources or localized hot spots than other methods commonly used in atmospheric inversions. It captures the overall signal equally well but adds details on the grid scale. This feature can be of value for any inverse problem with point or spatially discrete sources. We show an example for source estimation of synthetic methane emissions from the Barnett shale formation.
Autonomous Navigation Performance During The Hartley 2 Comet Flyby
NASA Technical Reports Server (NTRS)
Abrahamson, Matthew J; Kennedy, Brian A.; Bhaskaran, Shyam
2012-01-01
On November 4, 2010, the EPOXI spacecraft performed a 700-km flyby of the comet Hartley 2 as a follow-on to the successful 2005 Deep Impact prime mission. EPOXI, an extended mission for the Deep Impact Flyby spacecraft, returned a wealth of visual and infrared data from Hartley 2, marking the fifth time that high-resolution images of a cometary nucleus have been captured by a spacecraft. The highest resolution science return, captured at closest approach to the comet nucleus, was enabled by use of an onboard autonomous navigation system called AutoNav. AutoNav estimates the comet-relative spacecraft trajectory using optical measurements from the Medium Resolution Imager (MRI) and provides this relative position information to the Attitude Determination and Control System (ADCS) for maintaining instrument pointing on the comet. For the EPOXI mission, AutoNav was tasked to enable continuous tracking of a smaller, more active Hartley 2, as compared to Tempel 1, through the full encounter while traveling at a higher velocity. To meet the mission goal of capturing the comet in all MRI science images, position knowledge accuracies of +/- 3.5 km (3-σ) cross track and +/- 0.3 seconds (3-σ) time of flight were required. A flight-code-in-the-loop Monte Carlo simulation assessed AutoNav's statistical performance under the Hartley 2 flyby dynamics and determined optimal configuration. The AutoNav performance at Hartley 2 was successful, capturing the comet in all of the MRI images. The maximum residual between observed and predicted comet locations was 20 MRI pixels, primarily influenced by the center of brightness offset from the center of mass in the observations and attitude knowledge errors. This paper discusses the Monte Carlo-based analysis that led to the final AutoNav configuration and a comparison of the predicted performance with the flyby performance.
Semantic Likelihood Models for Bayesian Inference in Human-Robot Interaction
NASA Astrophysics Data System (ADS)
Sweet, Nicholas
Autonomous systems, particularly unmanned aerial systems (UAS), remain limited in autonomous capabilities largely due to a poor understanding of their environment. Current sensors simply do not match human perceptive capabilities, impeding progress towards full autonomy. Recent work has shown the value of humans as sources of information within a human-robot team; in target applications, communicating human-generated 'soft data' to autonomous systems enables higher levels of autonomy through large, efficient information gains. This requires development of a 'human sensor model' that allows soft data fusion through Bayesian inference to update the probabilistic belief representations maintained by autonomous systems. Current human sensor models that capture linguistic inputs as semantic information are limited in their ability to generalize likelihood functions for semantic statements: they may be learned from dense data; they do not exploit the contextual information embedded within groundings; and they often limit human input to restrictive and simplistic interfaces. This work provides mechanisms to synthesize human sensor models from constraints based on easily attainable a priori knowledge, develops compression techniques to capture information-dense semantics, and investigates the problem of capturing and fusing semantic information contained within unstructured natural language. A robotic experimental testbed is also developed to validate the above contributions.
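A 'human sensor model' of the kind described can be sketched as a likelihood over a discretized state space that is fused into the robot's belief by Bayes' rule. The Gaussian-shaped likelihood below is purely illustrative (a hand-picked stand-in for a learned or synthesized semantic model), and the corridor, landmark location, and statement are invented:

```python
import numpy as np

# Uniform prior belief over a 1-D corridor discretized into 10 cells.
belief = np.full(10, 0.1)

# Human sensor model: P(statement | target in cell) for the semantic input
# "the target is near the landmark", with the landmark at cell 6.
cells = np.arange(10)
likelihood = np.exp(-0.5 * ((cells - 6) / 1.5) ** 2)

# Bayesian fusion of the soft data into the robot's belief.
posterior = belief * likelihood
posterior /= posterior.sum()    # renormalize to a probability distribution

print(posterior.argmax())       # belief now peaks at the landmark cell
```

The information gain from one such semantic statement is what makes humans valuable teammates: a single phrase collapses much of the prior uncertainty.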
Biomechanical analysis using Kinovea for sports application
NASA Astrophysics Data System (ADS)
Muaza Nor Adnan, Nor; Patar, Mohd Nor Azmi Ab; Lee, Hokyoo; Yamamoto, Shin-Ichiroh; Jong-Young, Lee; Mahmud, Jamaluddin
2018-04-01
This paper assesses the reliability of HD VideoCam–Kinovea as an alternative tool for conducting motion analysis and measuring the knee relative angle during a drop jump movement. The motion capture and analysis procedure was conducted in the Biomechanics Lab, Shibaura Institute of Technology, Omiya Campus, Japan. A healthy subject without any gait disorder (BMI of 28.60 ± 1.40) was recruited. The volunteer was asked to perform the drop jump movement on a preset platform, and the motion was simultaneously recorded in the sagittal plane using an established infrared motion capture system (Hawk–Cortex) and an HD VideoCam. The capture was repeated 5 times. The outputs (video recordings) from the HD VideoCam were input into Kinovea (an open-source software package), and the drop jump pattern was tracked and analysed. These data were compared with the drop jump pattern tracked and analysed earlier using the Hawk–Cortex system. In general, the results obtained (drop jump pattern) using HD VideoCam–Kinovea are close to those obtained using the established motion capture system. Basic statistical analyses show that most average variances are less than 10%, demonstrating the repeatability of the protocol and the reliability of the results. It can be concluded that the integration of HD VideoCam and Kinovea has the potential to become a reliable motion capture and analysis system; moreover, it is low cost, portable and easy to use. The current study and its findings contribute useful knowledge pertaining to motion capture and analysis, the drop jump movement and HD VideoCam–Kinovea integration.
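A knee relative angle of the kind measured here can be computed from three sagittal-plane marker positions (hip, knee, ankle) with elementary vector geometry; this is a generic sketch of the calculation, not Kinovea's implementation, and the marker coordinates below are invented for illustration:

```python
import math

def knee_angle(hip, knee, ankle):
    """Relative angle (degrees) at the knee between thigh and shank segments,
    from 2-D sagittal-plane marker coordinates."""
    v1 = (hip[0] - knee[0], hip[1] - knee[1])      # knee -> hip (thigh)
    v2 = (ankle[0] - knee[0], ankle[1] - knee[1])  # knee -> ankle (shank)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Straight leg: hip, knee, ankle collinear, so the angle is ~180 degrees.
print(round(knee_angle((0.0, 1.0), (0.0, 0.5), (0.0, 0.0))))
# Knee flexed, as during a drop-jump landing: the angle closes.
print(round(knee_angle((0.0, 1.0), (0.3, 0.5), (0.0, 0.0))))
```

Comparing this angle frame by frame between the two systems is what the variance analysis in the study quantifies.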
Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design
NASA Technical Reports Server (NTRS)
Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.
1991-01-01
Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six-degrees-of-freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists, and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Apple's HyperCard on the Macintosh to serve as a knowledge capture tool and data store.
Tolaymat, Thabet; El Badawy, Amro; Sequeira, Reynold; Genaidy, Ash
2015-11-15
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and an algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from an ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal of building a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. Published by Elsevier B.V.
Castaneda, Christian; Nalley, Kip; Mannion, Ciaran; Bhattacharyya, Pritish; Blake, Patrick; Pecora, Andrew; Goy, Andre; Suh, K Stephen
2015-01-01
As research laboratories and clinics collaborate to achieve precision medicine, both communities are required to understand mandated electronic health/medical record (EHR/EMR) initiatives that will be fully implemented in all clinics in the United States by 2015. Stakeholders will need to evaluate current record keeping practices and optimize and standardize methodologies to capture nearly all information in digital format. Collaborative efforts from academic and industry sectors are crucial to achieving higher efficacy in patient care while minimizing costs. Currently existing digitized data and information are present in multiple formats and are largely unstructured. In the absence of a universally accepted management system, departments and institutions continue to generate silos of information. As a result, invaluable and newly discovered knowledge is difficult to access. To accelerate biomedical research and reduce healthcare costs, clinical and bioinformatics systems must employ common data elements to create structured annotation forms enabling laboratories and clinics to capture sharable data in real time. Conversion of these datasets to knowable information should be a routine institutionalized process. New scientific knowledge and clinical discoveries can be shared via integrated knowledge environments defined by flexible data models and extensive use of standards, ontologies, vocabularies, and thesauri. In the clinical setting, aggregated knowledge must be displayed in user-friendly formats so that physicians, non-technical laboratory personnel, nurses, data/research coordinators, and end-users can enter data, access information, and understand the output. The effort to connect astronomical numbers of data points, including '-omics'-based molecular data, individual genome sequences, experimental data, patient clinical phenotypes, and follow-up data is a monumental task. 
Roadblocks to this vision of integration and interoperability include ethical, legal, and logistical concerns. Ensuring data security and protection of patient rights while simultaneously facilitating standardization is paramount to maintaining public support. The capabilities of supercomputing need to be applied strategically. A standardized, methodological implementation must be applied to developed artificial intelligence systems with the ability to integrate data and information into clinically relevant knowledge. Ultimately, the integration of bioinformatics and clinical data in a clinical decision support system promises precision medicine and cost effective and personalized patient care.
Augmenting Oracle Text with the UMLS for enhanced searching of free-text medical reports.
Ding, Jing; Erdal, Selnur; Dhaval, Rakesh; Kamal, Jyoti
2007-10-11
The intrinsic complexity of free-text medical reports imposes great challenges for information retrieval systems. We have developed a prototype search engine for retrieving clinical reports that leverages the powerful indexing and querying capabilities of Oracle Text, and the rich biomedical domain knowledge and semantic structures that are captured in the UMLS Metathesaurus.
Diluting Education? An Ethnographic Study of Change in an Australian Ministry of Education
ERIC Educational Resources Information Center
Robinson, Sarah
2011-01-01
This ethnographic study captures the processes that led to change in an Australian public education system. The changes were driven by strong neo-liberal discourses which resulted in a shift from a shared understanding about leading educational change in schools by knowledge transfer to managing educational change as a process, in other words,…
Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia
ERIC Educational Resources Information Center
Gucev, Gligor V.
2012-01-01
Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…
MSL Lessons Learned and Knowledge Capture
NASA Technical Reports Server (NTRS)
Buxbaum, Karen L.
2012-01-01
The Mars Program has recently been informed of the Planetary Protection Subcommittee (PPS) recommendation, which was endorsed by the NAC, concerning Mars Science Laboratory (MSL) lessons learned and knowledge capture. The Mars Program has not had an opportunity to consider any decisions specific to the PPS recommendation. Some of the activities recommended by the PPS would involve members of the MSL flight team who are focused on cruise, entry, descent and landing, and early surface operations; those activities would have to wait. Members of the MSL planetary protection team at JPL are still available to support MSL lessons learned and knowledge capture; some of the specifically recommended activities have already begun. The Mars Program shares the PPS/NAC concerns about the potential loss of information and expertise in planetary protection practice.
A Lyapunov based approach to energy maximization in renewable energy technologies
NASA Astrophysics Data System (ADS)
Iyasere, Erhun
This dissertation describes the design and implementation of Lyapunov-based control strategies for the maximization of the power captured by renewable energy harnessing technologies such as (i) a variable speed, variable pitch wind turbine, (ii) a variable speed wind turbine coupled to a doubly fed induction generator, and (iii) a solar power generating system charging a constant voltage battery. First, a torque control strategy is presented to maximize wind energy captured in variable speed, variable pitch wind turbines at low to medium wind speeds. The proposed strategy applies control torque to the wind turbine pitch and rotor subsystems to simultaneously control the blade pitch and tip speed ratio, via the rotor angular speed, to an optimum point at which the capture efficiency is maximum. The control method allows for aerodynamic rotor power maximization without exact knowledge of the wind turbine model. A series of numerical results show that the wind turbine can be controlled to achieve maximum energy capture. Next, a control strategy is proposed to maximize the wind energy captured in a variable speed wind turbine, with an internal induction generator, at low to medium wind speeds. The proposed strategy controls the tip speed ratio, via the rotor angular speed, to an optimum point at which the efficiency constant (or power coefficient) is maximal for a particular blade pitch angle and wind speed by using the generator rotor voltage as a control input. This control method allows for aerodynamic rotor power maximization without exact wind turbine model knowledge. Representative numerical results demonstrate that the wind turbine can be controlled to achieve near maximum energy capture. Finally, a power system consisting of a photovoltaic (PV) array panel, dc-to-dc switching converter, charging a battery is considered wherein the environmental conditions are time-varying. 
A backstepping PWM controller is developed to maximize the power of the solar generating system. The controller tracks a desired array voltage, designed online using an incremental conductance extremum-seeking algorithm, by varying the duty cycle of the switching converter. The stability of the control algorithm is demonstrated by means of Lyapunov analysis. Representative numerical results demonstrate that the grid power system can be controlled to track the maximum power point of the photovoltaic array panel in varying atmospheric conditions. Additionally, the performance of the proposed strategy is compared to the typical maximum power point tracking (MPPT) method of perturb and observe (P&O), where the converter dynamics are ignored, and is shown to yield better results.
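The perturb-and-observe baseline mentioned above can be sketched in a few lines. This is the textbook P&O algorithm run against an invented single-peak power curve, not the authors' backstepping or incremental-conductance controller, and the curve parameters are hypothetical:

```python
def pv_power(v):
    """Toy PV array power curve (watts) with a single maximum near v = 17.0 V."""
    return max(0.0, 60.0 - 0.5 * (v - 17.0) ** 2)

def perturb_and_observe(v0=12.0, step=0.1, n=200):
    """Classic P&O: keep perturbing the operating voltage in the direction
    that last increased power; reverse when power drops."""
    v, p = v0, pv_power(v0)
    direction = 1.0
    for _ in range(n):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:              # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v

v_mpp = perturb_and_observe()
print(round(v_mpp, 1))   # settles to oscillating within one step of 17.0 V
```

The steady-state oscillation around the maximum power point, visible here as the voltage hopping between neighboring steps, is precisely the drawback of P&O that motivates the Lyapunov-based tracking approach in the dissertation.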
Knowledge management for chronic patient control and monitoring
NASA Astrophysics Data System (ADS)
Pedreira, Nieves; Aguiar-Pulido, Vanessa; Dorado, Julián; Pazos, Alejandro; Pereira, Javier
2014-10-01
Knowledge Management (KM) can be seen as the process of capturing, developing, sharing, and effectively using organizational knowledge. In this context, the work presented here proposes a KM System to be used in the scope of chronic patient control and monitoring for distributed research projects. It was designed to enable communication between patients and doctors, as well as to be used by the researchers involved in the project for its management. The proposed model integrates all the information concerning every patient and all project management tasks in the Institutional Memory of a KM System, and uses an ontology to maintain the information and its categorization independently. Furthermore, following the philosophy of intelligent agents, the system interacts with the user to show the information according to the user's preferences and access rights. Finally, three different scenarios of application are described.
Management of Knowledge Representation Standards Activities
NASA Technical Reports Server (NTRS)
Patil, Ramesh S.
1993-01-01
Ever since the mid-seventies, researchers have recognized that capturing knowledge is the key to building large and powerful AI systems. In the years since, we have also found that representing knowledge is difficult and time consuming. In spite of the tools developed to help with knowledge acquisition, knowledge base construction remains one of the major costs in building an AI system: for almost every system we build, a new knowledge base must be constructed from scratch. As a result, most systems remain small to medium in size. Even if we build several systems within a general area, such as medicine or electronics diagnosis, significant portions of the domain must be represented for every system we create. The cost of this duplication of effort has been high and will become prohibitive as we attempt to build larger and larger systems. To overcome this barrier we must find ways of preserving existing knowledge bases and of sharing, re-using, and building on them. This report describes the efforts undertaken over the last two years to identify the issues underlying the current difficulties in sharing and reuse, and a community-wide initiative to overcome them. First, we discuss four bottlenecks to sharing and reuse, present a vision of a future in which these bottlenecks have been ameliorated, and describe the efforts of the initiative's four working groups to address these bottlenecks. We then address the supporting technology and infrastructure that is critical to enabling the vision of the future. Finally, we consider topics of longer-range interest by reviewing some of the research issues raised by our vision.
Small Particles Intact Capture Experiment (SPICE)
NASA Technical Reports Server (NTRS)
Nishioka, Ken-Ji; Carle, G. C.; Bunch, T. E.; Mendez, David J.; Ryder, J. T.
1994-01-01
The Small Particles Intact Capture Experiment (SPICE) will develop technologies and engineering techniques necessary to capture nearly intact, uncontaminated cosmic and interplanetary dust particles (IDPs). Successful capture of such particles will benefit the exobiology and planetary science communities by providing particulate samples that may have survived unaltered since the formation of the solar system. Characterization of these particles may contribute fundamental data to our knowledge of how they could have formed into our planet Earth and, perhaps, contributed to the beginnings of life. The term 'uncontaminated' means that captured cosmic and IDP particles are free of organic contamination from the capture process, and the term 'nearly intact capture' means that their chemical and elemental components are not materially altered during capture. The key to capturing cosmic and IDP particles that are organic-contamination free and nearly intact is the capture medium. Initial screening of capture media included organic foams, multiple thin foil layers, and aerogel (a silica gel); but, with the exception of aerogel, the requirements of no contamination and nearly intact capture were not met. To ensure no contamination of particles in the capture process, high-purity aerogel was chosen. High-purity aerogel has high clarity (visual clearness), a useful quality for detecting and recovering embedded captured particles from the aerogel. P. Tsou at the Jet Propulsion Laboratory (JPL) originally described the use of aerogel for this purpose and reported laboratory test results. He has flown aerogel as a 'GAS-can Lid' payload on STS-47 and is evaluating the results. The Timeband Capture Cell Experiment (TICCE), a Eureca 1 experiment, is also flying aerogel and is scheduled for recovery in late April.
Adaptive simplification of complex multiscale systems.
Chiavazzo, Eliodoro; Karlin, Ilya
2011-03-01
A fully adaptive methodology is developed for reducing the complexity of large dissipative systems. This represents a significant step toward extracting essential physical knowledge from complex systems, by addressing the challenging problem of determining the minimal number of variables needed to exactly capture the system dynamics. An accurate reduced description is achieved by constructing a hierarchy of slow invariant manifolds, with an embarrassingly simple implementation in any dimension. The method is validated with the autoignition of a hydrogen-air mixture, where a reduction to a cascade of slow invariant manifolds is observed.
Volcanic Activity at Shiveluch and Plosky Tolbachik
2017-12-08
On March 7, 2013, the Terra satellite passed over eastern Russia, allowing the Moderate Resolution Imaging Spectroradiometer (MODIS) flying aboard to capture volcanic activity at Shiveluch and Plosky Tolbachik, on the Kamchatka Peninsula. This image was captured at 0050 UTC. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team
NATO Human View Architecture and Human Networks
NASA Technical Reports Server (NTRS)
Handley, Holly A. H.; Houston, Nancy P.
2010-01-01
The NATO Human View is a system architectural viewpoint that focuses on the human as part of a system. Its purpose is to capture the human requirements and to inform how the human impacts the system design. The viewpoint contains seven static models that cover different aspects of the human element, such as roles, tasks, constraints, training, and metrics. It also includes a Human Dynamics component to perform simulations of the human system under design. One of the static models, termed Human Networks, focuses on the human-to-human communication patterns that occur as a result of ad hoc or deliberate team formation, especially teams distributed across space and time. Parameters of human teams that affect system performance can be captured in this model. Human-centered aspects of networks, such as differences in operational tempo (sense of urgency), priorities (common goal), and team history (knowledge of the other team members), can be incorporated. The information captured in the Human Network static model can then be included in the Human Dynamics component so that the impact of distributed teams is represented in the simulation. As the NATO militaries transform to a more networked force, the Human View architecture is an important tool that can be used to make recommendations on the proper mix of technological innovations and human interactions.
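The Human Network static model described above records team parameters such as shared history so that they can feed the Human Dynamics simulation. A minimal sketch of that idea, with entirely invented member names and weights (the NATO Human View specification does not prescribe this representation):

```python
# Hypothetical Human Network sketch: team members as nodes, pairwise
# communication links weighted by shared history in [0, 1]. Names and
# weights are illustrative placeholders, not part of the NATO standard.

def team_cohesion(links):
    """Mean link weight over all observed person-to-person links."""
    return sum(links.values()) / len(links) if links else 0.0

# (person_a, person_b) -> shared-history weight
links = {
    ("pilot", "navigator"): 0.9,   # long shared history
    ("pilot", "analyst"): 0.2,     # newly formed, distributed pairing
    ("navigator", "analyst"): 0.4,
}
print(round(team_cohesion(links), 2))  # 0.5
```

A scalar like this could then parameterize the dynamic simulation, e.g. as a modifier on inter-role communication delay.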
Case-Based Capture and Reuse of Aerospace Design Rationale
NASA Technical Reports Server (NTRS)
Leake, David B.
1998-01-01
The goal of this project is to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project applies case-based reasoning (CBR) and concept mapping (CMAP) tools to the task of capturing, organizing, and interactively accessing experiences or "cases" encapsulating the methods and rationale underlying expert aerospace design. As stipulated in the award, Indiana University and Ames personnel are collaborating on performance of research and determining the direction of research, to assure that the project focuses on high-value tasks. In the first five months of the project, we have made two visits to Ames Research Center to consult with our NASA collaborators, to learn about the advanced aerospace design tools being developed there, and to identify specific needs for intelligent design support. These meetings identified a number of task areas for applying CBR and concept mapping technology. We jointly selected a first task area to focus on: Acquiring the convergence criteria that experts use to guide the selection of useful data from a set of numerical simulations of high-lift systems. During the first funding period, we developed two software systems. First, we have adapted a CBR system developed at Indiana University into a prototype case-based reasoning shell to capture and retrieve information about design experiences, with the sample task of capturing and reusing experts' intuitive criteria for determining convergence (work conducted at Indiana University). Second, we have also adapted and refined existing concept mapping tools that will be used to clarify and capture the rationale underlying those experiences, to facilitate understanding of the expert's reasoning and guide future reuse of captured information (work conducted at the University of West Florida). 
The tools we have developed are designed to be the basis for a general framework for facilitating tasks within systems developed by the Advanced Design Technologies Testbed (ADTT) project at ARC. The tenets of our framework are (1) that the systems developed should leverage a designer's knowledge, rather than attempting to replace it; (2) that learning and user feedback must play a central role, so that the system can adapt to how it is used, and (3) that the learning and feedback processes must be as natural and as unobtrusive as possible. In the second funding period we will extend our current work, applying the tools to capturing higher-level design rationale.
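The case-based retrieval described above can be sketched in a few lines. The case features (flap setting, residual trend) and the similarity weights are illustrative assumptions, not the project's actual representation:

```python
# Minimal case-based retrieval sketch in the spirit of the CBR shell
# described above: stored cases are feature dictionaries, and retrieval
# returns the case with the highest weighted feature overlap.

def similarity(query, case, weights):
    """Weighted overlap between a query and a stored case."""
    score = 0.0
    for feature, weight in weights.items():
        if query.get(feature) == case.get(feature):
            score += weight
    return score

def retrieve(query, case_base, weights):
    """Return the stored case most similar to the query."""
    return max(case_base, key=lambda c: similarity(query, c, weights))

# Hypothetical convergence-judgment cases for high-lift simulations.
case_base = [
    {"flap_setting": "high_lift", "residual_trend": "flat", "converged": True},
    {"flap_setting": "high_lift", "residual_trend": "oscillating", "converged": False},
]
weights = {"flap_setting": 0.3, "residual_trend": 0.7}
query = {"flap_setting": "high_lift", "residual_trend": "flat"}
best = retrieve(query, case_base, weights)
print(best["converged"])  # True
```

A real shell would add case adaptation and learning from user feedback, in line with the framework tenets above.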
Security and confidentiality of health information systems: implications for physicians.
Dorodny, V S
1998-01-01
Adopting and developing the new generation of information systems will be essential to remain competitive in a quality conscious health care environment. These systems enable physicians to document patient encounters and aggregate the information from the population they treat, while capturing detailed data on chronic medical conditions, medications, treatment plans, risk factors, severity of conditions, and health care resource utilization and management. Today, the knowledge-based information systems should offer instant, around-the-clock access for the provider, support simple order entry, facilitate data capture and retrieval, and provide eligibility verification, electronic authentication, prescription writing, security, and reporting that benchmarks outcomes management based upon clinical/financial decisions and treatment plans. It is an integral part of any information system to incorporate and integrate transactional (financial/administrative) information, as well as analytical (clinical/medical) data in a user-friendly, readily accessible, and secure form. This article explores the technical, financial, logistical, and behavioral obstacles on the way to the Promised Land.
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
An ontological knowledge framework for adaptive medical workflow.
Dang, Jiangbo; Hedayati, Amir; Hampel, Ken; Toklu, Candemir
2008-10-01
As emerging technologies, the semantic Web and SOA (Service-Oriented Architecture) allow a BPMS (Business Process Management System) to automate business processes that can be described as services, which in turn can be used to wrap existing enterprise applications. A BPMS provides tools and methodologies to compose Web services that can be executed as business processes and monitored by BPM (Business Process Management) consoles. An ontology is a formal, declarative knowledge representation model. It provides a foundation upon which machine-understandable knowledge can be obtained and, as a result, makes machine intelligence possible. Healthcare systems can adopt these technologies to become ubiquitous, adaptive, and intelligent, and thereby serve patients better. This paper presents an ontological knowledge framework that covers the healthcare domains a hospital encompasses, from medical and administrative tasks to hospital assets, medical insurance, patient records, drugs, and regulations. Our ontology thus makes our vision of personalized healthcare possible by capturing all the knowledge necessary for a complex personalized healthcare scenario involving patient care, insurance policies, drug prescriptions, and compliance. For example, our ontology allows a workflow management system to let users, from physicians to administrative assistants, manage, and even create, context-aware medical workflows and execute them on the fly.
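The kind of ontology query such a framework relies on can be sketched with a tiny triple store. The classes, relations, and the drug/allergy example are invented for illustration and are not drawn from the paper's actual schema:

```python
# Minimal triple-store sketch: facts as (subject, relation, object) triples,
# queried by following relations. All names here are hypothetical.

triples = {
    ("amoxicillin", "is_a", "antibiotic"),
    ("antibiotic", "contraindicated_with", "penicillin_allergy"),
    ("patient_42", "has_condition", "penicillin_allergy"),
}

def objects(subject, relation):
    """All objects related to `subject` by `relation`."""
    return {o for s, r, o in triples if s == subject and r == relation}

def is_contraindicated(drug, patient):
    """Follow is_a, then contraindicated_with, to check a prescription."""
    classes = {drug} | objects(drug, "is_a")
    conditions = objects(patient, "has_condition")
    return any(conditions & objects(c, "contraindicated_with") for c in classes)

print(is_contraindicated("amoxicillin", "patient_42"))  # True
```

A workflow engine could call a check like this at each task to keep an executing medical workflow compliant with the knowledge base.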
Paramedir: A Tool for Programmable Performance Analysis
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that makes performance metric calculations programmable, thereby allowing the analysis to be automated and reducing application development time. We demonstrate how the system can be used to capture the knowledge and intuition acquired by advanced parallel programmers so that it can be transferred to novice users.
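A programmable metric rule of the kind described above might encode an expert's intuition as a reusable check. The metric names, threshold, and diagnosis rule below are invented for illustration; Paramedir's actual rule language differs:

```python
# Hypothetical sketch: an expert heuristic ("low parallel efficiency with
# synchronization dominating communication suggests load imbalance")
# captured as a programmable, reusable metric calculation.

def diagnose_run(metrics, threshold=0.75):
    efficiency = metrics["useful_time"] / metrics["elapsed_time"]
    if efficiency < threshold and metrics["sync_time"] > metrics["comm_time"]:
        return "load imbalance"
    return "ok"

# Timings (in seconds) from one hypothetical trace.
run = {"useful_time": 60.0, "elapsed_time": 100.0, "sync_time": 25.0, "comm_time": 10.0}
print(diagnose_run(run))  # load imbalance
```

Once written, such a rule can be applied automatically to every new trace, which is the sense in which the tool transfers expert knowledge to novice users.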
Model-based diagnostics for Space Station Freedom
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.
1991-01-01
An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post-analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid and a complement to state estimation.
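Constraint suspension, the technique named above, diagnoses by suspending one component model at a time and checking whether the remaining constraints can then agree with the observations. The two-component power model below is an invented illustration of the pattern, not MARPLE's representation:

```python
# Sketch of constraint suspension on a toy power chain (converter -> switch).
# Component behaviors, values, and observations are hypothetical.

def run_model(components, suspended, inputs):
    """Propagate values through all constraints except the suspended one."""
    values = dict(inputs)
    for name, fn in components.items():
        if name != suspended:
            values.update(fn(values))
    return values

def diagnose(components, inputs, observed):
    """A component is a suspect if suspending it makes the model consistent.

    A value the suspended model leaves unpredicted is treated as consistent
    with any observation (predicted.get defaults to the observed value).
    """
    suspects = []
    for name in components:
        predicted = run_model(components, name, inputs)
        if all(predicted.get(k, v) == v for k, v in observed.items()):
            suspects.append(name)
    return suspects

components = {
    "converter": lambda v: {"bus_volts": v["source_volts"] * 0.9},
    "switch":    lambda v: {"load_volts": v.get("bus_volts", 0) if v["closed"] else 0},
}
inputs = {"source_volts": 120, "closed": True}
observed = {"load_volts": 0}  # load is dead although the switch is closed
print(diagnose(components, inputs, observed))  # ['converter', 'switch']
```

Both components survive as single-fault candidates here, which is the expected behavior: either a dead converter or a stuck-open switch explains the dead load, and no failure-symptom knowledge was needed.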
The BioIntelligence Framework: a new computational platform for biomedical knowledge computing
Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles
2013-01-01
Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information. PMID:22859646
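The hypergraph data model proposed above lets a single edge relate many entities at once, unlike the binary edges of a conventional graph. A minimal sketch, with invented entity and edge names (the framework's actual schema and query language are richer):

```python
# Minimal hypergraph sketch: each hyperedge links an arbitrary set of
# nodes under a label. All identifiers below are illustrative.

hyperedges = [
    {"label": "trial_evidence",
     "nodes": {"drug_X", "mutation_BRAF", "melanoma", "trial_017"}},
    {"label": "pathway",
     "nodes": {"mutation_BRAF", "MAPK_signaling"}},
]

def edges_containing(node):
    """All hyperedges in which a node participates."""
    return [e["label"] for e in hyperedges if node in e["nodes"]]

def co_members(node):
    """Every entity related to `node` through any shared hyperedge."""
    related = set()
    for e in hyperedges:
        if node in e["nodes"]:
            related |= e["nodes"] - {node}
    return related

print(sorted(edges_containing("mutation_BRAF")))  # ['pathway', 'trial_evidence']
```

The multi-lateral edge is the point: a single `trial_evidence` record ties a drug, a mutation, a disease, and a trial together without decomposing the relationship into pairwise links.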
When Informationists Get Involved: the CHICA-GIS Project.
Whipple, Elizabeth C; Odell, Jere D; Ralston, Rick K; Liu, Gilbert C
2013-01-01
Child Health Improvement through Computer Automation (CHICA) is a computer decision support system (CDSS) that interfaces with existing electronic medical record systems (EMRS), delivers "just-in-time" patient-relevant guidelines to physicians during the clinical encounter, and accurately captures structured data from all who interact with the system. "Delivering Geospatial Intelligence to Health Care Professionals (CHICA-GIS)" (1R01LM010923-01) expands the medical application of Geographic Information Systems (GIS) by integrating a geographic information system with CHICA. To provide knowledge management support for CHICA-GIS, three informationists at the Indiana University School of Medicine were awarded a supplement from the National Library of Medicine. The informationists will enhance CHICA-GIS by: improving the accuracy and accessibility of information, managing and mapping the knowledge which undergirds the CHICA-GIS decision support tool, supporting community engagement and consumer health information outreach, and facilitating the dissemination of new CHICA-GIS research results and services.
Derate Mitigation Options for Pulverized Coal Power Plant Carbon Capture Retrofits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffmann, Jeffrey W.; Hackett, Gregory A.; Lewis, Eric G.
Carbon capture and storage (CCS) technologies available in the near term for pulverized coal-fueled power plants (i.e., post-combustion solvent technologies) require substantial capital investment and result in a marked decrease in electricity available for sale to the grid. The impact to overall plant economics can be mitigated for new plant designs (where the entire plant can be optimized around the CCS system). However, existing coal-fueled power plants were designed without the knowledge or intent to retrofit a CCS process, and it is simply not possible to re-engineer an existing plant in a manner that achieves the same performance as if it had been originally designed and optimized for CCS technology. Pairing an auxiliary steam supply with the capture system is a technically feasible option to mitigate the derate resulting from diverting steam away from an existing steam turbine and continuing to run that turbine at steam flow rates and properties outside of the original design specifications. The results of this analysis strongly support the merits of meeting the steam and power requirements for a retrofitted post-combustion solvent-based carbon dioxide (CO2) capture system with an auxiliary combined heat and power (CHP) plant rather than robbing the base plant (i.e., diverting steam from the existing steam cycle and electricity from sale to the grid).
Skill Acquisition: Compilation of Weak-Method Problem Solutions.
1985-08-12
difference largely disappears by the fourth day, when they are still working with Perverse EMACS. Compared to Day 1 on EMACS, there is a large positive ... This reinforces the idea that production representation captures significant features of our procedural knowledge and that differences between ... memory load is certainly consistent with the working memory plus production system hypothesis. Immediate Feedback The importance of immediate
Jing, Xia; Cimino, James J; Del Fiol, Guilherme
2015-11-30
The Librarian Infobutton Tailoring Environment (LITE) is a Web-based knowledge capture, management, and configuration tool with which users can build profiles used by OpenInfobutton, an open source infobutton manager, to provide electronic health record users with context-relevant links to online knowledge resources. We conducted a multipart evaluation study to explore users' attitudes and acceptance of LITE and to guide future development. The evaluation consisted of an initial online survey to all LITE users, followed by an observational study of a subset of users in which evaluators' sessions were recorded while they conducted assigned tasks. The observational study was followed by administration of a modified System Usability Scale (SUS) survey. Fourteen users responded to the survey and indicated good acceptance of LITE with feedback that was mostly positive. Six users participated in the observational study, demonstrating average task completion time of less than 6 minutes and an average SUS score of 72, which is considered good compared with other SUS scores. LITE can be used to fulfill its designated tasks quickly and successfully. Evaluators proposed suggestions for improvements in LITE functionality and user interface.
Guillen-Ahlers, Hector; Rao, Prahlad K; Perumalla, Danu S; Montoya, Maria J; Jadhav, Avinash Y L; Shortreed, Michael R; Smith, Lloyd M; Olivier, Michael
2018-06-01
The hybridization capture of chromatin-associated proteins for proteomics (HyCCAPP) technology was initially developed to uncover novel DNA-protein interactions in yeast. It allows analysis of a target region of interest without the need for prior knowledge about likely proteins bound to the target region. This, in theory, allows HyCCAPP to be used to analyze any genomic region of interest, and it provides sufficient flexibility to work in different cell systems. This method is not meant to study binding sites of known transcription factors, a task better suited for Chromatin Immunoprecipitation (ChIP) and ChIP-like methods. The strength of HyCCAPP lies in its ability to explore DNA regions for which there is limited or no knowledge about the proteins bound to it. It can also be a convenient method to avoid biases (present in ChIP-like methods) introduced by protein-based chromatin enrichment using antibodies. Potentially, HyCCAPP can be a powerful tool to uncover truly novel DNA-protein interactions. To date, the technology has been predominantly applied to yeast cells or to high copy repeat sequences in mammalian cells. In order to become the powerful tool we envision, HyCCAPP approaches need to be optimized to efficiently capture single-copy loci in mammalian cells. Here, we present our adaptation of the initial yeast HyCCAPP capture protocol to human cell lines, and show that single-copy chromatin regions can be efficiently isolated with this modified protocol.
Burns, Gully APC; Cheng, Wei-Cheng
2006-01-01
Background: Knowledge bases that summarize the published literature provide useful online references for specific areas of systems-level biology that are not otherwise supported by large-scale databases. In the field of neuroanatomy, small focused teams have constructed medium-size knowledge bases to summarize the literature describing tract-tracing experiments in several species. Despite years of collation and curation, these databases provide only partial coverage of the available published literature. Given that the scientists reading these papers must all generate the interpretations that would normally be entered into such a system, we attempt here to provide general-purpose annotation tools that make it easy for members of the community to contribute to the task of data collation. Results: In this paper, we describe an open-source, freely available knowledge management system called 'NeuroScholar' that allows straightforward structured markup of PDF files according to a well-designed schema to capture the essential details of this class of experiment. Although the example worked through in this paper is quite specific to neuroanatomical connectivity, the design is freely extensible and could conceivably be used to construct local knowledge bases for other experiment types. Knowledge representations of the experiment are also directly linked to the contributing textual fragments from the original research article. Through the use of this system, not only can members of the community contribute to the collation task, but input data can be gathered for automated approaches to permit knowledge acquisition through the use of Natural Language Processing (NLP). Conclusion: We present a functional, working tool that permits users to populate knowledge bases for neuroanatomical connectivity data from the literature through the use of structured questionnaires. This system is open-source, fully functional, and available for download from [1]. PMID:16895608
The design and implementation of the immune epitope database and analysis resource
Peters, Bjoern; Sidney, John; Bourne, Phil; Bui, Huynh-Hoa; Buus, Soeren; Doh, Grace; Fleri, Ward; Kronenberg, Mitch; Kubo, Ralph; Lund, Ole; Nemazee, David; Ponomarenko, Julia V.; Sathiamurthy, Muthu; Schoenberger, Stephen P.; Stewart, Scott; Surko, Pamela; Way, Scott; Wilson, Steve; Sette, Alessandro
2016-01-01
Epitopes are defined as parts of antigens interacting with receptors of the immune system. Knowledge about their intrinsic structure and how they affect the immune response is required to continue development of techniques that detect, monitor, and fight diseases. Their scientific importance is reflected in the vast amount of epitope-related information gathered, ranging from interactions between epitopes and major histocompatibility complex molecules determined by X-ray crystallography to clinical studies analyzing correlates of protection for epitope based vaccines. Our goal is to provide a central resource capable of capturing this information, allowing users to access and connect realms of knowledge that are currently separated and difficult to access. Here, we portray a new initiative, “The Immune Epitope Database and Analysis Resource.” We describe how we plan to capture, structure, and store this information, what query interfaces we will make available to the public, and what additional predictive and analytical tools we will provide. PMID:15895191
Mallouk, Kaitlin E; Rood, Mark J
2013-07-02
The use of adsorption on activated carbon fiber cloth (ACFC) followed by electrothermal swing adsorption (ESA) and postdesorption pressure and temperature control allows organic gases with boiling points below 0 °C to be captured from air streams and recovered as liquids. This technology has the potential to be a more sustainable abatement technique when compared to thermal oxidation. In this paper, we determine the process performance and energy requirements of a gas recovery system (GRS) using ACFC-ESA for three adsorbates with relative pressures between 8.3 × 10⁻⁵ and 3.4 × 10⁻³ and boiling points as low as -26.3 °C. The GRS is able to capture >99% of the organic gas from the feed air stream, which is comparable to destruction efficiencies for thermal oxidizers. The energy used per liquid mole recovered ranges from 920 to 52,000 kJ/mol and is a function of the relative pressure of the adsorbate in the feed gas. Quantifying the performance of the bench-scale gas recovery system in terms of its ability to remove organic gases from the adsorption stream and the energy required to liquefy the recovered organic gases is a critical step in developing new technologies to allow manufacturing to occur in a more sustainable manner. To our knowledge, this is the first time an ACFC-ESA system has been used to capture, recover, and liquefy organic compounds with vapor pressures as low as 8.3 × 10⁻⁵ and the first time such a system has been analyzed for process performance and energy consumption.
IDEF5 Ontology Description Capture Method: Concept Paper
NASA Technical Reports Server (NTRS)
Menzel, Christopher P.; Mayer, Richard J.
1990-01-01
The results of research towards an ontology capture method referred to as IDEF5 are presented. Viewed simply as the study of what exists in a domain, ontology is an activity that can be understood to be at work across the full range of human inquiry, prompted by the persistent effort to understand the world in which humanity has found itself and which it has helped to shape. In the context of information management, ontology is the task of extracting the structure of a given engineering, manufacturing, business, or logistical domain and storing it in a usable representational medium. A key to effective integration is a system ontology that can be accessed and modified across domains and which captures common features of the overall system relevant to the goals of the disparate domains. If the focus is on information integration, then the strongest motivation for ontology comes from the need to support data sharing and function interoperability. In the correct architecture, an enterprise ontology base would allow the construction of an integrated environment in which legacy systems appear to be open-architecture integrated resources. If the focus is on system/software development, then support for the rapid acquisition of reliable systems is perhaps the strongest motivation for ontology. Finally, ontological analysis was demonstrated to be an effective first step in the construction of robust knowledge-based systems.
NASA Astrophysics Data System (ADS)
Carr, G.
2017-12-01
Real-world problems rarely respect disciplinary boundaries. This is particularly apparent in catchments, where knowledge and understanding from many different research disciplines are essential to address the water resource challenges facing society. People are an integral part of any catchment. Therefore a comprehensive understanding of catchment evolution needs to include the social system. Socio-hydrological models that can simulate the co-evolution of human-water systems, for example with regard to floods and droughts, show great promise in their capacity to capture and understand such systems. Yet, to develop socio-hydrological models into more comprehensive analysis tools that adequately capture the social components of the system, researchers need to embrace interdisciplinary working and multi-disciplinary research teams. By exploring the development of interdisciplinary research in a water programme, several key practices have been identified that support interdisciplinary collaboration. These include clarification, where researchers discuss and re-explain their research or position to expose all the assumptions being made until all involved understand it; harnessing differences, where different opinions and types of knowledge are treated respectfully to minimise tensions and disputes; and boundary setting, where defensible limits to the research enquiry are set with consideration for the restrictions (funds, skills, resources) through negotiation and discussion between the research team members. Focussing on these research practices while conducting interdisciplinary collaborative research into the human-water system is anticipated to support the development of more integrated approaches and models.
Students Approach to Learning and Their Use of Lecture Capture
ERIC Educational Resources Information Center
Vajoczki, Susan; Watt, Susan; Marquis, Nick; Liao, Rose; Vine, Michelle
2011-01-01
This study examined lecture capture as a way of enhancing university education, and explored how students with different learning approaches used lecture capturing (i.e., podcasts and vodcasts). Results indicate that both deep and surface learners report increased course satisfaction and better retention of knowledge in courses with traditional…
ACES: Space shuttle flight software analysis expert system
NASA Technical Reports Server (NTRS)
Satterwhite, R. Scott
1990-01-01
The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.
NASA Technical Reports Server (NTRS)
1990-01-01
The present conference on artificial intelligence (AI), robotics, and automation in space encompasses robot systems, lunar and planetary robots, advanced processing, expert systems, knowledge bases, issues of operation and management, manipulator control, and on-orbit service. Specific issues addressed include fundamental research in AI at NASA, the FTS dexterous telerobot, a target-capture experiment by a free-flying robot, the NASA Planetary Rover Program, the Katydid system for compiling KEE applications to Ada, and speech recognition for robots. Also addressed are a knowledge base for real-time diagnosis, a pilot-in-the-loop simulation of an orbital docking maneuver, intelligent perturbation algorithms for space scheduling optimization, a fuzzy control method for a space manipulator system, hyperredundant manipulator applications, robotic servicing of EOS instruments, and a summary of astronaut inputs on automation and robotics for the Space Station Freedom.
Ontologies and Information Systems: A Literature Survey
2011-06-01
Science and Technology Organisation DSTO–TN–1002 ABSTRACT An ontology captures in a computer-processable language the important concepts in a...knowledge sharability, reusability and scalability, and that support collaborative and distributed construction of ontologies, the DOGMA and DILIGENT...and assemble the received information). In the second stage, the designers determine how ontologies should be used in the process of adding
ERIC Educational Resources Information Center
Jara-Ettinger, Julian; Piantadosi, Steve; Spelke, Elizabeth S.; Levy, Roger; Gibson, Edward
2017-01-01
To master the natural number system, children must understand both the concepts that number words capture and the counting procedure by which they are applied. These two types of knowledge develop in childhood, but their connection is poorly understood. Here we explore the relationship between the mastery of counting and the mastery of exact…
Reducing the cognitive workload - Trouble managing power systems
NASA Technical Reports Server (NTRS)
Manner, David B.; Liberman, Eugene M.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
The complexity of space-based systems makes monitoring them and diagnosing their faults taxing for human beings. When a problem arises, immediate attention and quick resolution are mandatory. To aid humans in these endeavors, we have developed an automated advisory system. Our advisory expert system, Trouble, incorporates the knowledge of the power system designers for Space Station Freedom. Trouble is designed to be a ground-based advisor for the mission controllers in the Control Center Complex at Johnson Space Center (JSC). It has been developed at NASA Lewis Research Center (LeRC) and tested in conjunction with prototype flight hardware contained in the Power Management and Distribution testbed and the Engineering Support Center (ESC) at LeRC. Our work will culminate with the adoption of these techniques by the mission controllers at JSC. This paper elucidates how we have captured power system failure knowledge, how we have built and tested our expert system, and what we believe its potential uses are.
Using fuzzy logic to integrate neural networks and knowledge-based systems
NASA Technical Reports Server (NTRS)
Yen, John
1991-01-01
Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.
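The abstract gives no implementation details, but the core idea of using fuzzy membership to bridge neural-net outputs and symbolic rules can be sketched roughly as follows. The triangular membership functions, the "retrain when confidence is low" action rule, and all names are illustrative assumptions, not the author's actual architecture.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_combine(nn_confidence):
    """Map a raw neural-net output in [0, 1] to graded truth values
    that a symbolic rule layer can reason over."""
    return {
        "low":    tri(nn_confidence, -0.5, 0.0, 0.5),
        "medium": tri(nn_confidence,  0.0, 0.5, 1.0),
        "high":   tri(nn_confidence,  0.5, 1.0, 1.5),
    }

def should_retrain(nn_confidence, threshold=0.6):
    """A hypothetical fuzzy action rule: trigger a control action
    (retraining) when the 'low confidence' membership is strong."""
    return fuzzy_combine(nn_confidence)["low"] >= threshold
```

A symbolic controller could evaluate such rules on every network output to decide when to trust a classification and when to invoke a control task.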
Knowledge portal: a tool to capture university requirements
NASA Astrophysics Data System (ADS)
Mansourvar, Marjan; Binti Mohd Yasin, Norizan
2011-10-01
New technologies, especially the Internet, have made a huge impact on knowledge management and information dissemination in education. The web portal as a knowledge management system is a very popular topic in many organizations, including universities. Generally, a web portal is defined as a gateway to online network-accessible resources through an intranet, extranet, or the Internet. This study develops a knowledge portal for the students in the Faculty of Computer Science and Information Technology (FCSIT), University of Malaya (UM). The goals of this portal are to provide information that helps students choose the right courses and major relevant to their intended future jobs or careers in IT. A quantitative approach was used as the method for this research, since quantitative methods provide an easy and useful way to collect data from a large sample population.
ERIC Educational Resources Information Center
Taintor, Spence
2008-01-01
Every year, teachers leave the profession and take valuable experience and knowledge with them. An increasing retirement rate makes schools vulnerable to a significant loss of knowledge. This article describes how implementing a knowledge management process will ensure that valuable assets are captured and shared. (Contains 3 online resources.)
An integrated science-based methodology to assess potential ...
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished, knowledge synthesis and algorithmic computational methodology. The knowledge synthesis task is designed to capture “what is known” and to outline the gaps in knowledge from ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. The following specific aims are formulated to achieve the study objective: (1) to propose a system of systems (SoS) architecture that builds a network management among the different entities in the large SEE system to track the flow of ENMs emission, fate and transport from the source to the receptor; (2) to establish a staged approach for knowledge synthesis methodo
Winship, Kathy
2012-01-01
Concern over the impending retirement of several top-level managers led a county agency to engage in efforts aimed at more efficient succession management. Administrators developed plans to prevent the loss of invaluable knowledge and wisdom accompanying retirement of experienced agency leaders. The agency's Director of Finance (DoF) was one of the first key figures projected to retire, and a succession plan was implemented to transfer his knowledge for use after his departure. The knowledge transfer process involved three stages, including: (1) employing the DoF as teacher, having him develop curricula and conduct trainings; (2) engaging the DoF as mentor, allowing an existing staff member and the DoF's successor to shadow and be coached by the DoF; and (3) developing a knowledge management system that could be used after the DoF departed. This case study describes the knowledge transfer process and experiences shared by the DoF and this agency. Copyright © Taylor & Francis Group, LLC
Temperature dependence of carrier capture by defects in gallium arsenide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wampler, William R.; Modine, Normand A.
2015-08-01
This report examines the temperature dependence of the capture rate of carriers by defects in gallium arsenide and compares two previously published theoretical treatments of this based on multiphonon emission (MPE). The objective is to reduce uncertainty in atomistic simulations of gain degradation in III-V HBTs from neutron irradiation. A major source of uncertainty in those simulations is poor knowledge of carrier capture rates, whose values can differ by several orders of magnitude between various defect types. Most of this variation is due to different dependence on temperature, which is closely related to the relaxation of the defect structure that occurs as a result of the change in charge state of the defect. The uncertainty in capture rate can therefore be greatly reduced by better knowledge of the defect relaxation.
A pilot study of distributed knowledge management and clinical decision support in the cloud.
Dixon, Brian E; Simonaitis, Linas; Goldberg, Howard S; Paterno, Marilyn D; Schaeffer, Molly; Hongsermeier, Tonya; Wright, Adam; Middleton, Blackford
2013-09-01
Implement and perform pilot testing of web-based clinical decision support services using a novel framework for creating and managing clinical knowledge in a distributed fashion using the cloud. The pilot sought to (1) develop and test connectivity to an external clinical decision support (CDS) service, (2) assess the exchange of data to and knowledge from the external CDS service, and (3) capture lessons to guide expansion to more practice sites and users. The Clinical Decision Support Consortium created a repository of shared CDS knowledge for managing hypertension, diabetes, and coronary artery disease in a community cloud hosted by Partners HealthCare. A limited data set for primary care patients at a separate health system was securely transmitted to a CDS rules engine hosted in the cloud. Preventive care reminders triggered by the limited data set were returned to clinician end users for review and display. During a pilot study, we (1) monitored connectivity and system performance, (2) studied the exchange of data and decision support reminders between the two health systems, and (3) captured lessons. During the six-month pilot study, there were 1339 patient encounters in which information was successfully exchanged. Preventive care reminders were displayed during 57% of patient visits, most often reminding physicians to monitor blood pressure for hypertensive patients (29%) and order eye exams for patients with diabetes (28%). Lessons learned were grouped into five themes: performance, governance, semantic interoperability, ongoing adjustments, and usability. Remote, asynchronous cloud-based decision support performed reasonably well, although issues concerning governance, semantic interoperability, and usability remain key challenges for successful adoption and use of cloud-based CDS that will require collaboration between biomedical informatics and computer science disciplines.
Decision support in the cloud is feasible and may be a reasonable path toward achieving better support of clinical decision-making across the widest range of health care providers. Published by Elsevier B.V.
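The study does not publish its rule definitions, but the reported reminders suggest simple condition-triggered logic. A minimal sketch, assuming a hypothetical patient data shape and only the two rules named in the abstract:

```python
def preventive_reminders(patient):
    """Evaluate a limited patient data set against simple preventive-care
    rules and return reminder strings for clinician review.
    The rule set here is an illustrative assumption, not the
    consortium's actual knowledge repository."""
    reminders = []
    conditions = set(patient.get("conditions", []))
    if "hypertension" in conditions:
        reminders.append("Monitor blood pressure")
    if "diabetes" in conditions:
        reminders.append("Order eye exam")
    return reminders
```

In the pilot, comparable rules ran in a remote rules engine and the resulting reminders were transmitted back to the originating health system for display.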
Semantics of the visual environment encoded in parahippocampal cortex
Bonner, Michael F.; Price, Amy Rose; Peelle, Jonathan E.; Grossman, Murray
2016-01-01
Semantic representations capture the statistics of experience and store this information in memory. A fundamental component of this memory system is knowledge of the visual environment, including knowledge of objects and their associations. Visual semantic information underlies a range of behaviors, from perceptual categorization to cognitive processes such as language and reasoning. Here we examine the neuroanatomic system that encodes visual semantics. Across three experiments, we found converging evidence indicating that knowledge of verbally mediated visual concepts relies on information encoded in a region of the ventral-medial temporal lobe centered on parahippocampal cortex. In an fMRI study, this region was strongly engaged by the processing of concepts relying on visual knowledge but not by concepts relying on other sensory modalities. In a study of patients with the semantic variant of primary progressive aphasia (semantic dementia), atrophy that encompassed this region was associated with a specific impairment in verbally mediated visual semantic knowledge. Finally, in a structural study of healthy adults from the fMRI experiment, gray matter density in this region related to individual variability in the processing of visual concepts. The anatomic location of these findings aligns with recent work linking the ventral-medial temporal lobe with high-level visual representation, contextual associations, and reasoning through imagination. Together this work suggests a critical role for parahippocampal cortex in linking the visual environment with knowledge systems in the human brain. PMID:26679216
Patel, Vimla L; Arocha, José F; Kushniruk, André W
2002-02-01
The aim of this paper is to examine knowledge organization and reasoning strategies involved in physician-patient communication and to consider how these are affected by the use of computer tools, in particular, electronic medical record (EMR) systems. In the first part of the paper, we summarize results from a study in which patients were interviewed before their interactions with physicians and where physician-patient interactions were recorded and analyzed to evaluate patients' and physicians' understanding of the patient problem. We give a detailed presentation of one such interaction, with characterizations of the physician and patient models. In a second set of studies, the contents of both paper and EMRs were compared and, in addition, physician-patient interactions (involving the use of EMR technology) were video recorded and analyzed to assess physicians' information gathering and knowledge organization for medical decision-making. Physicians explained the patient problems in terms of causal pathophysiological knowledge underlying the disease (disease model), whereas patients explained them in terms of narrative structures of illness (illness model). The data-driven nature of the traditional physician-patient interaction allows physicians to capture the temporal flow of events and to document key aspects of the patients' narratives. Use of electronic medical records was found to influence the way patient data were gathered, resulting in information loss and disruption of the temporal sequence of events in assessing the patient problem. The physician-patient interview allows physicians to capture crucial aspects of the patient's illness model, which are necessary for understanding the problem from the patient's perspective. Use of computer-based patient record technology may lead to a loss of this relevant information.
As a consequence, designers of such systems should take into account information relevant to the patient comprehension of medical problems, which will influence their compliance.
WE-F-BRB-01: The Power of Ontologies and Standardized Terminologies for Capturing Clinical Knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gabriel, P.
2015-06-15
Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models on individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to radiobiology models that have driven intensity modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge based planning. In order to realize these predictions, it is necessary to understand how the clinical information can be captured, structured and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on dose volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of ontologies and standard terminologies used to capture clinical knowledge into structured databases; how data can be organized and accessed to utilize the knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved care for our patients. Learning Objectives: Understand the role of standard terminologies, ontologies and data organization in oncology; understand methods to capture clinical toxicity and outcomes in a clinical setting; understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta and Toshiba for some of the work presented.
RAVE: Rapid Visualization Environment
NASA Technical Reports Server (NTRS)
Klumpar, D. M.; Anderson, Kevin; Simoudis, Evangelos
1994-01-01
Visualization is used in the process of analyzing large, multidimensional data sets. However, the selection and creation of visualizations that are appropriate to the characteristics of a particular data set and that satisfy the analyst's goals is difficult. The process consists of three tasks that are performed iteratively: generate, test, and refine. The performance of these tasks requires the utilization of several types of domain knowledge that data analysts do not often have. Existing visualization systems and frameworks do not adequately support the performance of these tasks. In this paper, we present the RApid Visualization Environment (RAVE), a knowledge-based system that interfaces with commercial visualization frameworks and assists a data analyst in quickly and easily generating, testing, and refining visualizations. RAVE was used for the visualization of in situ measurement data captured by spacecraft.
NASA Astrophysics Data System (ADS)
Benzi, Roberto; Ching, Emily S. C.
2018-03-01
The interaction of flexible polymers with fluid flows leads to a number of intriguing phenomena observed in laboratory experiments, namely drag reduction, elastic turbulence, and heat transport modification in natural convection, and is one of the most challenging subjects in soft matter physics. In this review, we examine our present knowledge on the subject. Our present knowledge is mostly based on direct numerical simulations performed in the last twenty years, which have successfully explained, at least qualitatively, most of the experimental results. Our goal is to disentangle as much as possible the basic mechanisms acting in the system in order to capture the basic features underlying different theoretical approaches and explanations.
Anantha M. Prasad; Louis R. Iverson; Stephen N. Matthews; Matthew P. Peters
2016-01-01
Context. No single model can capture the complex species range dynamics under changing climates--hence the need for a combination approach that addresses management concerns. Objective. A multistage approach is illustrated to manage forested landscapes under climate change. We combine a tree species habitat model--DISTRIB II, a species colonization model--SHIFT, and...
Enabling Security, Stability, Transition, and Reconstruction Operations through Knowledge Management
2009-03-18
strategy. Overall, the cultural barriers to knowledge sharing center on knowledge creation and capture. The primary barrier to knowledge sharing is lack ... Lacking a shared identity decreases the likelihood of knowledge sharing, which is essential to effective collaboration.84 Related to collaboration...to adapt, develop, and change based on experience-derived knowledge.90 A second cultural barrier to knowledge acquisition is the lack receptiveness
Interoperable Data Sharing for Diverse Scientific Disciplines
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean
2016-04-01
For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.
Empirical study using network of semantically related associations in bridging the knowledge gap.
Abedi, Vida; Yeasin, Mohammed; Zand, Ramin
2014-11-27
The data overload has created a new set of challenges in finding meaningful and relevant information with minimal cognitive effort. However, designing robust and scalable knowledge discovery systems remains a challenge. Recent innovations in (biological) literature mining tools have opened new avenues to understand the confluence of various diseases, genes, risk factors, and biological processes, bridging the gaps between the massive amounts of scientific data and harvesting useful knowledge. In this paper, we highlight some of the findings using a text analytics tool, called ARIANA--Adaptive Robust and Integrative Analysis for finding Novel Associations. Empirical study using ARIANA reveals knowledge discovery instances that illustrate the efficacy of such a tool. For example, ARIANA can capture the connection between the drug hexamethonium and pulmonary inflammation and fibrosis that caused the tragic death of a healthy volunteer in a 2001 Johns Hopkins asthma study, even though the abstract of the study was not part of the semantic model. An integrated system, such as ARIANA, could assist the human expert in exploratory literature search by bringing forward hidden associations, promoting data reuse and knowledge discovery as well as stimulating interdisciplinary projects by connecting information across the disciplines.
The BioHub Knowledge Base: Ontology and Repository for Sustainable Biosourcing.
Read, Warren J; Demetriou, George; Nenadic, Goran; Ruddock, Noel; Stevens, Robert; Winter, Jerry
2016-06-01
The motivation for the BioHub project is to create an Integrated Knowledge Management System (IKMS) that will enable chemists to source ingredients from bio-renewables, rather than from non-sustainable sources such as fossil oil and its derivatives. The BioHubKB is the data repository of the IKMS; it employs Semantic Web technologies, especially OWL, to host data about chemical transformations, bio-renewable feedstocks, co-product streams and their chemical components. Access to this knowledge base is provided to other modules within the IKMS through a set of RESTful web services, driven by SPARQL queries to a Sesame back-end. The BioHubKB re-uses several bio-ontologies and bespoke extensions, primarily for chemical feedstocks and products, to form its knowledge organisation schema. Parts of plants form feedstocks, while various processes generate co-product streams that contain certain chemicals. Both chemicals and transformations are associated with certain qualities, which the BioHubKB also attempts to capture. Of immediate commercial and industrial importance is to estimate the cost of particular sets of chemical transformations (leading to candidate surfactants) performed in sequence, and these costs too are captured. Data are sourced from companies' internal knowledge and document stores, and from the publicly available literature. Both text analytics and manual curation play their part in populating the ontology. We describe the prototype IKMS, the BioHubKB and the services that it supports for the IKMS. The BioHubKB can be found via http://biohub.cs.manchester.ac.uk/ontology/biohub-kb.owl .
ERIC Educational Resources Information Center
Jones, Rebecca, Ed.; Nixon, Carol, Comp.; Burmood, Jennifer, Comp.
This publication contains presentations, notes, and illustrative materials used in the annual KMWorld Conference and Exposition, "Knowledge Nets: Defining and Driving the E-Enterprise." Presentations include: "Knowledge Management Applied to the Manufacturing Enterprise" (Matthew Artibee); "Ryder Knowledge Center: Building…
NASA Technical Reports Server (NTRS)
Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.
1993-01-01
Capturing human factors knowledge about the design of graphical user interfaces (GUIs) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
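The paper's actual RGB-to-qualitative conversion method is not given in the abstract. One plausible sketch converts RGB to HSV with the standard library and bins hue, saturation, and value into coarse color names; the thresholds and label set below are illustrative assumptions.

```python
import colorsys

def qualitative_color(r, g, b):
    """Convert quantitative RGB primaries (0-255) to a coarse qualitative
    color label via hue/saturation/value thresholds.
    Thresholds are hypothetical, chosen only for illustration."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.2:
        return "black"
    if s < 0.15:                     # nearly achromatic
        return "white" if v > 0.8 else "gray"
    hue = h * 360.0
    if hue < 30 or hue >= 330:
        return "red"
    if hue < 90:
        return "yellow"
    if hue < 150:
        return "green"
    if hue < 270:
        return "blue"
    return "magenta"
```

A guideline checker could then reason symbolically over labels like "red on green" instead of raw pixel triples.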
The use of automatic programming techniques for fault tolerant computing systems
NASA Technical Reports Server (NTRS)
Wild, C.
1985-01-01
It is conjectured that the production of software for ultra-reliable computing systems such as required by Space Station, aircraft, nuclear power plants, and the like will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection, as well as the automatic generation of assertions and test cases from abstract data type specifications, are outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.
Semantic Analysis of Email Using Domain Ontologies and WordNet
NASA Technical Reports Server (NTRS)
Berrios, Daniel C.; Keller, Richard M.
2005-01-01
The problem of capturing and accessing knowledge in paper form has been supplanted by a problem of providing structure to vast amounts of electronic information. Systems that can automatically construct semantic links for natural language documents like email messages will be a crucial element of semantic email tools. We have designed an information extraction process that can leverage the knowledge already contained in an existing semantic web, recognizing references in email to existing nodes in a network of ontology instances by using linguistic knowledge and knowledge of the structure of the semantic web. We developed a heuristic score that uses several forms of evidence to detect references in email to existing nodes in the SemanticOrganizer repository's network. While these scores cannot directly support automated probabilistic inference, they can be used to rank nodes by relevance and link those deemed most relevant to email messages.
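The scoring formula itself is not in the abstract. A common way to combine several weak evidence signals into one ranking score is a weighted noisy-OR, sketched below; the signal names and weights are hypothetical, not the paper's actual heuristic.

```python
def reference_score(evidence, weights):
    """Noisy-OR combination of evidence that an email references a node.
    Each signal value and weight lies in [0, 1]; any additional supporting
    evidence raises the score, which never exceeds 1."""
    miss = 1.0
    for name, value in evidence.items():
        miss *= 1.0 - weights.get(name, 0.0) * value
    return 1.0 - miss

def rank_nodes(candidates, weights):
    """Order candidate ontology nodes by descending relevance score."""
    return sorted(candidates,
                  key=lambda c: reference_score(c["evidence"], weights),
                  reverse=True)
```

As the abstract notes, such scores support ranking rather than calibrated probabilistic inference.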
NASA Astrophysics Data System (ADS)
Wei, Gongjin; Bai, Weijing; Yin, Meifang; Zhang, Songmao
We present a practice of applying Semantic Web technologies in the domain of Chinese traditional architecture. A knowledge base consisting of one ontology and four rule bases is built to support the automatic generation of animations that demonstrate the construction of various Chinese timber structures based on the user's input. Different Semantic Web formalisms are used, e.g., OWL DL, SWRL and Jess, to capture the domain knowledge, including the wooden components needed for a given building, the construction sequence, and the 3D size and position of every piece of wood. Our experience in exploiting the current Semantic Web technologies in real-world application systems indicates their prominent advantages (such as the reasoning facilities and modeling tools) as well as their limitations (such as low efficiency).
NASA Satellite Image of Japan Captured March 11, 2011
2017-12-08
NASA's Aqua satellite passed over Japan one hour and 41 minutes before the quake hit. At the time Aqua passed overhead, the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument captured a visible image of Japan covered by clouds. The image was taken at 0405 UTC on March 11 (1:05 p.m. local time in Japan / 11:05 p.m. EST March 10). The quake hit at 2:46 p.m. local time in Japan. Satellite: Aqua Credit: NASA/GSFC/Aqua NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.
A proposed intracortical visual prosthesis image processing system.
Srivastava, N R; Troyk, P
2005-01-01
It has been a goal of neuroprosthesis researchers to develop a system that could provide artificial vision to a large population of individuals with blindness. Earlier research demonstrated that electrically stimulating the visual cortex can evoke spatial visual percepts, i.e., phosphenes. The goal of a visual cortex prosthesis is to stimulate the visual cortex and generate a visual perception in real time to restore vision. Even though the normal working of the visual system is not completely understood, existing knowledge has inspired research groups to develop visual cortex prostheses that can help blind patients in their daily activities. A major challenge in this work is the development of an image processing system for converting an electronic image, as captured by a camera, into a real-time data stream for stimulation of the implanted electrodes. This paper proposes a system that captures the image using a camera and uses a dedicated hardware real-time image processor to deliver electrical pulses to intracortical electrodes. The system has to be flexible enough to adapt to individual patients and to various strategies of image reconstruction. Here we consider a preliminary architecture for this system.
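As a rough illustration of the image-processing stage described above, the sketch below block-averages a grayscale camera frame down to a coarse grid of electrode drive levels. The grid size and linear intensity mapping are assumptions for illustration, not the proposed hardware design:

```python
# Hedged sketch of one core step of a prosthesis image pipeline: block-averaging
# a grayscale camera frame down to a coarse grid of electrode drive levels.
# Grid size and the linear intensity-to-level mapping are illustrative assumptions.

def image_to_electrode_levels(frame, grid=(4, 4), max_level=255):
    """frame: 2D list of pixel intensities (0-255); returns a grid of levels."""
    rows, cols = len(frame), len(frame[0])
    gr, gc = grid
    bh, bw = rows // gr, cols // gc          # block height/width per electrode
    levels = []
    for i in range(gr):
        row = []
        for j in range(gc):
            block = [frame[i * bh + y][j * bw + x]
                     for y in range(bh) for x in range(bw)]
            row.append(min(max_level, sum(block) // len(block)))
        levels.append(row)
    return levels
```

A real-time processor would run such a reduction per frame and then translate each level into a stimulation pulse pattern, with per-patient remapping layered on top.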
What Hansel and Gretel’s Trail Teach Us about Knowledge Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne Simpson; Troy Hiltbrand
Background At Idaho National Laboratory (INL), we are on the cusp of a significant era of change. INL is the lead Department of Energy Nuclear Research and Development Laboratory, focused on finding innovative solutions to the nation’s energy challenges. Not only has the Laboratory grown at an unprecedented rate over the last five years, but it also has a significant segment of its workforce that is ready for retirement. Over the next 10 years, it is anticipated that upwards of 60% of the current workforce at INL will be eligible for retirement. Since the Laboratory is highly dependent on the intellectual capabilities of its scientists and engineers and their efforts to ensure the future of the nation’s energy portfolio, this attrition of resources has the potential of seriously impacting the ability of the Laboratory to sustain itself and the growth that it has achieved in the past years. Similar to Germany in the early nineteenth century, we face the challenge of our self-identity and must find a way to solidify our legacy to propel us into the future. Approach As the Brothers Grimm set out to collect their fairy tales, they focused on gathering information from the people that were most knowledgeable in the subject. For them, it was the peasants, with their rich knowledge of the region’s sub-culture of folk lore that was passed down from generation to generation around the evening fire. As we look to capture this tacit knowledge, it is requisite that we also seek this information from those individuals that are most versed in it. In our case, it is the scientists and researchers who have dedicated their lives to providing the nation with nuclear energy. This information comes in many forms, both digital and non-digital. Some of this information still resides in the minds of these scientists and researchers who are close to retirement, or who have already retired.
Once the information has been collected, it has to be sorted through to identify where the “shining stones” can be found. The quantity of this information makes it impractical for an individual or set of individuals to sort through it and pick out those ideas which are most important. To accomplish both the step of information capture and classification, modern advancements in technology give us the tools that we need to successfully capture this tacit knowledge. To assist in this process, we have evaluated multiple tools and methods that will help us to unlock the power of tacit knowledge. Tools The first challenge that stands in the way of success is the capture of information. More than 50 years of nuclear research is captured in log books, microfiche, and other non-digital formats. Transforming this information from its current form into a format that can “shine” requires a number of different tools. These tools fall into three major categories: Information Capture, Content Retrieval, and Information Classification. Information Capture The first step is to capture the information from a myriad of sources. With knowledge existing in multiple formats, this step requires multiple approaches to be successful. Some of the sources that require consideration include handwritten documents, typed documents, microfiche, images, audio and video feeds, and electronic images. To make this step feasible for a large body of knowledge requires automation.
Visual control of prey-capture flight in dragonflies.
Olberg, Robert M
2012-04-01
Interacting with a moving object poses a computational problem for an animal's nervous system. This problem has been elegantly solved by the dragonfly, a formidable visual predator on flying insects. The dragonfly computes an interception flight trajectory and steers to maintain it during its prey-pursuit flight. This review summarizes current knowledge about pursuit behavior and neurons thought to control interception in the dragonfly. When understood, this system has the potential for explaining how a small group of neurons can control complex interactions with moving objects. Copyright © 2011 Elsevier Ltd. All rights reserved.
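One candidate rule consistent with the interception behavior summarized above is classical parallel navigation: the pursuer steers so as to null the rotation of the line of sight to the prey, which yields an interception course. The sketch below is a toy geometric model of that rule, not the dragonfly's neural computation:

```python
import math

# Toy sketch (an assumption, not the biological mechanism) of interception
# steering: turn-rate command proportional to the line-of-sight rotation rate,
# so the pursuer holds the prey at a constant absolute bearing.

def los_angle(pursuer, prey):
    """Absolute line-of-sight angle from pursuer (x, y) to prey (x, y)."""
    return math.atan2(prey[1] - pursuer[1], prey[0] - pursuer[0])

def steering_command(prev_los, new_los, gain=3.0):
    """Turn-rate command that drives the line-of-sight rotation to zero."""
    # Wrap the angle difference into (-pi, pi] before applying the gain.
    delta = math.atan2(math.sin(new_los - prev_los),
                       math.cos(new_los - prev_los))
    return gain * delta
```

If the prey drifts upward relative to the pursuer, the line of sight rotates and the command turns the pursuer toward the new bearing; on a true interception course the command stays at zero.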
Modeling and optimal design of CO2 Direct Air Capture systems in large arrays
NASA Astrophysics Data System (ADS)
Sadri Irani, Samaneh; Luzzatto-Fegiz, Paolo
2017-11-01
As noted by the 2014 IPCC report, while the rise in atmospheric CO2 would be slowed by emissions reductions, removing atmospheric CO2 is an important part of possible paths to climate stabilization. Direct Air Capture of CO2 with chemicals (DAC) is one of several proposed carbon capture technologies. There is an ongoing debate on whether DAC is an economically viable approach to alleviate climate change. In addition, like all air capture strategies, DAC is strongly constrained by the net-carbon problem, namely the need to control CO2 emissions associated with the capture process (for example, if DAC is not powered by renewables). Research to date has focused on the chemistry and economics of individual DAC devices. However, the fluid mechanics of their large-scale deployment has not been examined in the literature, to the best of our knowledge. In this presentation, we develop a model for flow through an array of DAC devices, varying their lateral extent and their separation. We build on a recent theory of canopy flows, introducing terms for CO2 entrainment into the array boundary layer, and transport into the farm. In addition, we examine the possibility of driving flow passively by wind, thereby reducing energy consumption. The optimal operational design is established considering the total cost, drag force, energy consumption and total CO2 capture.
Artificial intelligence-assisted occupational lung disease diagnosis.
Harber, P; McCoy, J M; Howard, K; Greer, D; Luo, J
1991-08-01
An artificial intelligence expert-based system for facilitating the clinical recognition of occupational and environmental factors in lung disease has been developed in a pilot fashion. It utilizes a knowledge representation scheme to capture relevant clinical knowledge into structures about specific objects (jobs, diseases, etc) and pairwise relations between objects. Quantifiers describe both the closeness of association and risk, as well as the degree of belief in the validity of a fact. An independent inference engine utilizes the knowledge, combining likelihoods and uncertainties to achieve estimates of likelihood factors for specific paths from work to illness. The system creates a series of "paths," linking work activities to disease outcomes. One path links a single period of work to a single possible disease outcome. In a preliminary trial, the number of "paths" from job to possible disease averaged 18 per subject in a general population and averaged 25 per subject in an asthmatic population. Artificial intelligence methods hold promise in the future to facilitate diagnosis in pulmonary and occupational medicine.
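The path construction described above can be illustrated schematically: pairwise relations carry likelihood quantifiers, and a work-to-disease path chains them, here combined by simple multiplication. The relations and weights below are invented for illustration and do not come from the system's knowledge base:

```python
# Illustrative sketch (hypothetical relations and weights) of chaining pairwise
# job->exposure and exposure->disease relations into work-to-disease "paths",
# combining the likelihood quantifiers along each path by multiplication.

RELATIONS = {
    ("welder", "metal fumes"): 0.9,
    ("metal fumes", "occupational asthma"): 0.4,
    ("welder", "UV exposure"): 0.8,
    ("UV exposure", "photokeratitis"): 0.6,
}

def paths_from_job(job):
    """Enumerate two-link paths job -> exposure -> disease with a combined score."""
    paths = []
    for (a, b), p1 in RELATIONS.items():
        if a != job:
            continue
        for (c, d), p2 in RELATIONS.items():
            if c == b:
                paths.append(((job, b, d), round(p1 * p2, 3)))
    return paths
```

A real system would also carry a separate degree-of-belief quantifier per fact and an inference engine for combining uncertainties; multiplication here stands in for that machinery.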
Information-theoretic decomposition of embodied and situated systems.
Da Rold, Federico
2018-07-01
The embodied and situated view of cognition stresses the importance of real-time and nonlinear bodily interaction with the environment for developing concepts and structuring knowledge. In this article, populations of robots controlled by an artificial neural network learn a wall-following task through artificial evolution. At the end of the evolutionary process, time series are recorded from perceptual and motor neurons of selected robots. Information-theoretic measures are estimated on pairings of variables to unveil nonlinear interactions that structure the agent-environment system. Specifically, the mutual information is utilized to quantify the degree of dependence and the transfer entropy to detect the direction of the information flow. Furthermore, the system is analyzed with the local form of such measures, thus capturing the underlying dynamics of information. Results show that different measures are interdependent and complementary in uncovering aspects of the robots' interaction with the environment, as well as characteristics of the functional neural structure. Therefore, the set of information-theoretic measures provides a decomposition of the system, capturing the intricacy of nonlinear relationships that characterize robots' behavior and neural dynamics. Copyright © 2018 Elsevier Ltd. All rights reserved.
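Of the measures mentioned above, mutual information can be estimated from two discretized time series by plug-in estimation of the joint distribution, as in this minimal sketch (transfer entropy follows the same pattern with lagged conditioning and is not shown):

```python
import math
from collections import Counter

# Minimal plug-in estimator of mutual information I(X;Y) in bits for two
# discretized time series of equal length. This is the standard textbook
# estimator, not code from the article.

def mutual_information(xs, ys):
    """I(X;Y) = sum_xy p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * math.log2(pj / ((px[x] / n) * (py[y] / n)))
    return mi
```

Applied to a perceptual and a motor channel, a high value indicates strong (possibly nonlinear) dependence; independence gives zero. Local forms replace the sum with the pointwise term at each time step.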
Cognitive task analysis for instruction in single-injection ultrasound guided-regional anesthesia
NASA Astrophysics Data System (ADS)
Gucev, Gligor V.
Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided regional anesthesia (UGRA). The purpose of this study was to utilize CTA to extract knowledge from UGRA experts and to determine whether instruction based on CTA of UGRA will produce results superior to the results of traditional training. This study adds to the knowledge base of CTA in being the first one to effectively capture the expert knowledge of UGRA. The derived protocol was used in a randomized, double blinded experiment involving UGRA instruction to 39 novice learners. The results of this study strongly support the hypothesis that CTA-based instruction in UGRA is more effective than conventional clinical instruction, as measured by conceptual pre- and post-tests, performance of a simulated UGRA procedure, and time necessary for the task performance. This study adds to the number of studies that have proven the superiority of CTA-informed instruction. Finally, it produced several validated instruments that can be used in instructing and evaluating UGRA.
"JOB SEEKER"(Job Shadowing for Employee Engagement through Knowledge and Experience Retention).
DOT National Transportation Integrated Search
2016-05-01
The main objective of this study was to explore how to optimally use the particular knowledge : retention/transfer technique of job shadowing as an informal method for knowledge capture and : transfer as well as increasing communication and emp...
Importance of Knowledge Management in the Higher Educational Institutes
ERIC Educational Resources Information Center
Namdev Dhamdhere, Sangeeta
2015-01-01
Every academic institution contributes to knowledge. The generated information and knowledge is to be compiled at a central place and disseminated among the society for further growth. It is observed that the generated knowledge in the academic institute is not stored or captured properly. It is also observed that many a times generated…
New Knowledge Derived from Learned Knowledge: Functional-Anatomic Correlates of Stimulus Equivalence
ERIC Educational Resources Information Center
Schlund, Michael W.; Hoehn-Saric, Rudolf; Cataldo, Michael F.
2007-01-01
Forming new knowledge based on knowledge established through prior learning is a central feature of higher cognition that is captured in research on stimulus equivalence (SE). Numerous SE investigations show that reinforcing behavior under control of distinct sets of arbitrary conditional relations gives rise to stimulus control by new, "derived"…
A Diagram Editor for Efficient Biomedical Knowledge Capture and Integration
Yu, Bohua; Jakupovic, Elvis; Wilson, Justin; Dai, Manhong; Xuan, Weijian; Mirel, Barbara; Athey, Brian; Watson, Stanley; Meng, Fan
2008-01-01
Understanding the molecular mechanisms underlying complex disorders requires the integration of data and knowledge from different sources including free text literature and various biomedical databases. To facilitate this process, we created the Biomedical Concept Diagram Editor (BCDE) to help researchers distill knowledge from data and literature and aid the process of hypothesis development. A key feature of BCDE is the ability to capture information with a simple drag-and-drop. This is a vast improvement over manual methods of knowledge and data recording and greatly increases the efficiency of the biomedical researcher. BCDE also provides a unique concept matching function to enforce consistent terminology, which enables conceptual relationships deposited by different researchers in the BCDE database to be mined and integrated for intelligible and useful results. We hope BCDE will promote the sharing and integration of knowledge from different researchers for effective hypothesis development. PMID:21347131
Exploring creative activity: a software environment for multimedia systems
NASA Astrophysics Data System (ADS)
Farrett, Peter W.; Jardine, David A.
1992-03-01
This paper examines various issues related to the theory, design, and implementation of a system that supports creative activity for a multimedia environment. The system incorporates artificial intelligence notions to acquire concepts of the problem domain. This paper investigates this environment by considering a model that is a basis for a system, which supports a history of user interaction. A multimedia system that supports creative activity is problematic. It must function as a tool allowing users to experiment dynamically with their own creative reasoning process--a very nebulous task environment. It should also support the acquisition of domain knowledge so that empirical observation can be further evaluated. This paper aims to illustrate that via the reuse of domain-specific knowledge, closely related ideas can be quickly developed. This approach is useful in the following sense: Multimedia navigational systems hardcode referential links with respect to a web or network. Although users can access or control navigation in a nonlinear (static) manner, these referential links are 'frozen' and cannot capture their creative actions, which are essential in tutoring or learning applications. This paper describes a multimedia assistant based on the notion of knowledge-links, which allows users to navigate through creative information in a nonlinear (dynamic) fashion. A selection of prototype code based on object-oriented techniques and logic programming partially demonstrates this.
Lunar Satellite Snaps Image of Earth
2014-05-07
This image, captured Feb. 1, 2014, shows a colorized view of Earth from the moon-based perspective of NASA's Lunar Reconnaissance Orbiter. Credit: NASA/Goddard/Arizona State University -- NASA's Lunar Reconnaissance Orbiter (LRO) experiences 12 "earthrises" every day, however LROC (short for LRO Camera) is almost always busy imaging the lunar surface so only rarely does an opportunity arise such that LROC can capture a view of Earth. On Feb. 1, 2014, LRO pitched forward while approaching the moon's north pole allowing the LROC Wide Angle Camera to capture Earth rising above Rozhdestvenskiy crater (112 miles, or 180 km, in diameter). Read more: go.nasa.gov/1oqMlgu
Assessing the Relative Risk of Aerocapture Using Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; Bright, Ellanee; Torres, Abel O.
2005-01-01
A recent study performed for the Aerocapture Technology Area in the In-Space Propulsion Technology Projects Office at the Marshall Space Flight Center investigated the relative risk of various capture techniques for Mars missions. Aerocapture has been proposed as a possible capture technique for future Mars missions but has been perceived by many in the community as a higher risk option as compared to aerobraking and propulsive capture. By performing a probabilistic risk assessment on aerocapture, aerobraking and propulsive capture, a comparison was made to uncover the projected relative risks of these three maneuvers. For mission planners, this knowledge will allow them to decide if the mass savings provided by aerocapture warrant any incremental risk exposure. The study focuses on a Mars Sample Return mission currently under investigation at the Jet Propulsion Laboratory (JPL). In each case (propulsive, aerobraking and aerocapture), the Earth return vehicle is inserted into Martian orbit by one of the three techniques being investigated. A baseline spacecraft was established through initial sizing exercises performed by JPL's Team X. While Team X design results provided the baseline and common thread between the spacecraft, in each case the Team X results were supplemented by historical data as needed. Propulsion, thermal protection, guidance, navigation and control, software, solar arrays, navigation and targeting and atmospheric prediction were investigated. A qualitative assessment of human reliability was also included. Results show that different risk drivers contribute significantly to each capture technique. For aerocapture, the significant drivers include propulsion system failures and atmospheric prediction errors. Software and guidance hardware contribute the most to aerobraking risk. Propulsive capture risk is mainly driven by anomalous solar array degradation and propulsion system failures. 
While each subsystem contributes differently to the risk of each technique, results show that there exists little relative difference in the reliability of these capture techniques although uncertainty for the aerocapture estimates remains high given the lack of in-space demonstration.
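The comparison described above can be caricatured with a toy Monte Carlo model: each capture technique is reduced to a few independent subsystem failure probabilities (the numbers below are invented, not the study's data), and mission success requires all subsystems to work:

```python
import random

# Toy Monte Carlo sketch of comparing capture techniques. Each technique is a
# list of independent subsystem failure probabilities; the values and pairings
# are illustrative assumptions, not results from the MSFC/JPL study.

def mission_success_prob(subsystem_failure_probs, trials=20000, seed=1):
    """Estimate P(all subsystems survive) by simple Monte Carlo sampling."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        if all(rng.random() >= p for p in subsystem_failure_probs):
            successes += 1
    return successes / trials

TECHNIQUES = {
    "aerocapture": [0.01, 0.02],   # e.g. propulsion, atmospheric prediction
    "aerobraking": [0.015, 0.015], # e.g. software, guidance hardware
    "propulsive":  [0.02, 0.01],   # e.g. solar array degradation, propulsion
}
```

With comparable driver magnitudes, the three estimates land close together, mirroring the study's finding of little relative difference even though the dominant risk drivers differ by technique.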
Kwf-Grid workflow management system for Earth science applications
NASA Astrophysics Data System (ADS)
Tran, V.; Hluchy, L.
2009-04-01
In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features, such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the "Knowledge-based Workflow System for Grid Applications" project under the 6th Framework Programme. The workflow management system is intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge that is contained in the information by means of intelligent agents; and finally reuse the joined knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g., GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing the system to manage and execute gLite jobs in the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite allows EGEE users to use the system and benefit from its advanced features. The system is primarily tested and evaluated with applications from ES clusters.
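The core of such a workflow engine, executing jobs of mixed types in dependency order, can be sketched as follows (schematic Python, not K-wf Grid code; job names and dependencies are invented):

```python
# Schematic sketch of a workflow engine core: run each job only after its
# prerequisites, via depth-first traversal of the dependency graph.
# Assumes the dependency graph is acyclic; no cycle detection is shown.

def run_workflow(jobs, deps):
    """jobs: name -> callable; deps: name -> list of prerequisite names."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for d in deps.get(name, []):
            run(d)                 # prerequisites first
        jobs[name]()               # a GRAM job, web service call, etc.
        done.add(name)
        order.append(name)

    for name in jobs:
        run(name)
    return order
```

In the real system each callable would submit to the appropriate middleware (GT4, gLite, a web service) and the engine would add monitoring and knowledge capture around every step.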
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
D and D Knowledge Management Information Tool - 2012 - 12106
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhyay, H.; Lagos, L.; Quintero, W.
2012-07-01
Deactivation and decommissioning (D and D) work is a high priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with the different ALARA (As-Low-As-Reasonably-Achievable) Centers, DOE sites, Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time from the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. D and D KM-IT provides single point access to all D and D related activities through its knowledge base. It is a community driven system. D and D KM-IT makes D and D knowledge available to the people who need it at the time they need it and in a readily usable format. It uses the World Wide Web as the primary source for content in addition to information collected from subject matter specialists and the D and D community. It brings information in real time through web based custom search processes and its dynamic knowledge repository. Future developments include developing a document library, providing D and D information access on mobile devices for the Technology module and Hotline, and coordinating multiple subject matter specialists to support the Hotline. The goal is to deploy a high-end sophisticated and secured system to serve as a single large knowledge base for all the D and D activities. The system consolidates a large amount of information available on the web and presents it to users in the simplest way possible. (authors)
Organizational culture and knowledge management in the electric power generation industry
NASA Astrophysics Data System (ADS)
Mayfield, Robert D.
Scarcity of knowledge and expertise is a challenge in the electric power generation industry. Today's most pervasive knowledge issues result from employee turnover and the constant movement of employees from project to project inside organizations. To address scarcity of knowledge and expertise, organizations must enable employees to capture, transfer, and use mission-critical explicit and tacit knowledge. The purpose of this qualitative grounded theory research was to examine the relationship between and among organizations within the electric power generation industry developing knowledge management processes designed to retain, share, and use the industry, institutional, and technical knowledge upon which the organizations depend. The research findings show that knowledge management is a business problem within the domain of information systems and management. The risks associated with losing mission critical-knowledge can be measured using metrics on employee retention, recruitment, productivity, training and benchmarking. Certain enablers must be in place in order to engage people, encourage cooperation, create a knowledge-sharing culture, and, ultimately change behavior. The research revealed the following change enablers that support knowledge management strategies: (a) training - blended learning, (b) communities of practice, (c) cross-functional teams, (d) rewards and recognition programs, (e) active senior management support, (f) communication and awareness, (g) succession planning, and (h) team organizational culture.
ERIC Educational Resources Information Center
Kraft, Donald H., Ed.
The 2000 ASIS (American Society for Information Science) conference explored knowledge innovation. The tracks in the conference program included knowledge discovery, capture, and creation; classification and representation; information retrieval; knowledge dissemination; and social, behavioral, ethical, and legal aspects. This proceedings is…
A knowledge management system for new drug submission by pharma-industries.
Pinciroli, Francesco; Mottadelli, Sara; Vinci, Maurizio; Fabbro, Luigi; Gothager, Klas
2004-01-01
The pharma-industries are facing a number of crucial business issues in improving operational excellence in product time-to-market and broad regulatory compliance. These organizations own, produce, and manipulate a great deal of knowledge. New regulations issued by Health Authorities (HA) to pharma-industries should make the content and format of new drug applications uniform worldwide. In this paper we suggest a novel approach for a pharma-industry to capture, process, and transmit clinical data electronically. The approach begins with an analysis of the knowledge generation points, some of which are outside the company. Implementations are grounded on the use of a de facto standard platform (Microsoft) with acceptable cost levels. The proposed infrastructure is integrated into the existing company environment and technological platform, minimizing cost and risk while improving the efficiency and efficacy of new drug dossier compilation.
Mapping the 2017 Eclipse: Education, Navigation, Inspiration
NASA Astrophysics Data System (ADS)
Zeiler, M.
2015-12-01
Eclipse maps are a unique vessel of knowledge. At a glance, they communicate the essential knowledge of where and when to successfully view a total eclipse of the sun. An eclipse map also provides detailed knowledge of eclipse circumstances superimposed on the highway system for optimal navigation, especially in the event that weather forces relocation. Eclipse maps are also a vital planning tool for solar physicists and astrophotographers capturing high-resolution imagery of the solar corona. Michael Zeiler will speak to the role of eclipse maps in educating the American public and inspiring people to make the effort to reach the path of totality for the sight of a lifetime. Michael will review the role of eclipse maps in astronomical research and discuss a project under development, the 2017 Eclipse Atlas for smartphones, tablets, and desktop computers.
An Ontology-Based Archive Information Model for the Planetary Science Community
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris
2008-01-01
The Planetary Data System (PDS) information model is a mature but complex model that has been used to capture over 30 years of planetary science data for the PDS archive. As the de-facto information model for the planetary science data archive, it is being adopted by the International Planetary Data Alliance (IPDA) as their archive data standard. However, after seventeen years of evolutionary change the model needs refinement. First a formal specification is needed to explicitly capture the model in a commonly accepted data engineering notation. Second, the core and essential elements of the model need to be identified to help simplify the overall archive process. A team of PDS technical staff members have captured the PDS information model in an ontology modeling tool. Using the resulting knowledge-base, work continues to identify the core elements, identify problems and issues, and then test proposed modifications to the model. The final deliverables of this work will include specifications for the next generation PDS information model and the initial set of IPDA archive data standards. Having the information model captured in an ontology modeling tool also makes the model suitable for use by Semantic Web applications.
Nascimento, Douglas M; Ferreira, Emmanoela N; Bezerra, Dandara M M S Q; Rocha, Pollyana D; Alves, Rômulo R N; Mourão, José S
2012-12-01
The present study was undertaken in two traditional communities that are located on the margins of the estuary and mangrove complex of the Mamanguape River, Paraíba state (PB), Brazil. This work describes the crab-capture techniques tapamento and redinha, and identifies the negative socio-environmental impacts of redinha, using qualitative methods (open and semi-structured interviews, guided tours, direct observation and the administration of questionnaires). Results indicate that currently only two principal techniques are used to capture Ucides cordatus: redinha and tapamento. Tapamento has a low impact in relation to redinha. Redinha was pointed out by interviewees as a system that has social impact (social conflicts, breaking of traditions, substitution and extinction of techniques) and environmental impact (less selective captures and high productivity, mangrove pollution, death of crabs caught in traps, cutting of the roots of Rhizophora mangle, micro-habitat loss resulting from galleries destroyed and polluted). Knowledge of crab harvesting carried out using these two techniques and the possible social and environmental impacts caused by redinha, can lead to more effective planning and actions towards the conservation of the species.
ISACS-DOC: Monitoring and Diagnostic System for AKARI and HINODE
NASA Astrophysics Data System (ADS)
Mizutani, Mitsue; Hirose, Toshinori; Takaki, Ryoji; Honda, Hideyuki
ISACS-DOC (Intelligent Satellite Control Software-DOCtor), an automatic monitoring and diagnostic system for scientific satellites and spacecraft, aims to rapidly and accurately capture important changes and signs of anomalies during daily satellite operations. After three systems for deep space missions, a new generation of ISACS-DOC with higher-speed processing performance has been developed for the Earth-orbiting satellites AKARI and HINODE. This paper reports on the newest ISACS-DOC's enhanced functions, its operating status, and an approach to creating standards for building and maintaining the knowledge database. Continuous enhancement through actual operations is a key advantage of ISACS-DOC.
GPM Captures Hurricane Joaquin
2017-12-08
Joaquin became a tropical storm Monday evening (EDT) midway between the Bahamas and Bermuda and has now strengthened into Hurricane Joaquin, the third of the season; unlike its predecessors, Joaquin could impact the US East Coast. NASA's GPM satellite captured Joaquin on Tuesday, September 29th at 21:39 UTC (5:39 p.m. EDT). Credit: NASA's Scientific Visualization Studio. Data provided by the joint NASA/JAXA GPM mission. Download/read more: svs.gsfc.nasa.gov/cgi-bin/details.cgi?aid=4367
Using local ecological knowledge to monitor threatened Mekong megafauna in Lao PDR
Gray, Thomas N E; Phommachak, Amphone; Vannachomchan, Kongseng; Guegan, Francois
2017-01-01
Pressures on freshwater biodiversity in Southeast Asia are accelerating, yet the status and conservation needs of many of the region's threatened fish species are unclear. This impairs the ability to implement conservation activities and to understand the effects of infrastructure developments and other hydrological changes. We used Local Ecological Knowledge from fishing communities on the Mekong River in the Siphandone waterscape, Lao PDR, to estimate mean and mode last capture dates of eight rare or culturally significant fish species in order to provide conservation monitoring baselines. One hundred and twenty fishermen, from six villages, were interviewed. All eight species had been captured, by at least one of the interviewees, within the waterscape within the past year. However, the mean and mode last capture dates varied between the species. Larger species, and those with higher Red List threat status, were caught less recently than smaller species of less conservation concern. The status of the Critically Endangered Pangasius sanitwongsei (mean last capture date 116.4 months) is particularly worrying, suggesting severe population decline, although cultural issues may have caused this species to be under-reported. This highlights that studies making use of Local Ecological Knowledge need to understand the cultural background and context from which data are collected. Nevertheless, we suggest that our approach of stratified random interviews to establish mean last capture dates may be an effective methodology for monitoring freshwater fish species of conservation concern within artisanal fisheries. If fishing effort remains relatively constant, or if changes in fishing effort are accounted for, differences over time in mean last capture dates are likely to represent changes in the status of species. We plan to repeat our interview surveys within the waterscape as part of a long-term fish-monitoring program. PMID:28820901
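The study's central statistic, the mean and mode of months since each interviewee last caught a species, is simple to reproduce. A minimal sketch with hypothetical interview responses (the figures below are illustrative, not the study's data):

```python
from statistics import mean, mode

def last_capture_summary(months_since_capture):
    """Summarise interview responses: months since each fisher
    last caught the species (hypothetical data)."""
    return {
        "mean_months": round(mean(months_since_capture), 1),
        "mode_months": mode(months_since_capture),
    }

# Hypothetical responses for one species from seven interviewees
responses = [2, 3, 3, 6, 12, 3, 1]
summary = last_capture_summary(responses)
print(summary)  # {'mean_months': 4.3, 'mode_months': 3}
```

Repeating the survey later and comparing the mean last capture dates, with fishing effort held constant, is what the authors propose as a monitoring signal.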
NASA Technical Reports Server (NTRS)
Dennehy, Cornelius J.; Labbe, Steve; Lebsock, Kenneth L.
2010-01-01
Within the broad aerospace community, the importance of identifying, documenting and widely sharing lessons learned during system development, flight test, operational or research programs/projects is widely acknowledged. Documenting and sharing lessons learned helps managers and engineers to minimize project risk and improve performance of their systems. Often, significant lessons learned on a project fail to get captured even though they are well known 'tribal knowledge' amongst the project team members. The act of writing down and documenting these lessons for the next generation of NASA GN&C engineers fails to happen on some projects for various reasons. In this paper we first review the importance of capturing lessons learned and then discuss reasons why some lessons are not documented. A simple approach called 'Pause and Learn' is highlighted as a proven, low-impact method of organizational learning that could foster the timely capture of critical lessons learned. Lastly, some examples of 'lost' GN&C lessons learned from the aeronautics, spacecraft and launch vehicle domains are briefly highlighted. In the context of this paper, 'lost' refers to lessons that have not achieved broad visibility within the NASA-wide GN&C CoP because they are either undocumented, masked or poorly documented in the NASA Lessons Learned Information System (LLIS).
A METHODOLOGY FOR INTEGRATING IMAGES AND TEXT FOR OBJECT IDENTIFICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, Patrick R.; Hohimer, Ryan E.; Doucette, Peter J.
2006-02-13
Often text and imagery contain information that must be combined to solve a problem. One approach begins with transforming the raw text and imagery into a common structure that contains the critical information in a usable form. This paper presents an application in which imagery of vehicles and text from police reports were combined to demonstrate the power of data fusion to correctly identify the target vehicle--e.g., a red 2002 Ford truck identified in a police report--from a collection of diverse vehicle images. The imagery was abstracted into a common signature by first capturing the conceptual models of the imagery experts in software. Our system then (1) extracted fundamental features (e.g., wheel base, color), (2) made inferences about the information (e.g., it's a red Ford) and then (3) translated the raw information into an abstract knowledge signature that was designed to both capture the important features and account for uncertainty. Likewise, the conceptual models of text analysis experts were instantiated into software that was used to generate an abstract knowledge signature that could be readily compared to the imagery knowledge signature. While this experiment's primary focus was to demonstrate the power of text and imagery fusion for a specific example, it also suggested several ways that text and geo-registered imagery could be combined to help solve other types of problems.
A Qualitative Approach to Assessing Technological Pedagogical Content Knowledge
ERIC Educational Resources Information Center
Groth, Randall; Spickler, Donald; Bergner, Jennifer; Bardzell, Michael
2009-01-01
Because technological pedagogical content knowledge is becoming an increasingly important construct in the field of teacher education, there is a need for assessment mechanisms that capture teachers' development of this portion of the knowledge base for teaching. The paper describes a proposal drawing on qualitative data produced during lesson…
Model authoring system for fail safe analysis
NASA Technical Reports Server (NTRS)
Sikora, Scott E.
1990-01-01
The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Using established artificial intelligence and expert system techniques, circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object-oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which are then searched to generate either a graphical fault tree analysis or a failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
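The rule search described above can be sketched as a simple forward chain over IF-THEN behavior rules, tracing a failure mode to its downstream effects. The rules and component names below are invented for illustration; they are not taken from the actual frame-based knowledge base:

```python
# Hypothetical IF-THEN behavior rules: (if_condition, then_effect)
RULES = [
    ("resistor_R1_open", "amplifier_loses_bias"),
    ("amplifier_loses_bias", "output_stage_silent"),
    ("output_stage_silent", "telemetry_channel_lost"),
]

def effects_of(failure, rules):
    """Forward-chain: collect every effect reachable from `failure`."""
    found, frontier = set(), [failure]
    while frontier:
        cause = frontier.pop()
        for if_part, then_part in rules:
            if if_part == cause and then_part not in found:
                found.add(then_part)
                frontier.append(then_part)
    return found

print(sorted(effects_of("resistor_R1_open", RULES)))
# ['amplifier_loses_bias', 'output_stage_silent', 'telemetry_channel_lost']
```

Running the same search forward from each failure mode yields FMEA rows; running it backward from a top event yields the branches of a fault tree.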
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindsay, Haile; Garcia-Santos, Norma; Saverot, Pierre
2012-07-01
The U.S. Nuclear Regulatory Commission (NRC) was established in 1974 with the mission to license and regulate the civilian use of nuclear materials for commercial, industrial, academic, and medical uses in order to protect public health and safety and the environment, and to promote the common defense and security. Currently, approximately half (∼49%) of the workforce at the NRC has been with the Agency for less than six years. As part of the Agency's mission, the NRC has partial responsibility for the oversight of the transportation and storage of radioactive materials. The NRC has experienced a significant level of expertise leaving the Agency due to staff attrition. Factors that contribute to this attrition include retirement of the experienced nuclear workforce and mobility of staff within or outside the Agency. Several knowledge management (KM) initiatives have been implemented within the Agency, one of them being the formation of a Division of Spent Fuel Storage and Transportation (SFST) KM team. The team, which was formed in the fall of 2008, facilitates capturing, transferring, and documenting regulatory knowledge for staff to effectively perform their safety oversight of transportation and storage of radioactive materials, regulated under Title 10 of the Code of Federal Regulations (10 CFR) Part 71 and Part 72. In terms of KM, the SFST goal is to share critical information among the staff to reduce the impact of staff mobility and attrition.
KM strategies in place to achieve this goal are: (1) development of communities of practice (CoP) (SFST Qualification Journal and the Packaging and Storing Radioactive Material) in the on-line NRC Knowledge Center (NKC); (2) implementation of a SFST seminar program where the seminars are recorded and placed in the Agency's repository, Agency-wide Documents Access and Management System (ADAMS); (3) meeting of technical discipline group programs to share knowledge within specialty areas; (4) development of written guidance to capture 'administrative and technical' knowledge (e.g., office instructions (OIs), generic communications (e.g., bulletins, generic letters, regulatory issue summary), standard review plans (SRPs), interim staff guidance (ISGs)); (5) use of mentoring strategies for experienced staff to train new staff members; (6) use of Microsoft SharePoint portals in capturing, transferring, and documenting knowledge for staff across the Division from Division management and administrative assistants to the project managers, inspectors, and technical reviewers; and (7) development and implementation of a Division KM Plan. A discussion and description of the successes and challenges of implementing these KM strategies at the NRC/SFST will be provided. (authors)
What are the Real Risks of Knowing and Not Knowing - Leading Knowledge on Cyber
2014-06-01
Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22(1), 79-86. [73] Schreiber, T. (2000)... 'The active gathering and capture of information (and data) for testing (abducting, inducting and deducting) through social exchange' [4, 15-20]... Further research by the first author led to considerations of Fitness and Finessing: On Fitness: 'As a function of a system's ability to test its
Evidence from machines that learn and think like people.
Forbus, Kenneth D; Gentner, Dedre
2017-01-01
We agree with Lake et al.'s trenchant analysis of deep learning systems, including that they are highly brittle and that they need vastly more examples than do people. We also agree that human cognition relies heavily on structured relational representations. However, we differ in our analysis of human cognitive processing. We argue that (1) analogical comparison processes are central to human cognition; and (2) intuitive physical knowledge is captured by qualitative representations, rather than quantitative simulations.
Knippenberg, Els; Verbrugghe, Jonas; Lamers, Ilse; Palmaers, Steven; Timmermans, Annick; Spooren, Annemie
2017-06-24
Client-centred task-oriented training is important in neurological rehabilitation but is time consuming and costly in clinical practice. Technology, especially motion capture systems (MCS) that are low cost and easy to apply in clinical practice, may be used to support this kind of training, but knowledge and evidence of their use for training is scarce. The present review aims to investigate 1) which motion capture systems are used as training devices in neurological rehabilitation, 2) how they are applied, 3) in which target populations, 4) what the content of the training is, and 5) how efficacious training with MCS is. A computerised systematic literature review was conducted in four databases (PubMed, Cinahl, Cochrane Database and IEEE). The following MeSH terms and key words were used: Motion, Movement, Detection, Capture, Kinect, Rehabilitation, Nervous System Diseases, Multiple Sclerosis, Stroke, Spinal Cord, Parkinson Disease, Cerebral Palsy and Traumatic Brain Injury. The Van Tulder quality assessment was used to score the methodological quality of the selected studies. The descriptive analysis is reported by MCS, target population, training parameters and training efficacy. Eighteen studies were selected (mean Van Tulder score = 8.06 ± 3.67). Based on methodological quality, six studies were selected for analysis of training efficacy. The most commonly used MCS was the Microsoft Kinect; training was mostly conducted in upper limb stroke rehabilitation. Training programs varied in intensity, frequency and content. None of the studies reported an individualised training program based on a client-centred approach. Motion capture systems are training devices with potential in neurological rehabilitation to increase motivation during training, and they may assist improvement on one or more International Classification of Functioning, Disability and Health (ICF) levels.
Although client-centred task-oriented training is important in neurological rehabilitation, the client-centred approach was not included. Future technological developments should take up the challenge to combine MCS with the principles of a client-centred task-oriented approach and prove efficacy using randomised controlled trials with long-term follow-up. Prospero registration number 42016035582 .
Dearing, James W; Greene, Sarah M; Stewart, Walter F; Williams, Andrew E
2011-03-01
The improvement of health outcomes for both individual patients and entire populations requires improvement in the array of structures that support decisions and activities by healthcare practitioners. Yet, many gaps remain in how even sophisticated healthcare organizations manage knowledge. Here we describe the value of a trans-institutional network for identifying and capturing how-to knowledge that contributes to improved outcomes. Organizing and sharing on-the-job experience would concentrate and organize the activities of individual practitioners and subject their rapid cycle improvement testing and refinement to a form of collective intelligence for subsequent diffusion back through the network. We use the existing Cancer Research Network as an example of how a loosely structured consortium of healthcare delivery organizations could create and grow an implementation registry to foster innovation and implementation success by communicating what works, how, and which practitioners are using each innovation. We focus on the principles and parameters that could be used as a basis for infrastructure design. As experiential knowledge from across institutions builds within such a system, the system could ultimately motivate rapid learning and adoption of best practices. Implications for research about healthcare IT, invention, and organizational learning are discussed.
Dokas, Ioannis M; Panagiotakopoulos, Demetrios C
2006-08-01
The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
NASA Astrophysics Data System (ADS)
Chmiel, P.; Ganzha, M.; Jaworska, T.; Paprzycki, M.
2017-10-01
Nowadays, as part of the systematic growth in the volume and variety of information that can be found on the Internet, we also observe a dramatic increase in the size of available image collections. There are many ways to help users browse and select images of interest. One popular approach is Content-Based Image Retrieval (CBIR) systems, which allow users to search for images that match their interests, expressed in the form of images (query by example). However, we believe that image search and retrieval could take advantage of semantic technologies, and we have decided to test this hypothesis. Specifically, on the basis of knowledge captured in the CBIR system, we have developed a domain ontology of residential real estate (detached houses, in particular). This allows us to semantically represent each image (and its constitutive architectural elements) within the CBIR system. The proposed ontology was extended to capture not only the elements resulting from image segmentation, but also spatial relations between them. As a result, a new approach to querying the image database (semantic querying) has materialized, extending the capabilities of the developed system.
Ambulatory Healthcare Utilization in the United States: A System Dynamics Approach
NASA Technical Reports Server (NTRS)
Diaz, Rafael; Behr, Joshua G.; Tulpule, Mandar
2011-01-01
Ambulatory health care needs within the United States are served by a wide range of hospitals, clinics, and private practices. The Emergency Department (ED) functions as an important point of supply for ambulatory healthcare services. Growth in our aging population, as well as changes stemming from broader healthcare reform, is expected to continue the trend of congestion and increasing demand for ED services. While congestion is, in part, a manifestation of unmatched demand, the state of the alignment between the demand for, and supply of, emergency department services affects quality of care and profitability. The central focus of this research is to explain the salient factors at play within the dynamic demand-supply tension within which ambulatory care is provided in an Emergency Department. A System Dynamics (SD) simulation model is used to capture the complexities of the intricate balance and conditional effects at play within the demand-supply emergency department environment. Conceptual clarification of the forces driving the elements within the system, quantification of these elements, and empirical capture of the interaction among them provide actionable knowledge for operational and strategic decision-making.
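A stock-and-flow formulation of ED congestion of the kind the abstract describes can be sketched with a single patient stock and Euler integration. The arrival rates, bed counts, and treatment rates below are invented for illustration, not the model's calibrated values:

```python
def simulate_ed(hours, arrivals_per_hr, beds, treat_rate_per_bed, dt=0.25):
    """Single 'waiting patients' stock: inflow = arrivals,
    outflow = capacity-limited treatment rate (Euler integration)."""
    waiting, history = 0.0, []
    for _ in range(int(hours / dt)):
        treated_per_hr = min(waiting, beds) * treat_rate_per_bed
        waiting = max(waiting + (arrivals_per_hr - treated_per_hr) * dt, 0.0)
        history.append(waiting)
    return history

# Demand (10/hr) slightly exceeds capacity (8 beds x 1.2/hr = 9.6/hr),
# so the waiting-room census climbs steadily: congestion.
census = simulate_ed(hours=24, arrivals_per_hr=10, beds=8, treat_rate_per_bed=1.2)
print(round(census[-1], 1))
```

Even this toy version exhibits the core SD insight: congestion emerges from a small persistent mismatch between inflow and capacity-limited outflow, not from any single event.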
Fourth Conference on Artificial Intelligence for Space Applications
NASA Technical Reports Server (NTRS)
Odell, Stephen L. (Compiler); Denton, Judith S. (Compiler); Vereen, Mary (Compiler)
1988-01-01
Proceedings of a conference held in Huntsville, Alabama, on November 15-16, 1988. The Fourth Conference on Artificial Intelligence for Space Applications brings together diverse technical and scientific work in order to help those who employ AI methods in space applications to identify common goals and to address issues of general interest in the AI community. Topics include the following: space applications of expert systems in fault diagnostics, in telemetry monitoring and data collection, in design and systems integration, and in planning and scheduling; knowledge representation, capture, verification, and management; robotics and vision; adaptive learning; and automatic programming.
Calling on a million minds for community annotation in WikiProteins
Mons, Barend; Ashburner, Michael; Chichester, Christine; van Mulligen, Erik; Weeber, Marc; den Dunnen, Johan; van Ommen, Gert-Jan; Musen, Mark; Cockerill, Matthew; Hermjakob, Henning; Mons, Albert; Packer, Abel; Pacheco, Roberto; Lewis, Suzanna; Berkeley, Alfred; Melton, William; Barris, Nickolas; Wales, Jimmy; Meijssen, Gerard; Moeller, Erik; Roes, Peter Jan; Borner, Katy; Bairoch, Amos
2008-01-01
WikiProteins enables community annotation in a Wiki-based system. Extracts of major data sources have been fused into an editable environment that links out to the original sources. Data from community edits create automatic copies of the original data. Semantic technology captures concepts co-occurring in one sentence and thus potential factual statements. In addition, indirect associations between concepts have been calculated. We call on a 'million minds' to annotate a 'million concepts' and to collect facts from the literature with the reward of collaborative knowledge discovery. The system is available for beta testing at . PMID:18507872
NASA Astrophysics Data System (ADS)
Kuster, E.; Fox, G.
2016-12-01
Climate change is happening; scientists have already observed changes in sea level, increases in atmospheric carbon dioxide, and declining polar ice. The students of today are the leaders of tomorrow, and it is our duty to make sure they are well equipped and understand the implications of climate change for their research and professional careers. Graduate students, in particular, are gaining valuable and necessary research, leadership, and critical thinking skills, but we need to ensure that they are receiving the appropriate climate education in their graduate training. Previous studies have primarily focused on capturing K-12 students', college students', and the general public's knowledge of the climate system, concluding with recommendations on how to improve climate literacy in the classroom. While this is extremely important to study, very few studies have captured graduate students' current perception of the amount of climate education being offered to them. This information is important to capture, as it can inform future curriculum development. We developed and distributed a nationwide survey (495 respondents) for graduate students to capture their perception of the level of climate system education being offered and their view on the importance of having climate education. We also investigated differences in the responses based on geographic area or discipline, and compared how important graduate students felt it was to include climate education in their own discipline versus outside disciplines. The authors will discuss key findings from this ongoing research.
Using Ontologies to Formalize Services Specifications in Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann
2004-01-01
One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. An ontology is a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.
Statistical Physics of Complex Substitutive Systems
NASA Astrophysics Data System (ADS)
Jin, Qing
Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs--forcing scientists to choose, for example, a deterministic or probabilistic worldview, as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins and insights of the parameters in the substitution model and possible generalization form of the mathematical framework. The systematical study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitutions, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.
Key Provenance of Earth Science Observational Data Products
NASA Astrophysics Data System (ADS)
Conover, H.; Plale, B.; Aktas, M.; Ramachandran, R.; Purohit, P.; Jensen, S.; Graves, S. J.
2011-12-01
As the sheer volume of data increases, particularly evidenced in the earth and environmental sciences, local arrangements for sharing data need to be replaced with reliable records about the what, who, how, and where of a data set or collection. This is frequently called the provenance of a data set. While observational data processing systems in the earth sciences have a long history of capturing metadata about the processing pipeline, current processes are limited in both what is captured and how it is disseminated to the science community. Provenance capture plays a role in scientific data preservation and stewardship precisely because it can automatically capture and represent a coherent picture of the what, how and who of a particular scientific collection. It reflects the transformations that a data collection underwent prior to its current form and the sequence of tasks that were executed and data products applied to generate a new product. In the NASA-funded Instant Karma project, we examine provenance capture in earth science applications, specifically the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) Science Investigator-led Processing system (SIPS). The project is integrating the Karma provenance collection and representation tool into the AMSR-E SIPS production environment, with an initial focus on Sea Ice. This presentation will describe capture and representation of provenance that is guided by the Open Provenance Model (OPM). Several things have become clear during the course of the project to date. One is that core OPM entities and relationships are not adequate for expressing the kinds of provenance that is of interest in the science domain. OPM supports name-value pair annotations that can be used to augment what is known about the provenance entities and relationships, but in Karma, annotations cannot be added during capture, but only after the fact. 
This limits the capture system's ability to record something it learned about an entity after the event of its creation in the provenance record. We will discuss extensions to the Open Provenance Model (OPM) and modifications to the Karma tool suite to address this issue, more efficient representations of earth science provenance, and the definition of metadata structures for capturing related knowledge about the data products and the science algorithms used to generate them. Use scenarios for provenance information are an active topic of investigation. It has additionally become clear through the project that not all provenance is created equal. In processing pipelines, some provenance is repetitive and uninteresting; because of the volume of provenance, this obscures the interesting pieces. Methodologies to reveal science-relevant provenance will be presented, along with a preview of the AMSR-E Provenance Browser.
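The OPM-style structure discussed above, artifacts, processes, `used`/`wasGeneratedBy` edges, and annotations attached after capture, can be sketched in a few lines. The node names are illustrative, not the actual AMSR-E product identifiers:

```python
# Minimal sketch of an OPM-style provenance record. Artifacts and
# processes are nodes; "used" and "wasGeneratedBy" are directed edges;
# annotations are free-form name-value pairs added after capture.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                      # "artifact" or "process"
    name: str
    annotations: dict = field(default_factory=dict)

edges = []                         # (relation, source, target)

l2a = Node("artifact", "L2A_brightness_temps")       # input product (hypothetical name)
algo = Node("process", "sea_ice_concentration_v11")  # processing step (hypothetical name)
l3 = Node("artifact", "L3_sea_ice_12km")             # output product (hypothetical name)

edges.append(("used", algo, l2a))                    # the process consumed the input
edges.append(("wasGeneratedBy", l3, algo))           # the output came from the process

# Post-hoc annotation: something learned after the capture event,
# which core OPM (as noted above) does not handle during capture.
l3.annotations["quality_flag"] = "reprocessed"

generated_by = [(a.name, p.name) for r, a, p in edges if r == "wasGeneratedBy"]
print(generated_by)  # [('L3_sea_ice_12km', 'sea_ice_concentration_v11')]
```

Walking the `used` edges backward from any product reconstructs the what/how/who lineage picture the abstract describes.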
Fuzzy logic techniques for rendezvous and docking of two geostationary satellites
NASA Technical Reports Server (NTRS)
Ortega, Guillermo
1995-01-01
Large assemblies in space require the ability to manage rendezvous and docking operations. In the future, these techniques will be required for the gradual build-up of large telecommunication platforms in geostationary orbit. The paper discusses the use of fuzzy logic to model and implement a control system for the docking/berthing of two satellites in geostationary orbit. The system, mounted in a chaser vehicle, determines the actual state of both satellites and generates torques to execute maneuvers to establish the structural latching. The paper describes the proximity operations to collocate the two satellites in the same orbital window, the fuzzy guidance and navigation of the chaser approaching the target, and the final fuzzy berthing. The fuzzy logic system represents a knowledge-based controller that performs the closed-loop operations autonomously, replacing conventional control algorithms. The goal is to produce smooth control actions in the proximity of the target and during docking, to avoid disturbance torques in the final assembly orbit. The knowledge of the fuzzy controller consists of a database of rules and the definitions of the fuzzy sets. The knowledge of an experienced spacecraft controller is captured in a set of rules forming the rules database.
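As a rough illustration of the kind of rule-based control the paper describes, the sketch below implements a one-input fuzzy controller with triangular membership functions and centroid defuzzification. The breakpoints and thrust levels are invented for illustration and are not parameters from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_thrust(range_err):
    """Map a range error (m) to a normalized thrust command via three rules:
    IF error is NEAR THEN thrust low; MID -> moderate; FAR -> high.
    All numeric values are illustrative, not flight parameters."""
    strengths = [tri(range_err, -1.0, 0.0, 5.0),   # NEAR
                 tri(range_err,  0.0, 5.0, 10.0),  # MID
                 tri(range_err,  5.0, 10.0, 20.0)] # FAR
    levels = [0.0, 0.4, 1.0]                       # rule consequents
    total = sum(strengths)
    if total == 0.0:
        return 0.0
    # Centroid (weighted-average) defuzzification
    return sum(s * l for s, l in zip(strengths, levels)) / total
```

Because neighboring memberships overlap, the command varies smoothly with the error, which is the property the paper seeks during proximity operations.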
Computational knowledge integration in biopharmaceutical research.
Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim
2003-09-01
An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.
Knowledge-based assistance in costing the space station DMS
NASA Technical Reports Server (NTRS)
Henson, Troy; Rone, Kyle
1988-01-01
The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.
Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping
2018-01-01
Motivation: Mathematical models occupy an important place in science and engineering. A model can help scientists explain the dynamic behavior of a system and understand the functionality of its components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as structurally simple and parameter-free logical models for gene regulatory networks, have attracted the interest of many scientists. In order to fit the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, existing identification approaches can only deal with a subset of possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing property, and positive and negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem to reveal, in a computationally efficient way, the system matrix of the Boolean model whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge, the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for small time series data sets and data without sufficient stimuli. The proposed approach is illustrated with a biological model of the network of oxidative stress response.
Conclusions: The combination of efficient reformulation of the identification problem with the possibility to incorporate various types of prior knowledge enables the application of computational model inference to systems with limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
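The vector form and semi-tensor product (STP) mentioned in this abstract can be illustrated in a few lines. Under the usual STP convention, True is encoded as [1,0]^T and False as [0,1]^T; for these dimensions the STP reduces to a Kronecker product, and the structure matrix below is the standard one for logical AND, chosen here as a simple example (this is a generic sketch, not the paper's ILP formulation):

```python
import numpy as np

# Vector form of Boolean values: True -> [1,0]^T, False -> [0,1]^T
T = np.array([1, 0])
F = np.array([0, 1])

def stp(x, y):
    """Semi-tensor product; for a pair of Boolean 2-vectors it reduces
    to the Kronecker product, yielding a 4-vector indexing (x, y)."""
    return np.kron(x, y)

# Structure matrix of AND: maps kron(x, y) to the vector form of x AND y.
# Columns correspond to (T,T), (T,F), (F,T), (F,F).
M_and = np.array([[1, 0, 0, 0],
                  [0, 1, 1, 1]])

def AND(x, y):
    return M_and @ stp(x, y)
```

Identification then amounts to recovering an unknown structure matrix like `M_and` from observed state transitions, which is what the ILP reformulation makes efficient.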
ERIC Educational Resources Information Center
Charalambous, Charalambos Y.
2016-01-01
Central in the frameworks proposed to capture the knowledge needed for teaching mathematics is the assumption that teachers need more than pure subject-matter knowledge. Validation studies exploring this assumption by recruiting contrasting populations are relatively scarce. Drawing on a sample of 644 Greek-Cypriot preservice and inservice…
Knowledge as a Resource--Networks Do Matter: A Study of SME Firms in Rural Illinois.
ERIC Educational Resources Information Center
Solymossy, Emeric
2000-01-01
Networks among people and businesses facilitate the capture and diffusion of technical and organizational knowledge and can be classified by type of knowledge being exchanged. Types include buyer-supplier information, technical problem-solving information, and informal community information. A survey of 141 small and medium-sized enterprises…
ERIC Educational Resources Information Center
Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry
2015-01-01
The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…
A Knowledge Base for FIA Data Uses
Victor A. Rudis
2005-01-01
Knowledge management provides a way to capture the collective wisdom of an organization, facilitate organizational learning, and foster opportunities for improvement. This paper describes a knowledge base compiled from uses of field observations made by the U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis program and a citation database of...
Wilkins, J J; Chan, Pls; Chard, J; Smith, G; Smith, M K; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, M L; Wang, E; Watson, E; Wolstencroft, K; Cheung, Sya
2017-05-01
Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error-prone, and time-consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model-informed drug discovery and development (MID3), as well as to support reproducibility: "Thoughtflow." A prototype software implementation is provided. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
Reducing the cognitive workload: Trouble managing power systems
NASA Technical Reports Server (NTRS)
Manner, David B.; Liberman, Eugene M.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
The complexity of space-based systems makes monitoring them and diagnosing their faults taxing for human beings. Mission control operators are well-trained experts, but they cannot afford to have their attention diverted by extraneous information. Under normal operating conditions, merely monitoring the status of a complex system's components is a substantial task; when a problem arises, immediate attention and quick resolution are mandatory. To aid humans in these endeavors, we have developed an automated advisory system. Our advisory expert system, Trouble, incorporates the knowledge of the power system designers for Space Station Freedom. Trouble is designed to be a ground-based advisor for the mission controllers in the Control Center Complex at Johnson Space Center (JSC). It has been developed at NASA Lewis Research Center (LeRC) and tested in conjunction with prototype flight hardware contained in the Power Management and Distribution testbed and the Engineering Support Center (ESC) at LeRC. Our work will culminate with the adoption of these techniques by the mission controllers at JSC. This paper elucidates how we have captured power system failure knowledge, how we have built and tested our expert system, and what we believe are its potential uses.
Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing
NASA Technical Reports Server (NTRS)
Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau
2005-01-01
The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user-defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far, with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
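The declarative pattern described (user-defined conditions firing notifications) can be sketched in a few lines of Python. The parameter names and thresholds below are invented for illustration; the actual agent expresses its rules in Jess over the Shuttle telemetry stream:

```python
# Each rule pairs a predicate over a telemetry sample with an alert message.
# Parameter names and limits are illustrative, not actual Shuttle values.
RULES = [
    (lambda t: t["hyd_press_psi"] < 2800, "Hydraulic pressure below limit"),
    (lambda t: t["fuel_cell_temp_f"] > 250, "Fuel cell over-temperature"),
]

def monitor(sample):
    """Return alert messages for every rule whose condition is satisfied."""
    return [msg for cond, msg in RULES if cond(sample)]

alerts = monitor({"hyd_press_psi": 2650, "fuel_cell_temp_f": 210})
```

New monitoring conditions are added by appending to the rule list rather than modifying control flow, which is the modularity benefit the declarative paradigm provides.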
Thiele, Ines; Hyduke, Daniel R; Steeb, Benjamin; Fankam, Guy; Allen, Douglas K; Bazzani, Susanna; Charusanti, Pep; Chen, Feng-Chi; Fleming, Ronan M T; Hsiung, Chao A; De Keersmaecker, Sigrid C J; Liao, Yu-Chieh; Marchal, Kathleen; Mo, Monica L; Özdemir, Emre; Raghunathan, Anu; Reed, Jennifer L; Shin, Sook-il; Sigurbjörnsdóttir, Sara; Steinmann, Jonas; Sudarsan, Suresh; Swainston, Neil; Thijs, Inge M; Zengler, Karsten; Palsson, Bernhard O; Adkins, Joshua N; Bumann, Dirk
2011-01-18
Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen that causes various diseases, and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Taken together, with the growing number of parallel MRs, a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.
Describing content in middle school science curricula
NASA Astrophysics Data System (ADS)
Schwarz-Ballard, Jennifer A.
As researchers and designers, we intuitively recognize differences between curricula and describe them in terms of design strategy: project-based, laboratory-based, modular, traditional, and textbook, among others. We assume that practitioners recognize the differences in how each requires that students use knowledge; however, these intuitive differences have not been captured or systematically described by the existing languages for describing learning goals. In this dissertation I argue that we need new ways of capturing relationships among elements of content, and propose a theory that describes some of the important differences in how students reason in differently designed curricula and activities. Educational researchers and curriculum designers have taken a variety of approaches to laying out learning goals for science. Through an analysis of existing descriptions of learning goals, I argue that to describe differences in the understanding students come away with, such descriptions need to (1) be specific about the form of knowledge, (2) incorporate both the processes through which knowledge is used and its form, and (3) capture content development across a curriculum. To show the value of inquiry curricula, learning goals need to incorporate distinctions among the variety of ways we ask students to use knowledge. Here I propose the Epistemic Structures Framework as one way to describe differences in students' reasoning that are not captured by existing descriptions of learning goals. The usefulness of the Epistemic Structures Framework is demonstrated in the four curriculum case study examples in Part II of this work. The curricula in the case studies represent a range of content coverage, curriculum structure, and design rationale. They serve both to illustrate the Epistemic Structures analysis process and to make the case that it does in fact describe learning goals in a way that captures important differences in students' reasoning in differently designed curricula.
Describing learning goals in terms of Epistemic Structures provides one way to define what we mean when we talk about "project-based" curricula and demonstrate its "value added" to educators, administrators and policy makers.
The role of fish in a globally changing food system
Lynch, Abigail J.; MacMillan, J. Randy
2017-01-01
Though humans have been fishing for food since they first created tools to hunt, modern food systems are predominantly terrestrial-focused and fish are frequently overlooked. Yet, within the global food system, fish play an important role in meeting current and future food needs. Capture fisheries are the last large-scale “wild” food, and aquaculture is the fastest growing food production sector in the world. Currently, capture fisheries and aquaculture provide 4.3 billion people with at least 15% of their animal protein. In addition to providing protein and calories, fish are important sources of critical vitamins and vital nutrients that are difficult to acquire through other food sources. As the climate changes, human populations will continue to grow, cultural tastes will evolve, and fish populations will respond. Sustainable fisheries and aquaculture are poised to fill demand for food not met by terrestrial food systems. Climate change and other global changes will increase, decrease, or modify many wild fish populations and aquaculture systems. Understanding the knowledge gaps around these implications for global change on fish production is critical. Applied research and adaptive management techniques can assist with the necessary evolution of sustainable food systems to include a stronger emphasis on fish and other aquatic organisms.
A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation (Goal-Decision Network) to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. Also, the negotiation framework is applied to the problem of planning for cogeneration interconnection. The simulation results are presented to illustrate the cogeneration planning process.
2015-04-16
This nearly cloud-free image of Iceland was captured by the MODIS instrument on board the Terra spacecraft on 04/15/2015 at 13:00 UTC. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.
2017-12-08
On Nov. 22, 2015 at 19:15 UTC the MODIS instrument aboard NASA's Aqua satellite captured this image of snow across the Midwest. Credit: NASA Goddard MODIS Rapid Response Team
Reasoning about energy in qualitative simulation
NASA Technical Reports Server (NTRS)
Fouche, Pierre; Kuipers, Benjamin J.
1992-01-01
While possible behaviors of a mechanism that are consistent with an incomplete state of knowledge can be predicted through qualitative modeling and simulation, spurious behaviors corresponding to no solution of any ordinary differential equation consistent with the model may be generated. The present method for energy-related reasoning eliminates an important source of spurious behaviors, as demonstrated by its application to a nonlinear, proportional-integral (PI) controller. It is shown that qualitative properties of such a system, such as stability and zero-offset control, are captured by the simulation.
The Physician's Workstation: Recording a Physical Examination Using a Controlled Vocabulary
Cimino, James J.; Barnett, G. Octo
1987-01-01
A system has been developed which runs on MS-DOS personal computers and serves as an experimental model of a physician's workstation. The program provides an interface to a controlled vocabulary which allows rapid selection of appropriate terms and modifiers for entry of clinical information. Because it captures patient descriptions, it has the ability to serve as an intermediary between the physician and computer-based medical knowledge resources. At present, the vocabulary permits rapid, reliable representation of cardiac physical examination findings.
2017-12-08
NASA image captured August 31, 2000. The tongue of the Malaspina Glacier, the largest glacier in Alaska, fills most of this image. The Malaspina lies west of Yakutat Bay and covers 1,500 sq mi (3,880 sq km). Credit: NASA/Landsat
Building customer capital through knowledge management processes in the health care context.
Liu, Sandra S; Lin, Carol Yuh-Yun
2007-01-01
Customer capital is a value generated and an asset developed from customer relationships. Successfully managing these relationships is enhanced by a knowledge management (KM) infrastructure that captures and transfers customer-related knowledge. The execution of such a system relies on the vision and determination of the top management team (TMT). In today's knowledge economy, the health care industry encounters challenges of consumerism similar to those faced by the business sector. Developing customer capital is critical for hospitals to remain competitive in the market. This study aims to provide a taxonomy for cultivating market-based organizational learning that leads to the building of customer capital and the attainment of desirable financial performance in health care. With the advancement of technology, the KM system plays an important moderating role in the entire process. The customer capital issue has not been fully explored in either the business or the health care industry. The exploratory nature of such a pursuit calls for a qualitative approach. This study examines the proposed taxonomy with the case hospital; the lessons learned are also reflected against three US-based health networks. The TMT incorporated the knowledge processes of conceptualization and transformation into the organizational mission. The market-oriented learning approach promoted by the TMT helps with the accumulation and sharing of knowledge that prepares the hospital for the dynamics of the marketplace. Its key knowledge advancement relies on both the professional arena and customer feedback. The institutionalization of the KM system and organizational culture expands the hospital's customer capital.
The implication is twofold: (1) the TMT is imperative for the success of building customer capital through KM process; and (2) the team effort should be enhanced with a learning culture and sharing spirit, in particular, active nurse participation in decision making and frontline staff's role in providing a delightfully surprising patient experience.
NASA Technical Reports Server (NTRS)
Butler, D. J.; Kerstman, E.; Saile, L.; Myers, J.; Walton, M.; Lopez, V.; McGrath, T.
2011-01-01
The Integrated Medical Model (IMM) captures organizational knowledge across the space medicine, training, operations, engineering, and research domains. IMM uses this knowledge in the context of a mission and crew profile to forecast risks to crew health and mission success. The IMM establishes a quantified, statistical relationship among medical conditions, risk factors, available medical resources, and crew health and mission outcomes. These relationships may provide an appropriate foundation for developing an in-flight medical decision support tool that helps optimize the use of medical resources and assists in overall crew health management by an autonomous crew with extremely limited interactions with ground support personnel and no chance of resupply.
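The kind of probabilistic forecast the IMM produces can be gestured at with a Monte Carlo sketch. The per-day event probability, the independence assumption, and all numbers below are purely illustrative; the actual model quantifies relationships among many medical conditions, risk factors, and resources:

```python
import random

def mission_risk(p_event_per_day, days, n_crew, trials=10_000, seed=1):
    """Monte Carlo estimate of the chance that at least one medical event
    occurs during a mission -- an illustrative sketch of IMM-style
    probabilistic forecasting, not the model's actual structure."""
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        # One trial: simulate each crew-day as an independent Bernoulli draw.
        if any(random.random() < p_event_per_day
               for _ in range(days * n_crew)):
            hits += 1
    return hits / trials
```

For independent crew-days the estimate should approach the closed form 1 - (1 - p)^(days * crew); a real model would add condition-specific rates, resource constraints, and outcome severities.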
2015-01-01
Strengthening capacity in poorer countries to generate multi-disciplinary health research and to utilise research findings is one of the most effective ways of advancing those countries' health and development. This paper explores current knowledge about how to design health research capacity strengthening (RCS) programmes and how to measure their progress and impact. It describes a systematic, evidence-based approach for designing such programmes and highlights some of the key challenges that will be faced in the next 10 years. These include designing and implementing common frameworks to facilitate comparisons among capacity strengthening projects, and developing monitoring indicators that can capture their interactions with knowledge users and their impact on changes in health systems. PMID:28281707
Reputation-based collaborative network biology.
Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Fields, R Brett; Hayes, William; Hoeng, Julia; Park, Jennifer S; Peitsch, Manuel C
2015-01-01
A pilot reputation-based collaborative network biology platform, Bionet, was developed for use in the sbv IMPROVER Network Verification Challenge to verify and enhance previously developed networks describing key aspects of lung biology. Bionet was successful in capturing a more comprehensive view of the biology associated with each network using the collective intelligence and knowledge of the crowd. One key learning point from the pilot was that using a standardized biological knowledge representation language such as BEL is critical to the success of a collaborative network biology platform. Overall, Bionet demonstrated that this approach to collaborative network biology is highly viable. Improving this platform for de novo creation of biological networks and network curation with the suggested enhancements for scalability will serve both academic and industry systems biology communities.
Intelligent systems technology infrastructure for integrated systems
NASA Technical Reports Server (NTRS)
Lum, Henry
1991-01-01
A system infrastructure must be properly designed and integrated from the conceptual development phase to accommodate evolutionary intelligent technologies. Several technology development activities were identified that may have application to rendezvous and capture systems. Optical correlators in conjunction with fuzzy logic control might be used for the identification, tracking, and capture of either cooperative or non-cooperative targets without the intensive computational requirements associated with vision processing. A hybrid digital/analog system was developed and tested with a robotic arm. An aircraft refueling application demonstration is planned within two years. Initially this demonstration will be ground based with a follow-on air based demonstration. System dependability measurement and modeling techniques are being developed for fault management applications. This involves usage of incremental solution/evaluation techniques and modularized systems to facilitate reuse and to take advantage of natural partitions in system models. Though not yet commercially available and currently subject to accuracy limitations, technology is being developed to perform optical matrix operations to enhance computational speed. Optical terrain recognition using camera image sequencing processed with optical correlators is being developed to determine position and velocity in support of lander guidance. The system is planned for testing in conjunction with Dryden Flight Research Facility. Advanced architecture technology is defining open architecture design constraints, test bed concepts (processors, multiple hardware/software and multi-dimensional user support, knowledge/tool sharing infrastructure), and software engineering interface issues.
Fernández, María S; Fraschina, Jimena; Acardi, Soraya; Liotta, Domingo J; Lestani, Eduardo; Giuliani, Magalí; Busch, María; Salomón, O Daniel
2018-02-01
To contribute to the knowledge of the role of small mammals in the transmission cycle of tegumentary leishmaniasis caused by Leishmania braziliensis, we studied the small mammal community and its temporal and spatial association with phlebotominae, as well as small mammal infection by Leishmania spp., using PCR-RFLP analyses in an endemic area of northeastern Argentina. Ten small mammal samplings were conducted (2007-2009; 7506 Sherman trap nights and 422 cage trap nights). In two of these samplings, 16 capture stations were placed, each consisting of a CDC light trap to capture phlebotominae, two to four Sherman traps, and two cage traps. We found co-occurrence of phlebotominae and small mammal captures in four stations, which were all the stations with small mammal captures and yielded 97% (2295 specimens, including 21 gravid females) of the total phlebotominae captures, suggesting that small mammals may provide a potential source of blood for phlebotominae females. One Didelphis albiventris and two Rattus rattus were associated with high captures of Nyssomyia whitmani, the vector of L. braziliensis in the study area. The PCR-RFLP analyses confirm the presence of L. braziliensis in two sigmodontine small mammals (Akodon sp. and Euryoryzomys russatus) for the first time in Argentina, to our knowledge.
A Data-Driven Solution for Performance Improvement
NASA Technical Reports Server (NTRS)
2002-01-01
Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
Normal morphogenesis of epithelial tissues and progression of epithelial tumors
Wang, Chun-Chao; Jamal, Leen; Janes, Kevin A.
2011-01-01
Epithelial cells organize into various tissue architectures that largely maintain their structure throughout the life of an organism. For decades, the morphogenesis of epithelial tissues has fascinated scientists at the interface of cell, developmental, and molecular biology. Systems biology offers ways to combine knowledge from these disciplines by building integrative models that are quantitative and predictive. Can such models be useful for gaining a deeper understanding of epithelial morphogenesis? Here, we take inventory of some recurring themes in epithelial morphogenesis that systems approaches could strive to capture. Predictive understanding of morphogenesis at the systems level would prove especially valuable for diseases such as cancer, where epithelial tissue architecture is profoundly disrupted. PMID:21898857
NASA's SDO Satellite Captures Venus Transit Approach
2012-06-05
NASA image captured June 5, 2012 at 212357 UTC (about 5:24 p.m. EDT). On June 5-6 2012, SDO is collecting images of one of the rarest predictable solar events: the transit of Venus across the face of the sun. This event happens in pairs eight years apart that are separated from each other by 105 or 121 years. The last transit was in 2004 and the next will not happen until 2117. This image was captured by SDO's AIA instrument at 193 Angstroms. Credit: NASA/SDO, AIA To read more about the 2012 Venus Transit go to: sunearthday.nasa.gov/transitofvenus Add your photos of the Transit of Venus to our Flickr Group here: www.flickr.com/groups/venustransit/ NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission. Follow us on Twitter Like us on Facebook Find us on Instagram
Developing a geoscience knowledge framework for a national geological survey organisation
NASA Astrophysics Data System (ADS)
Howard, Andrew S.; Hatton, Bill; Reitsma, Femke; Lawrie, Ken I. G.
2009-04-01
Geological survey organisations (GSOs) are established by most nations to provide a geoscience knowledge base for effective decision-making on mitigating the impacts of natural hazards and global change, and on sustainable management of natural resources. The value of the knowledge base as a national asset is continually enhanced by the exchange of knowledge between GSOs as data and information providers and the stakeholder community as knowledge 'users and exploiters'. Geological maps and associated narrative texts typically form the core of national geoscience knowledge bases, but have some inherent limitations as methods of capturing and articulating knowledge. Much knowledge about the three-dimensional (3D) spatial interpretation and its derivation and uncertainty, and the wider contextual value of the knowledge, remains intangible in the minds of the mapping geologist in implicit and tacit form. To realise the value of these knowledge assets, the British Geological Survey (BGS) has established a workflow-based cyber-infrastructure to enhance its knowledge management and exchange capability. Future geoscience surveys in the BGS will contribute to a national, 3D digital knowledge base on UK geology, with the associated implicit and tacit information captured as metadata, qualitative assessments of uncertainty, and documented workflows and best practice. Knowledge-based decision-making at all levels of society requires both the accessibility and reliability of knowledge to be enhanced in the grid-based world. Establishment of collaborative cyber-infrastructures and ontologies for geoscience knowledge management and exchange will ensure that GSOs, as knowledge-based organisations, can make their contribution to this wider goal.
E-Learning as a Knowledge Management Approach for Intellectual Capital Utilization
ERIC Educational Resources Information Center
Shehabat, Issa; Mahdi, Saad A.; Khoualdi, Kamel
2008-01-01
This paper addresses human resources utilization in the university environment. We address the design issues of e-learning courses that can capture the teacher's knowledge. The underlying premise is that e-learning is a key knowledge asset and a major resource for many universities. Therefore, the design of e-learning should be an important part of the…
NASA Technical Reports Server (NTRS)
Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.
1991-01-01
The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities and, in particular, information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works in terms of process descriptions. A special-purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.
An incremental knowledge assimilation system (IKAS) for mine detection
NASA Astrophysics Data System (ADS)
Porway, Jake; Raju, Chaitanya; Varadarajan, Karthik Mahesh; Nguyen, Hieu; Yadegar, Joseph
2010-04-01
In this paper we present an adaptive incremental learning system for underwater mine detection and classification that utilizes statistical models of seabed texture and an adaptive nearest-neighbor classifier to identify varied underwater targets in many different environments. The first stage of processing uses our Background Adaptive ANomaly detector (BAAN), which identifies statistically likely target regions using Gabor filter responses over the image. Using this information, BAAN classifies the background type and updates its detection using background-specific parameters. To perform classification, a Fully Adaptive Nearest Neighbor (FAAN) determines the best label for each detection. FAAN uses an extremely fast version of Nearest Neighbor to find the most likely label for the target. The classifier perpetually assimilates new and relevant information into its existing knowledge database in an incremental fashion, allowing improved classification accuracy and capturing concept drift in the target classes. Experiments show that the system achieves >90% classification accuracy on underwater mine detection tasks performed on synthesized datasets provided by the Office of Naval Research. We have also demonstrated that the system can incrementally improve its detection accuracy by constantly learning from new samples.
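The incremental assimilation loop described above can be pictured with a minimal nearest-neighbour sketch: each detection is labelled by its closest stored exemplar, and the labelled sample is then folded back into the knowledge base so later queries reflect concept drift. All class and variable names, and the toy two-dimensional features, are illustrative assumptions rather than the IKAS/FAAN implementation:

```python
# Minimal sketch of incremental 1-nearest-neighbour classification with
# assimilation of new samples (hypothetical names; not the FAAN code).
import math

class IncrementalNN:
    def __init__(self):
        self.exemplars = []  # list of (feature_vector, label) pairs

    def classify(self, x):
        # 1-NN: return the label of the stored exemplar nearest to x
        best = min(self.exemplars, key=lambda e: math.dist(e[0], x))
        return best[1]

    def assimilate(self, x, label):
        # incremental update: fold the labelled sample into the knowledge base
        self.exemplars.append((x, label))

clf = IncrementalNN()
clf.assimilate([0.1, 0.2], "seabed")
clf.assimilate([0.9, 0.8], "mine")
print(clf.classify([0.85, 0.75]))   # nearest stored exemplar is "mine"
clf.assimilate([0.5, 0.5], "mine")  # drifting target statistics get folded in
print(clf.classify([0.45, 0.55]))
```

A production system would add the background-specific detection stage and a fast neighbour index; the point here is only the classify-then-assimilate cycle that lets accuracy improve with each new sample.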
Safety and Mission Assurance Knowledge Management Retention
NASA Technical Reports Server (NTRS)
Johnson, Teresa A.
2006-01-01
This viewgraph presentation reviews the issues surrounding the management of knowledge with regard to safety and mission assurance. The JSC workers who were hired in the 1960s are slated to retire in the next two to three years. The experiences and knowledge of these NASA workers must be identified and disseminated. This paper reviews some of the strategies that S&MA is developing to capture that valuable institutional knowledge.
Job Knowledge Test Design: A Cognitively-Oriented Approach
1993-07-01
protocol analyses and related methods. We employed a plan-goal graph representation to capture the knowledge content and goal structure of the studied task...between job knowledge and hands-on performance from previous studies was .38. For the subset of Marines in this sample who had recently been examined...the job knowledge test provided similar results to conventional, total number correct scoring. Conclusion The evidence provided by this study supports
Photography Basics. Capturing the Essence of Physical Education and Sport Programs.
ERIC Educational Resources Information Center
Kluka, Darlene A.; Mitchell, Carolyn B.
1990-01-01
The physical educator or coach may be responsible for marketing programs to the public, and skill in 35mm photography can help. Ingredients necessary for successful 35mm movement photography are discussed: knowledge of the movement and the appropriate equipment; techniques for capturing movement; positioning for the ultimate shot; and practice.…
Student Perceptions of Online Tutoring Videos
ERIC Educational Resources Information Center
Sligar, Steven R.; Pelletier, Christopher D.; Bonner, Heidi Stone; Coghill, Elizabeth; Guberman, Daniel; Zeng, Xiaoming; Newman, Joyce J.; Muller, Dorothy; Dennis, Allen
2017-01-01
Online tutoring is made possible by using videos to replace or supplement face-to-face services. The purpose of this research was to examine student reactions to the use of lecture capture technology in a university tutoring setting and to assess student knowledge of some features of Tegrity lecture capture software. A survey was administered to…
Knowledge of healthcare professionals about rights of patient’s images
Caires, Bianca Rodrigues; Lopes, Maria Carolina Barbosa Teixeira; Okuno, Meiry Fernanda Pinto; Vancini-Campanharo, Cássia Regina; Batista, Ruth Ester Assayag
2015-01-01
Objective To assess knowledge of healthcare professionals about capture and reproduction of images of patients in a hospital setting. Methods A cross-sectional and observational study among 360 healthcare professionals (nursing staff, physical therapists, and physicians) working at a teaching hospital in the city of São Paulo (SP). A questionnaire with sociodemographic information was distributed and data were correlated to capture and reproduction of images at hospitals. Results Of the 360 respondents, 142 had captured images of patients in the last year, and 312 reported seeing other professionals taking photographs of patients. Of the participants who captured images, 61 said they used them for studies and presentation of clinical cases, and 168 professionals reported not knowing of any legislation in the Brazilian Penal Code regarding collection and use of images. Conclusion There is a gap in the training of healthcare professionals regarding the use of patients' images. It is necessary to include subjects that address this theme in the syllabus of undergraduate courses, and healthcare organizations should regulate this issue. PMID:26267838
Planning bioinformatics workflows using an expert system.
Chen, Xiaoling; Chang, Jeffrey T
2017-04-15
Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprised of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Planning bioinformatics workflows using an expert system
Chen, Xiaoling; Chang, Jeffrey T.
2017-01-01
Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprised of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
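The backward-chaining idea behind a planner like BETSY can be sketched in a few lines: each rule states which data type a tool consumes and produces, and the engine recurses from the requested output back to the available inputs, assembling the tool sequence on the way out. The rule names and data types below are illustrative assumptions, not BETSY's actual knowledge base:

```python
# Toy backward-chaining workflow planner (hypothetical rules and tool names).
RULES = [
    # (produces, requires, tool)
    ("aligned_reads", "fastq", "align_reads"),
    ("sorted_bam", "aligned_reads", "sort_bam"),
    ("variant_calls", "sorted_bam", "call_variants"),
]

def plan(goal, available):
    """Return the ordered list of tools needed to derive `goal`,
    or None if no chain of rules reaches the available inputs."""
    if goal in available:
        return []  # base case: the data already exists
    for produces, requires, tool in RULES:
        if produces == goal:
            sub = plan(requires, available)  # chain backwards from the goal
            if sub is not None:
                return sub + [tool]
    return None

print(plan("variant_calls", {"fastq"}))
# → ['align_reads', 'sort_bam', 'call_variants']
```

A real system also reasons over tool parameters and attribute constraints on each data type; this sketch shows only the goal-driven recursion that turns a declarative knowledge base into an ordered workflow.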
NASA Astrophysics Data System (ADS)
Topousis, Daria E.; Dennehy, Cornelius J.; Lebsock, Kenneth L.
2012-12-01
Historically, engineers at the National Aeronautics and Space Administration (NASA) had few opportunities or incentives to share their technical expertise across the Agency. Its center- and project-focused culture often meant that knowledge never left organizational and geographic boundaries. The need to develop a knowledge sharing culture became critical as a result of increasingly complex missions, closeout of the Shuttle Program, and a new generation of engineers entering the workforce. To address this need, the Office of the Chief Engineer established communities of practice on the NASA Engineering Network. These communities were strategically aligned with NASA's core competencies in such disciplines as avionics, flight mechanics, life support, propulsion, structures, loads and dynamics, human factors, and guidance, navigation, and control. This paper is a case study of NASA's implementation of a system that would identify and develop communities, from establishing simple websites that compiled discipline-specific resources to fostering a knowledge-sharing environment through collaborative and interactive technologies. It includes qualitative evidence of improved availability and transfer of knowledge. It focuses on capabilities that increased knowledge exchange such as a custom-made Ask An Expert system, community contact lists, publication of key resources, and submission forms that allowed any user to propose content for the sites. It discusses the peer relationships that developed through the communities and the leadership and infrastructure that made them possible.
An ontology for factors affecting tuberculosis treatment adherence behavior in sub-Saharan Africa.
Ogundele, Olukunle Ayodeji; Moodley, Deshendran; Pillay, Anban W; Seebregts, Christopher J
2016-01-01
Adherence behavior is a complex phenomenon influenced by diverse personal, cultural, and socioeconomic factors that may vary between communities in different regions. Understanding the factors that influence adherence behavior is essential in predicting which individuals and communities are at risk of nonadherence. This is necessary for supporting resource allocation and intervention planning in disease control programs. Currently, there is no known concrete and unambiguous computational representation of factors that influence tuberculosis (TB) treatment adherence behavior that is useful for prediction. This study developed a computer-based conceptual model for capturing and structuring knowledge about the factors that influence TB treatment adherence behavior in sub-Saharan Africa (SSA). An extensive review of existing categorization systems in the literature was used to develop a conceptual model that captured scientific knowledge about TB adherence behavior in SSA. The model was formalized as an ontology using the web ontology language. The ontology was then evaluated for its comprehensiveness and applicability in building predictive models. The outcome of the study is a novel ontology-based approach for curating and structuring scientific knowledge of adherence behavior in patients with TB in SSA. The ontology takes an evidence-based approach by explicitly linking factors to published clinical studies. Factors are structured around five dimensions: factor type, type of effect, regional variation, cross-dependencies between factors, and treatment phase. The ontology is flexible and extendable and provides new insights into the nature of and interrelationship between factors that influence TB adherence.
An ontology for factors affecting tuberculosis treatment adherence behavior in sub-Saharan Africa
Ogundele, Olukunle Ayodeji; Moodley, Deshendran; Pillay, Anban W; Seebregts, Christopher J
2016-01-01
Purpose Adherence behavior is a complex phenomenon influenced by diverse personal, cultural, and socioeconomic factors that may vary between communities in different regions. Understanding the factors that influence adherence behavior is essential in predicting which individuals and communities are at risk of nonadherence. This is necessary for supporting resource allocation and intervention planning in disease control programs. Currently, there is no known concrete and unambiguous computational representation of factors that influence tuberculosis (TB) treatment adherence behavior that is useful for prediction. This study developed a computer-based conceptual model for capturing and structuring knowledge about the factors that influence TB treatment adherence behavior in sub-Saharan Africa (SSA). Methods An extensive review of existing categorization systems in the literature was used to develop a conceptual model that captured scientific knowledge about TB adherence behavior in SSA. The model was formalized as an ontology using the web ontology language. The ontology was then evaluated for its comprehensiveness and applicability in building predictive models. Conclusion The outcome of the study is a novel ontology-based approach for curating and structuring scientific knowledge of adherence behavior in patients with TB in SSA. The ontology takes an evidence-based approach by explicitly linking factors to published clinical studies. Factors are structured around five dimensions: factor type, type of effect, regional variation, cross-dependencies between factors, and treatment phase. The ontology is flexible and extendable and provides new insights into the nature of and interrelationship between factors that influence TB adherence. PMID:27175067
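The five-dimensional factor structure described above (factor type, type of effect, regional variation, cross-dependencies, and treatment phase, with evidence links to published studies) can be pictured with a small data-model sketch. The field names and the sample factor are illustrative assumptions, not the ontology's actual OWL classes:

```python
# Illustrative record structure for an evidence-linked adherence factor
# (hypothetical fields; the real artifact is an OWL ontology).
from dataclasses import dataclass, field

@dataclass
class AdherenceFactor:
    name: str
    factor_type: str                                 # e.g. personal, cultural, socioeconomic
    effect: str                                      # promotes or hinders adherence
    regions: list                                    # where the effect has been observed
    treatment_phase: str = "continuation"            # intensive or continuation phase
    depends_on: list = field(default_factory=list)   # cross-dependencies on other factors
    evidence: list = field(default_factory=list)     # identifiers of supporting studies

stigma = AdherenceFactor(
    name="perceived stigma",
    factor_type="cultural",
    effect="hinders",
    regions=["sub-Saharan Africa"],
    treatment_phase="intensive",
    evidence=["hypothetical-study-id"],
)
print(stigma.factor_type, stigma.effect)
```

Structuring each factor this way is what makes the knowledge machine-queryable for the predictive models the abstract has in mind.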
Verhoef, Marja J; Lewith, George; Ritenbaugh, Cheryl; Boon, Heather; Fleishman, Susan; Leis, Anne
2005-09-01
Complementary and alternative medicine (CAM) often consists of whole systems of care (such as naturopathic medicine or traditional Chinese medicine (TCM)) that combine a wide range of modalities to provide individualised treatment. The complexity of these interventions and their potential synergistic effect requires innovative evaluative approaches. Model validity, which encompasses the need for research to adequately address the unique healing theory and therapeutic context of the intervention, is central to whole systems research (WSR). Classical randomised controlled trials (RCTs) are limited in their ability to address this need. Therefore, we propose a mixed methods approach that includes a range of relevant and holistic outcome measures. As the individual components of most whole systems are inseparable, complementary and synergistic, WSR must not focus only on the "active" ingredients of a system. An emerging WSR framework must be non-hierarchical, cyclical, flexible and adaptive, as knowledge creation is continuous, evolutionary and necessitates a continuous interplay between research methods and "phases" of knowledge. Finally, WSR must hold qualitative and quantitative research methods in equal esteem to realize their unique research contribution. Whole systems are complex and therefore no one method can adequately capture the meaning, process and outcomes of these interventions.
Weaver, Charlotte A; Warren, Judith J; Delaney, Connie
2005-12-01
The rise of evidence-based practice (EBP) as a standard for care delivery is rapidly emerging as a global phenomenon that is transcending political, economic and geographic boundaries. Evidence-based nursing (EBN) addresses the growing body of nursing knowledge supported by different levels of evidence for best practices in nursing care. Across all health care, including nursing, we face the challenge of how to most effectively close the gap between what is known and what is practiced. There is extensive literature on the barriers and difficulties of translating research findings into practical application. While the literature refers to this challenge as the "Bench to Bedside" lag, this paper presents three collaborative strategies that aim to minimize this gap. The Bedside strategy proposes to use the data generated from care delivery and captured in the massive data repositories of electronic health record (EHR) systems as empirical evidence that can be analysed to discover and then inform best practice. In the Classroom strategy, we present a description of how evidence-based nursing knowledge is taught in a baccalaureate nursing program. And finally, the Bench strategy describes applied informatics in converting paper-based EBN protocols into the workflow of clinical information systems. Protocols are translated into reference and executable knowledge with the goal of placing the latest scientific knowledge at the fingertips of front-line clinicians. In all three strategies, information technology (IT) is presented as the underlying tool that makes this rapid translation of nursing knowledge into practice and education feasible.
Mercury in coal and the impact of coal quality on mercury emissions from combustion systems
Kolker, A.; Senior, C.L.; Quick, J.C.
2006-01-01
The proportion of Hg in coal feedstock that is emitted by stack gases of utility power stations is a complex function of coal chemistry and properties, combustion conditions, and the positioning and type of air pollution control devices employed. Mercury in bituminous coal is found primarily within Fe-sulfides, whereas lower rank coal tends to have a greater proportion of organic-bound Hg. Preparation of bituminous coal to reduce S generally reduces input Hg relative to in-ground concentrations, but the amount of this reduction varies according to the fraction of Hg in sulfides and the efficiency of sulfide removal. The mode of occurrence of Hg in coal does not directly affect the speciation of Hg in the combustion flue gas. However, other constituents in the coal, notably Cl and S, and the combustion characteristics of the coal, influence the species of Hg that are formed in the flue gas and enter air pollution control devices. The formation of gaseous oxidized Hg or particulate-bound Hg occurs post-combustion; these forms of Hg can be in part captured in the air pollution control devices that exist on coal-fired boilers, without modification. For a given coal type, the capture efficiency of Hg by pollution control systems varies according to type of device and the conditions of its deployment. For bituminous coal, on average, more than 60% of Hg in flue gas is captured by fabric filter (FF) and flue-gas desulfurization (FGD) systems. Key variables affecting performance for Hg control include Cl and S content of the coal, the positioning (hot side vs. cold side) of the system, and the amount of unburned C in coal ash. Knowledge of coal quality parameters and their effect on the performance of air pollution control devices allows optimization of Hg capture co-benefit. © 2006 Elsevier Ltd. All rights reserved.
Evans, Rhiannon; Murphy, Simon; Scourfield, Jonathan
2015-07-01
Sporadic and inconsistent implementation remains a significant challenge for social and emotional learning (SEL) interventions. This may be partly explained by the dearth of flexible, causative models that capture the multifarious determinants of implementation practices within complex systems. This paper draws upon Rogers (2003) Diffusion of Innovations Theory to explain the adoption, implementation and discontinuance of a SEL intervention. A pragmatic, formative process evaluation was conducted in alignment with phase 1 of the UK Medical Research Council's framework for Developing and Evaluating Complex Interventions. Employing case-study methodology, qualitative data were generated with four socio-economically and academically contrasting secondary schools in Wales implementing the Student Assistance Programme. Semi-structured interviews were conducted with 15 programme stakeholders. Data suggested that variation in implementation activity could be largely attributed to four key intervention reinvention points, which contributed to the transformation of the programme as it interacted with contextual features and individual needs. These reinvention points comprise the following: intervention training, which captures the process through which adopters acquire knowledge about a programme and delivery expertise; intervention assessment, which reflects adopters' evaluation of an intervention in relation to contextual needs; intervention clarification, which comprises the cascading of knowledge through an organisation in order to secure support in delivery; and intervention responsibility, which refers to the process of assigning accountability for sustainable delivery. Taken together, these points identify opportunities to predict and intervene with potential implementation problems. Further research would benefit from exploring additional reinvention activity.
Cutting Silica Aerogel for Particle Extraction
NASA Technical Reports Server (NTRS)
Tsou, P.; Brownlee, D. E.; Glesias, R.; Grigoropoulos, C. P.; Weschler, M.
2005-01-01
The detailed laboratory analyses of extraterrestrial particles have revolutionized our knowledge of planetary bodies in the last three decades. This knowledge of the chemical composition, morphology, mineralogy, and isotopics of particles cannot be provided by remote sensing. In order to acquire this detailed information in the laboratory, the samples must be intact and unmelted. Such intact capture of hypervelocity particles was developed in 1996. Subsequently, silica aerogel was introduced as the preferred medium for intact capture of hypervelocity particles and was later shown to be particularly suitable for the space environment. STARDUST, the fourth NASA Discovery mission, which will capture samples from 81P/Wild 2 and contemporary interstellar dust, is the culmination of these new technologies. In early laboratory experiments of launching hypervelocity projectiles into aerogel, there was the need to cut aerogel to isolate or extract captured particles/tracks. This is especially challenging for space captures, since there will be many particles/tracks of widely ranging scales closely located, even collocated. It is critical to isolate and extract one particle without compromising its neighbors, since the full significance of a particle is not known until it is extracted and analyzed. To date, three basic techniques have been explored: mechanical cutting, laser cutting, and ion beam milling. We report the current findings.
DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1989-01-01
This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.
The need for knowledge translation in chronic pain
Henry, James L
2008-01-01
One in five Canadians suffers from some form of persistent or chronic pain. The impact on individual lives, families and friends, the health services sector and the economy is huge. Reliable evidence is available that the burden of persistent pain can be markedly reduced when available knowledge is applied. Bridging the quality chasm between chronic pain and the care process will require a unique confluence of opinion from all stakeholders committed within a focused community of practice to address the impact of pain. Various levels of success in this regard have been demonstrated when there is exchange, synthesis and ethically sound application of research findings within a complex set of interactions among researchers and knowledge users. It is now critical to accelerate the capture of the benefits of research for Canadians through improved health, more effective and responsive services and products, and a strengthened health care system to bring about health reform and health care reform across Canada as it pertains to the one in five Canadians living with chronic, disabling pain. The overarching outcome of such an initiative needs to be promoted to sustain a balanced portfolio of curiosity-and needs-based research, which along with existing knowledge, can be mobilized and applied for the benefit of Canadians, the health care system and the economy. PMID:19225603
WE-F-BRB-00: New Developments in Knowledge-Based Treatment Planning and Automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models for individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to the radiobiology models that have driven intensity-modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how clinical information can be captured, structured and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of the ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize that knowledge in planning; and examples of research and clinical efforts to incorporate clinical knowledge into planning for improved patient care. Learning Objectives: (1) Understand the role of standard terminologies, ontologies and data organization in oncology; (2) understand methods to capture clinical toxicity and outcomes in a clinical setting; (3) understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta and Toshiba for some of the work presented.
ERIC Educational Resources Information Center
Nilsson, Pernilla; Vikström, Anna
2015-01-01
One way for teachers to develop their professional knowledge, which also focuses on specific science content and the ways students learn, is through being involved in researching their own practice. The aim of this study was to examine how science teachers changed (or not) their professional knowledge of teaching after inquiring into their own…
Job Knowledge Test Design: A Cognitively-Oriented Approach. Institute Report No. 241.
ERIC Educational Resources Information Center
DuBois, David; And Others
Selected cognitive science methods were used to modify existing test development procedures so that the modified procedures could in turn be used to improve the usefulness of job knowledge tests as a proxy for hands-on performance. A plan-goal graph representation was used to capture the knowledge content and goal structure of the task of using a…
ERIC Educational Resources Information Center
Caruso, Shirley J.
2017-01-01
This paper serves as an exploration into some of the ways in which organizations can promote, capture, share, and manage the valuable knowledge of their employees. The problem is that employees typically do not share valuable information, skills, or expertise with other employees or with the entire organization. The author uses research as well as…
GalenOWL: Ontology-based drug recommendations discovery
2012-01-01
Background Identification of drug-drug and drug-disease interactions can pose a difficult problem to cope with, as the increasingly large number of available drugs, coupled with the ongoing research activities in the pharmaceutical domain, makes the task of discovering relevant information difficult. Although international standards, such as the ICD-10 classification and the UNII registration, have been developed in order to enable efficient knowledge sharing, medical staff needs to be constantly updated in order to effectively discover drug interactions before prescription. The use of Semantic Web technologies has been proposed in earlier works in order to tackle this problem. Results This work presents a semantic-enabled online service, named GalenOWL, capable of offering real-time drug-drug and drug-disease interaction discovery. For enabling this kind of service, medical information and terminology had to be translated to ontological terms and be appropriately coupled with medical knowledge of the field. International standards such as the aforementioned ICD-10 and UNII provide the backbone of the common representation of medical data, while the medical knowledge of drug interactions is represented by a rule base which makes use of the aforementioned standards. Details of the system architecture are presented, along with an outline of the difficulties that had to be overcome. A comparison of the developed ontology-based system with a similar system developed using a traditional business logic rule engine is performed, giving insights on the advantages and drawbacks of both implementations. Conclusions The use of Semantic Web technologies has been found to be a good match for developing drug recommendation systems. Ontologies can effectively encapsulate medical knowledge, and rule-based reasoning can capture and encode the drug interactions knowledge. PMID:23256945
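The rule-based core of such a service can be illustrated with a minimal sketch. This is not the GalenOWL implementation; the drug codes, ICD-10 codes, and rule contents below are invented for illustration. The idea is to key interaction rules on standard identifiers and match a prescription, plus a patient's diagnosis codes, against them.

```python
# Hypothetical rule base keyed on standard identifiers (codes are illustrative).

# Drug-drug interaction rules: unordered pairs of (invented) UNII-style codes.
DRUG_DRUG_RULES = {
    frozenset({"UNII-WARF", "UNII-ASA"}): "increased bleeding risk",
}

# Drug-disease rules: drug code -> {contraindicated ICD-10 category: reason}.
DRUG_DISEASE_RULES = {
    "UNII-IBU": {"N18": "NSAID contraindicated in chronic kidney disease"},
}

def find_interactions(prescribed_drugs, patient_icd10_codes):
    """Return (kind, detail) findings for a prescription and patient record."""
    findings = []
    drugs = list(prescribed_drugs)
    # Check every unordered pair of prescribed drugs against the rule base.
    for i in range(len(drugs)):
        for j in range(i + 1, len(drugs)):
            pair = frozenset({drugs[i], drugs[j]})
            if pair in DRUG_DRUG_RULES:
                findings.append(("drug-drug", DRUG_DRUG_RULES[pair]))
    # Check each drug against the patient's diagnoses; ICD-10 codes are
    # hierarchical, so a category like "N18" matches "N18.3" by prefix.
    for drug in drugs:
        for icd, reason in DRUG_DISEASE_RULES.get(drug, {}).items():
            if any(code.startswith(icd) for code in patient_icd10_codes):
                findings.append(("drug-disease", reason))
    return findings
```

A real system would draw these rules from an ontology and a reasoner rather than literal dictionaries, but the matching pattern is the same.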
Knowledge Management and Reference Services
ERIC Educational Resources Information Center
Gandhi, Smiti
2004-01-01
Many corporations are embracing knowledge management (KM) to capture the intellectual capital of their employees. This article focuses on KM applications for reference work in libraries. It defines key concepts of KM, establishes a need for KM for reference services, and reviews various KM initiatives for reference services.
ERIC Educational Resources Information Center
Youens, Bernadette; Smethem, Lindsey; Sullivan, Stefanie
2014-01-01
This paper explores the potential of video capture to generate a collaborative space for teacher preparation; a space in which traditional hierarchies and boundaries between actors (student teacher, school mentor and university tutor) and knowledge (academic, professional and practical) are disrupted. The study, based in a teacher education…
Flight elements: Fault detection and fault management
NASA Technical Reports Server (NTRS)
Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.
1990-01-01
Fault management for an intelligent computational system must be developed using a top-down, integrated engineering approach. The proposed approach integrates the overall environment, involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models, including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept to achieve a real-time intelligent fault detection and management system will be accomplished via several objectives: development of fault-tolerant/FDIR requirements and specifications at the systems level, carried from conceptual design through implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and lower development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.
NASA's SDO Satellite Captures 2012 Venus Transit
2017-12-08
NASA image captured June 5, 2012. On June 5-6 2012, SDO is collecting images of one of the rarest predictable solar events: the transit of Venus across the face of the sun. This event happens in pairs eight years apart that are separated from each other by 105 or 121 years. The last transit was in 2004 and the next will not happen until 2117. Credit: NASA/SDO, HMI To read more about the 2012 Venus Transit go to: sunearthday.nasa.gov/transitofvenus Add your photos of the Transit of Venus to our Flickr Group here: www.flickr.com/groups/venustransit/ NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission. Follow us on Twitter Like us on Facebook Find us on Instagram
NASA's SDO Captures Mercury Transit Time-lapses SDO Captures Mercury Transit Time-lapse
2017-12-08
Less than once per decade, Mercury passes between the Earth and the sun in a rare astronomical event known as a planetary transit. The 2016 Mercury transit occurred on May 9th, between roughly 7:12 a.m. and 2:42 p.m. EDT. The images in this video are from NASA’s Solar Dynamics Observatory. Music: Encompass by Mark Petrie For more info on the Mercury transit go to: www.nasa.gov/transit This video is public domain and may be downloaded at: svs.gsfc.nasa.gov/12235
2017-12-08
Twice a year, NASA’s Solar Dynamics Observatory, or SDO, has an eclipse season — a weeks-long period in which Earth blocks SDO’s view of the sun for part of each day. This footage captured by SDO on Feb. 15, 2017, shows one such eclipse. Earth’s edge appears fuzzy, rather than crisp, because the sun’s light is able to shine through Earth’s atmosphere in some places. These images were captured in wavelengths of extreme ultraviolet light, which is typically invisible to our eyes, but is colorized here in gold. Credit: NASA/Goddard/SDO
NASA-NOAA's Suomi NPP Satellite Captures Night-time Look at Cyclone Felleng
2013-01-31
NASA-NOAA's Suomi NPP satellite captured this false-colored night-time image of Cyclone Felleng during the night on Jan. 28, 2013. Felleng is located in the Southern Indian Ocean, and is northwest of Madagascar. The image revealed some pretty cold overshooting tops, topping at ~170K. The image shows some interesting gravity waves propagating out from the storm in both the thermal and visible imagery. For full storm history on NASA's Hurricane Web Page, visit: www.nasa.gov/mission_pages/hurricanes/archives/2013/h2013... Credit: William Straka, UWM/NASA/NOAA
Hubble Team Unveils Most Colorful View of Universe Captured by Space Telescope
2014-06-04
Astronomers using NASA's Hubble Space Telescope have assembled a comprehensive picture of the evolving universe – among the most colorful deep space images ever captured by the 24-year-old telescope. Researchers say the image, in a new study called the Ultraviolet Coverage of the Hubble Ultra Deep Field, provides the missing link in star formation. The Hubble Ultra Deep Field 2014 image is a composite of separate exposures taken from 2003 to 2012 with Hubble's Advanced Camera for Surveys and Wide Field Camera 3. Credit: NASA/ESA Read more: 1.usa.gov/1neD0se
Furtado, Nercy Virginia Rabelo; Galardo, Allan Kardec Ribeiro; Galardo, Clicia Denis; Firmino, Viviane Caetano
2016-01-01
During 2012–2015, an entomological survey was conducted as part of a phlebotomine (Diptera: Psychodidae) monitoring program in an area influenced by the Santo Antônio do Jari hydroelectric system (Amapá State, Brazil). The purpose was to study aspects of Amazon/Guianan American cutaneous leishmaniasis (ACL) vectors subjected to stresses by anthropogenic environmental changes. For sampling, CDC light traps were positioned 0.5, 1, and 20 m above ground at five capture locations along the Jari River Basin. Fluctuations in phlebotomine numbers were analyzed to determine any correlation with rainfall, dam waterlogging, and/or ACL cases, from May 2012 to March 2015. We captured 2,800 individuals, and among 45 species identified, Bichromomyia flaviscutellata, Nyssomyia umbratilis, and Psychodopygus squamiventris s.l. were determined to be the main putative vectors, based on current knowledge of the Amazon/Guianan ACL scenario. Rainfall, but not complete flooding, was correlated with phlebotomine fluctuation, mainly observed for Ps. squamiventris s.l., as were ACL cases with Ny. umbratilis. Behavioral changes were observed in the unexpectedly high frequency of Bi. flaviscutellata among CDC captures and the noncanopy dominance of Ny. umbratilis, possibly attributable to environmental stress in the sampled ecotopes. Continuous entomological surveillance is necessary to monitor the outcomes of these findings. PMID:28042300
Tropical Rainfall Measuring Mission (TRMM). Phase B: Data capture facility definition study
NASA Technical Reports Server (NTRS)
1990-01-01
The National Aeronautics and Space Administration (NASA) and the National Space Development Agency of Japan (NASDA) initiated the Tropical Rainfall Measuring Mission (TRMM) to obtain more accurate measurements of tropical rainfall than ever before. The measurements are to improve scientific understanding and knowledge of the mechanisms affecting the intra-annual and interannual variability of the Earth's climate. The TRMM is largely dependent upon the handling and processing of the data by the TRMM Ground System supporting the mission. The objective of the TRMM is to obtain three years of climatological determinations of rainfall in the tropics, culminating in data sets of 30-day average rainfall over 5-degree square areas, and associated estimates of vertical distribution of latent heat release. The scope of this study is limited to the functions performed by the TRMM Data Capture Facility (TDCF). These functions include capturing the TRMM spacecraft return link data stream; processing the data in the real-time, quick-look, and routine production modes, as appropriate; and distributing real-time, quick-look, and production data products to users. The following topics are addressed: (1) TRMM end-to-end system description; (2) TRMM mission operations concept; (3) baseline requirements; (4) assumptions related to mission requirements; (5) external interface; (6) TDCF architecture and design options; (7) critical issues and tradeoffs; and (8) recommendations for the final TDCF selection process.
Properties of Earth's temporarily-captured flybys
NASA Astrophysics Data System (ADS)
Fedorets, Grigori; Granvik, Mikael
2014-11-01
In addition to the Moon, a population of small temporarily-captured NEOs is predicted to orbit the Earth. The definition of a natural Earth satellite is that it is on an elliptic geocentric orbit within 0.03 au of the Earth. The population is further divided into temporarily-captured orbiters (TCOs, or minimoons, making at least one full revolution around the Earth in a coordinate system co-rotating with the Sun) and temporarily-captured flybys (TCFs), which fail to make a full revolution but are temporarily on an elliptic orbit around the Earth. Only one minimoon has been discovered to date, but it is expected that next-generation surveys will be able to detect these objects regularly. Granvik et al. (2012) performed an extensive analysis of the behaviour of these temporarily-captured objects. One of the main results was that at any given moment there is at least one 1-meter-diameter minimoon in orbit around the Earth. However, the results of Granvik et al. (2012) raised questions concerning the NES population, such as the bimodality of the capture duration distribution and a distinctive lack of test particles within Earth's Hill sphere, which requires also investigating the statistical properties of the TCF population. In this work we confirm the population characteristics for minimoons described by Granvik et al. (2012) and extend the analysis to TCFs. For the calculations we use a Bulirsch-Stoer integrator implemented in the OpenOrb software package (Granvik et al. 2009). We study, e.g., the capture statistics, residence-time distributions, and steady-state properties of TCFs. Our preliminary results indicate that TCFs may be suitable targets for asteroid-redirect missions. More detailed knowledge of the TCF population will also improve our understanding of the link between temporarily-captured objects and NEOs in general. References: Granvik et al. (2009) MPS 44(12), 1853-1861; Granvik et al. (2012) Icarus 218, 262-277.
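The capture criterion in the definition above (elliptic geocentric orbit within 0.03 au) can be screened for in a few lines. This sketches only the definition, not the OpenOrb/Bulirsch-Stoer integration used in the study; distinguishing TCOs from TCFs still requires counting revolutions in the Sun-Earth co-rotating frame.

```python
import math

MU_EARTH = 3.986004418e14   # m^3/s^2, Earth's gravitational parameter
AU = 1.495978707e11         # m, astronomical unit
CAPTURE_RADIUS = 0.03 * AU  # geocentric distance limit from the definition

def is_temporarily_captured(r_m, v_ms):
    """True if the geocentric state (distance r_m in meters, speed v_ms in
    m/s) has negative specific orbital energy (i.e. an elliptic geocentric
    orbit) and lies within 0.03 au of the Earth."""
    energy = 0.5 * v_ms ** 2 - MU_EARTH / r_m  # vis-viva specific energy
    return energy < 0.0 and r_m < CAPTURE_RADIUS
```

A two-body screen like this only flags candidate captures; solar perturbations, which this ignores, are exactly what makes the captures temporary.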
NASA Astrophysics Data System (ADS)
Wang, Lanjing; Shao, Wenjing; Wang, Zhiyue; Fu, Wenfeng; Zhao, Wensheng
2018-02-01
Taking as an example an MEA chemical-absorption carbon capture system with an 85% capture rate on a 660 MW ultra-supercritical unit, this paper puts forward a new type of turbine dedicated to supplying steam to the carbon capture system. Comparing the power plant's thermal systems under different steam supply schemes in EBSILON identifies the optimal extraction scheme for the steam extraction system in the carbon capture system. The results show that the cycle thermal efficiency of the unit with the carbon capture turbine is higher than that of the usual scheme without it. With the carbon capture turbine introduced, the scheme that extracts steam from the high-pressure cylinder's steam input point shows the highest cycle thermal efficiency; its indexes are superior to those of the other schemes, making it more suitable for existing coal-fired power plants with integrated post-combustion carbon dioxide capture.
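The efficiency benefit of a dedicated capture turbine can be illustrated with a toy energy balance. All enthalpies, flows, and powers below are invented placeholders, not data from the study; the point is only that expanding the extraction steam down to reboiler conditions recovers work that a simple throttled extraction forfeits.

```python
# All figures are illustrative placeholders, not plant data.
Q_IN = 1500.0e3        # kW, boiler heat input
W_GROSS = 660.0e3      # kW, gross output with no capture extraction
M_EXT = 100.0          # kg/s, steam diverted to the MEA reboiler
H_EXT = 3100.0         # kJ/kg, enthalpy at the extraction point
H_REBOILER = 2700.0    # kJ/kg, enthalpy needed at the capture reboiler
H_EXHAUST = 2300.0     # kJ/kg, main-turbine exhaust enthalpy

# Work the extracted steam would have produced had it stayed in the turbine:
w_forfeited = M_EXT * (H_EXT - H_EXHAUST)          # kW

# Scheme A: throttle the extraction steam straight down to the reboiler.
eta_throttle = (W_GROSS - w_forfeited) / Q_IN

# Scheme B: expand it first through a dedicated capture turbine, recovering
# the enthalpy drop between the extraction point and reboiler conditions.
w_recovered = M_EXT * (H_EXT - H_REBOILER)         # kW
eta_capture_turbine = (W_GROSS - w_forfeited + w_recovered) / Q_IN
```

With any positive enthalpy drop to the reboiler, scheme B's cycle efficiency exceeds scheme A's, which is the qualitative result the paper reports.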
Integration of E-Learning and Knowledge Management.
ERIC Educational Resources Information Center
Woelk, Darrell; Agarwal, Shailesh
E-Learning technology today is used primarily to handcraft training courses about carefully selected topics for delivery to employees registered for those courses. This paper investigates the integration of e-learning and knowledge management technology to improve the capture, organization and delivery of both traditional training courses and…
Converting laserdisc video to digital video: a demonstration project using brain animations.
Jao, C S; Hier, D B; Brint, S U
1995-01-01
Interactive laserdiscs are of limited value in large group learning situations due to the expense of establishing multiple workstations. The authors implemented an alternative to laserdisc video by using indexed digital video combined with an expert system. High-quality video was captured from a laserdisc player and combined with waveform audio into an audio-video-interleave (AVI) file format in the Microsoft Video-for-Windows environment (Microsoft Corp., Seattle, WA). With the use of an expert system, a knowledge-based computer program provided random access to these indexed AVI files. The program can be played on any multimedia computer without the need for laserdiscs. This system offers a high level of interactive video without the overhead and cost of a laserdisc player.
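The random-access scheme the authors describe reduces to an index from topic keywords to (file, frame) entries, which the knowledge-based front end resolves on behalf of the learner. The index entries and file names below are hypothetical.

```python
# Hypothetical index: topic -> (AVI file, start frame). In the system
# described, an expert system resolved a learner's query to such an entry
# and played the clip without any laserdisc hardware.
VIDEO_INDEX = {
    "basal ganglia": ("brain_anatomy.avi", 1200),
    "circle of willis": ("brain_vasculature.avi", 300),
}

def lookup(topic):
    """Random access into indexed AVI clips by topic keyword."""
    key = topic.lower().strip()
    if key not in VIDEO_INDEX:
        raise KeyError(f"no clip indexed for {topic!r}")
    return VIDEO_INDEX[key]
```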
From Big Data to Knowledge in the Social Sciences.
Hesse, Bradford W; Moser, Richard P; Riley, William T
2015-05-01
One of the challenges associated with high-volume, diverse datasets is whether synthesis of open data streams can translate into actionable knowledge. Recognizing that challenge and other issues related to these types of data, the National Institutes of Health developed the Big Data to Knowledge or BD2K initiative. The concept of translating "big data to knowledge" is important to the social and behavioral sciences in several respects. First, a general shift to data-intensive science will exert an influence on all scientific disciplines, but particularly on the behavioral and social sciences given the wealth of behavior and related constructs captured by big data sources. Second, science is itself a social enterprise; by applying principles from the social sciences to the conduct of research, it should be possible to ameliorate some of the systemic problems that plague the scientific enterprise in the age of big data. We explore the feasibility of recalibrating the basic mechanisms of the scientific enterprise so that they are more transparent and cumulative; more integrative and cohesive; and more rapid, relevant, and responsive.
NASA Astrophysics Data System (ADS)
Moresi, L.; May, D.; Peachey, T.; Enticott, C.; Abramson, D.; Robinson, T.
2004-12-01
Can you teach intuition? Obviously we think that this is possible (though it's still just a hunch). People undoubtedly develop intuition for non-linear systems through painstaking repetition of complex tasks until they have sufficient feedback to begin to "see" the emergent behaviour. The better the exploration of the system can be exposed, the quicker the potential for developing an intuitive understanding. We have spent some time considering how to incorporate the intuitive knowledge of field geologists into mechanical modeling of geological processes. Our solution has been to allow an expert geologist to steer (via a GUI) a genetic algorithm inversion of a mechanical forward model towards "structures" or patterns which are plausible in nature. The expert knowledge is then captured by analysis of the individual model parameters which are constrained by the steering (and by analysis of those which are unconstrained). The same system can also be used in reverse to expose the influence of individual parameters to the non-expert who is trying to learn just what does make a good match between model and observation. The "distance" between models preferred by experts and those by an individual can be shown graphically to provide feedback. The examples we choose are from numerical models of extensional basins. We will first try to give each person some background information on the scientific problem from the poster and then we will let them loose on the numerical modeling tools with specific tasks to achieve. This will be an experiment in progress - we will later analyse how people use the GUI and whether there is really any significant difference between so-called experts and self-styled novices.
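The steering idea can be sketched as a genetic algorithm whose fitness combines data misfit with an expert-supplied plausibility penalty (here a stub that penalizes parameters outside a plausible range, standing in for the interactive GUI steering). Everything below (target parameters, penalty weight, GA settings) is an invented illustration, not the authors' inversion code.

```python
import random

random.seed(42)

TARGET = [0.3, 0.7]  # hypothetical "true" forward-model parameters

def misfit(params):
    """Data misfit between forward model and observations (toy quadratic)."""
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))

def expert_penalty(params):
    """Steering term: the expert marks structures outside [0, 1] implausible."""
    return sum(max(0.0, p - 1.0) + max(0.0, -p) for p in params)

def fitness(params):
    # Misfit and expert judgment are blended into one objective.
    return misfit(params) + 10.0 * expert_penalty(params)

def evolve(generations=50, pop_size=30, sigma=0.1):
    pop = [[random.uniform(-1, 2) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # truncation selection
        children = [
            [p + random.gauss(0.0, sigma) for p in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children                  # elitism keeps the best
    return min(pop, key=fitness)

best = evolve()
```

Analyzing which parameters the steered runs constrain tightly (and which they leave loose) is then what "captures" the expert's knowledge.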
Burbidge, E M
1983-07-29
The exploration of the universe has captured mankind's interest since the earliest attempts to understand the sun, moon, planets, comets, and stars. The last few decades have seen explosive advances of knowledge, sparked by technological advances and by our entry into the space age. Achievements in solar system exploration, discoveries both in the Milky Way and in the farther universe, and challenges for the future are discussed. Of major concern worldwide is the need for people of goodwill in all nations to concentrate on the peaceful uses of outer space and on international collaboration.
NASA Sees First Land-falling Tropical Cyclone in Yemen
2017-12-08
On Nov. 3, 2015 at 07:20 UTC (2:20 a.m. EDT) the MODIS instrument aboard NASA's Aqua satellite captured this image of Tropical Cyclone Chapala over Yemen. Credit: NASA Goddard MODIS Rapid Response Team
Tropical Storm Haiyan Makes Landfall in Northern Vietnam
2013-11-12
On Nov. 11 at 05:45 UTC, the MODIS instrument aboard NASA's Aqua satellite captured this image of Tropical Storm Haiyan over mainland China. Credit: NASA Goddard MODIS Rapid Response Team
2017-12-08
On Oct. 19 at 1500 UTC (11 a.m. EDT), the MODIS instrument aboard NASA's Terra satellite captured this visible image of Hurricane Gonzalo east of Newfoundland, Canada. Credit: NASA Goddard MODIS Rapid Response Team
Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane
2003-01-01
This poster describes the development of user-centered interfaces in order to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from library to web based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.
Feng, Sheng; Lotz, Thomas; Chase, J Geoffrey; Hann, Christopher E
2010-01-01
Digital Image Elasto Tomography (DIET) is a non-invasive elastographic breast cancer screening technology, based on image-based measurement of surface vibrations induced on a breast by mechanical actuation. Knowledge of frequency response characteristics of a breast prior to imaging is critical to maximize the imaging signal and diagnostic capability of the system. A feasibility analysis for a non-invasive image based modal analysis system is presented that is able to robustly and rapidly identify resonant frequencies in soft tissue. Three images per oscillation cycle are enough to capture the behavior at a given frequency. Thus, a sweep over critical frequency ranges can be performed prior to imaging to determine critical imaging settings of the DIET system to optimize its tumor detection performance.
40 CFR 63.9322 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2011 CFR
2011-07-01
... capture system efficiency? 63.9322 Section 63.9322 Protection of Environment ENVIRONMENTAL PROTECTION... capture system efficiency? You must use the procedures and test methods in this section to determine capture efficiency as part of the performance test required by § 63.9310. (a) Assuming 100 percent capture...
40 CFR 63.9322 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2010 CFR
2010-07-01
... capture system efficiency? 63.9322 Section 63.9322 Protection of Environment ENVIRONMENTAL PROTECTION... capture system efficiency? You must use the procedures and test methods in this section to determine capture efficiency as part of the performance test required by § 63.9310. (a) Assuming 100 percent capture...
Application of Knowledge Management: Pressing questions and practical answers
DOE Office of Scientific and Technical Information (OSTI.GOV)
FROMM-LEWIS,MICHELLE
2000-02-11
Sandia National Laboratories is working on ways to increase production using Knowledge Management. Knowledge Management is: finding ways to create, identify, capture, and distribute organizational knowledge to the people who need it; helping information and knowledge flow to the right people at the right time so they can act more efficiently and effectively; recognizing, documenting and distributing explicit knowledge (explicit knowledge is quantifiable and definable; it makes up reports, manuals, instructional materials, etc.) and tacit knowledge (tacit knowledge is doing and performing; it is a combination of experience, hunches, intuition, emotions, and beliefs) in order to improve organizational performance; and a systematic approach to finding, understanding and using knowledge to create value.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrada, J.J.; Osborne-Lee, I.W.; Grizzaffi, P.A.
Expert systems are known to be useful in capturing expertise and applying knowledge to chemical engineering problems such as diagnosis, process control, process simulation, and process advisory. However, expert system applications are traditionally limited to knowledge domains that are heuristic and involve only simple mathematics. Neural networks, on the other hand, represent an emerging technology capable of rapid recognition of patterned behavior without regard to mathematical complexity. Although useful in problem identification, neural networks are not very efficient in providing in-depth solutions and typically do not promote full understanding of the problem or the reasoning behind its solutions. Hence, applications of neural networks have certain limitations. This paper explores the potential for expanding the scope of chemical engineering areas where neural networks might be utilized by incorporating expert systems and neural networks into the same application, a process called hybridization. In addition, hybrid applications are compared with those using more traditional approaches, the results of the different applications are analyzed, and the feasibility of converting the preliminary prototypes described herein into useful final products is evaluated. 12 refs., 8 figs.
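The hybridization pattern (a pattern-recognition stage feeding a rule-based advisory stage) can be sketched minimally. Here a nearest-prototype matcher stands in for the neural network, and all feature vectors, labels, and rules are invented for illustration, not taken from the paper's prototypes.

```python
import math

# Stage 1 stand-in for the neural network: nearest-prototype matching over
# (made-up) normalized sensor feature vectors for a process unit.
PROTOTYPES = {
    "fouling": [0.9, 0.2, 0.1],
    "leak":    [0.1, 0.8, 0.7],
    "normal":  [0.1, 0.1, 0.1],
}

# Stage 2: heuristic expert-system rules that supply the in-depth
# explanation and advice the recognition stage alone cannot.
RULES = {
    "fouling": "High dP across the exchanger: schedule cleaning, check flow.",
    "leak":    "Pressure loss with rising downstream level: isolate and inspect.",
    "normal":  "No action required.",
}

def diagnose(features):
    """Recognize the fault pattern, then explain it with a rule."""
    label = min(PROTOTYPES, key=lambda k: math.dist(features, PROTOTYPES[k]))
    return label, RULES[label]
```

The division of labor mirrors the paper's argument: the pattern stage is fast but opaque, while the rule stage carries the reasoning a user can inspect.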
3D Geological Mapping - uncovering the subsurface to increase environmental understanding
NASA Astrophysics Data System (ADS)
Kessler, H.; Mathers, S.; Peach, D.
2012-12-01
Geological understanding is required by many disciplines studying natural processes, from hydrology to landscape evolution. The subsurface structure of rocks and soils and their properties occupies three-dimensional (3D) space, and geological processes operate in time. Traditionally, geologists have captured their spatial and temporal knowledge in two-dimensional maps, cross-sections, and narrative, because paper maps and, later, two-dimensional geographical information systems (GIS) were the only tools available to them. Another major constraint on using more explicit and numerical systems to express geological knowledge is the fact that a geologist only ever observes and measures a fraction of the system they study. Only on rare occasions does the geologist have access to enough real data to generate meaningful predictions of the subsurface without the input of conceptual understanding developed from knowledge of the geological processes responsible for the deposition, emplacement, and diagenesis of the rocks. This in turn has led to geology becoming an increasingly marginalised science as other disciplines have embraced the digital world and have increasingly turned to implicit numerical modelling to understand environmental processes and interactions. Recent developments in geoscience methodology and technology have gone some way to overcoming these barriers, and geologists across the world are beginning to routinely capture their knowledge and combine it with all available subsurface data (often of highly varying spatial distribution and quality) to create regional and national three-dimensional geological maps. This is re-defining the way geologists interact with other science disciplines, as their concepts and knowledge are now expressed in an explicit form that can be used downstream to design the structure of process models.
For example, groundwater modellers can refine their understanding of groundwater flow in three dimensions or even directly parameterize their numerical models using outputs from 3D mapping. In some cases model code is being re-designed in order to deal with the increasing geological complexity expressed by geologists. These 3D maps have inherent uncertainty, just as their predecessors, 2D geological maps, did, and there remains a significant body of work to quantify and effectively communicate this uncertainty. Here we present examples of regional and national 3D maps from Geological Survey Organisations worldwide and show how these are being used to better solve real-life environmental problems. The future challenge for geologists is to make these 3D maps easily available in an accessible and interoperable form so that the environmental science community can truly integrate the hidden subsurface into a common understanding of the whole geosphere.
NASA Human Health and Performance Center (NHHPC)
NASA Technical Reports Server (NTRS)
Davis, J. R.; Richard, E. E.
2010-01-01
The NASA Human Health and Performance Center (NHHPC) will provide a collaborative and virtual forum to integrate all disciplines of the human system to address spaceflight, aviation, and terrestrial human health and performance topics and issues. The NHHPC will serve a vital role as integrator, convening members to share information and capture a diverse knowledge base, while allowing the parties to collaborate to address the most important human health and performance topics of interest to members. The Center and its member organizations will address high-priority risk reduction strategies, including research and technology development, improved medical and environmental health diagnostics and therapeutics, and state-of-the-art design approaches for human factors and habitability. Once fully established in 2011, the NHHPC will undertake a number of collaborative projects focused on human health and performance, including workshops, education and outreach, information sharing and knowledge management, and research and technology development projects, to advance the study of the human system for spaceflight and other national and international priorities.
40 CFR 63.3554 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2011 CFR
2011-07-01
... system efficiency? 63.3554 Section 63.3554 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Requirements for the Control Efficiency/outlet Concentration Option § 63.3554 How do I determine the emission capture system efficiency? The capture efficiency of your emission capture system must be 100 percent to...
40 CFR 63.3554 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2010 CFR
2010-07-01
... system efficiency? 63.3554 Section 63.3554 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Requirements for the Control Efficiency/outlet Concentration Option § 63.3554 How do I determine the emission capture system efficiency? The capture efficiency of your emission capture system must be 100 percent to...
Automatic programming of arc welding robots
NASA Astrophysics Data System (ADS)
Padmanabhan, Srikanth
Automatic programming of arc welding robots requires the automatic integration of the geometric description of a part from a solid modeling system, expert weld process knowledge, and the kinematic arrangement of the robot and positioner. Current commercial solid modelers are incapable of explicitly storing product and process definitions of weld features. This work presents a paradigm for developing a computer-aided engineering environment that supports complete weld feature information in a solid model and for creating an automatic programming system for robotic arc welding. In the first part, welding features are treated as properties or attributes of an object, features which are portions of the object surface, the topological boundary. The structure for representing the features and attributes is a graph called the Welding Attribute Graph (WAGRAPH). The method associates appropriate weld features with geometric primitives, adds welding attributes, and checks the validity of welding specifications. A systematic structure is provided to incorporate welding attributes and coordinate-system information in a CSG tree. The specific implementation of this structure using a hybrid solid modeler (IDEAS) and an object-oriented programming paradigm is described. The second part provides a comprehensive methodology to acquire and represent the weld process knowledge required for the proper selection of welding schedules. A methodology of knowledge acquisition using statistical methods is proposed. It is shown that these procedures did little to capture the private knowledge of experts (heuristics), but helped in determining general dependencies and trends. A need was established for building the knowledge-based system using handbook knowledge and for allowing experts to extend the system further. A methodology to check the consistency and validity of such knowledge additions is proposed.
A mapping shell designed to transform the design features into application-specific weld process schedules is described. A new approach using fixed-path modified continuation methods is proposed in the final section to continuously plan the trajectory of weld seams in an integrated welding robot and positioner environment. The joint displacement, velocity, and acceleration histories along the path, as a function of the path parameter for the best possible welding condition, are provided for the robot and the positioner to track the various paths normally encountered in arc welding.
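A minimal sketch, in the spirit of the WAGRAPH idea described above, of attaching welding attributes to the topological faces of a part (all class and attribute names here are hypothetical, not taken from the dissertation):

```python
class Face:
    """A topological boundary element of the part (a geometric primitive)."""
    def __init__(self, name):
        self.name = name

class WeldFeature:
    """A weld feature spans one or more faces and carries process attributes."""
    def __init__(self, kind, faces, **attributes):
        self.kind = kind              # e.g. "fillet", "butt"
        self.faces = faces            # faces on the object surface it joins
        self.attributes = attributes  # e.g. joint angle, leg size

class WAGraph:
    """Graph linking geometric primitives to weld features and attributes."""
    def __init__(self):
        self.features = []

    def add_feature(self, feature):
        # A real system would validate the welding specification here.
        self.features.append(feature)

    def features_on(self, face):
        # All weld features whose topological boundary includes this face.
        return [f for f in self.features if face in f.faces]
```

A query such as `features_on(web_face)` then recovers the process attributes attached to a given portion of the object surface, which is the kind of explicit product/process linkage the abstract says commercial solid modelers lack.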
Stochastic Plume Simulations for the Fukushima Accident and the Deep Water Horizon Oil Spill
NASA Astrophysics Data System (ADS)
Coelho, E.; Peggion, G.; Rowley, C.; Hogan, P.
2012-04-01
The Fukushima Dai-ichi power plant suffered damage leading to radioactive contamination of coastal waters. Major issues in characterizing the extent of the affected waters were poor knowledge of the radiation released to the coastal waters and the rather complex coastal dynamics of the region, which were not deterministically captured by the available prediction systems. Similarly, during the Gulf of Mexico Deep Water Horizon oil platform accident in April 2010, significant amounts of oil and gas were released from the ocean floor. For this case, the issues in mapping and predicting the extent of the affected waters in real time were poor knowledge of the actual amounts of oil reaching the surface and the fact that coastal dynamics over the region were not deterministically captured by the available prediction systems. To assess the ocean regions and times that were most likely affected by these accidents while capturing the above sources of uncertainty, ensembles of the Navy Coastal Ocean Model (NCOM) were configured over the two regions (NE Japan and the Northern Gulf of Mexico). For the Fukushima case, tracers were released on each ensemble member; their locations at each instant provided reference positions of water volumes where the signature of water released from the plant could be found. For the Deep Water Horizon oil spill case, each ensemble member was coupled with a diffusion-advection solution to estimate possible scenarios of oil concentrations, using perturbed estimates of the released amounts as the source terms at the surface. Stochastic plumes were then defined using a Risk Assessment Code (RAC) analysis that associates a number from 1 to 5 with each grid point, determined by the likelihood of having tracer particles within short ranges (for the Fukushima case), hence defining the high-risk areas and those recommended for monitoring.
For the Oil Spill case the RAC codes were determined by the likelihood of reaching oil concentrations as defined in the Bonn Agreement Oil Appearance Code. The likelihoods were taken in both cases from probability distribution functions derived from the ensemble runs. Results were compared with a control-deterministic solution and checked against available reports to assess their skill in capturing the actual observed plumes and other in-situ data, as well as their relevance for planning surveys and reconnaissance flights for both cases.
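The RAC assignment described above can be sketched as an ensemble exceedance-probability calculation followed by binning; the probability cut-offs below are placeholders, not the values used in the study:

```python
import numpy as np

def rac_codes(ensemble_conc, threshold, bins=(0.05, 0.25, 0.5, 0.75)):
    """Assign a Risk Assessment Code (1-5) to each grid point.

    ensemble_conc: array of shape (n_members, ny, nx) holding tracer or
    oil concentration from each ensemble member.
    threshold: concentration above which a grid point counts as affected.
    bins: hypothetical probability cut-offs separating codes 1..5.
    """
    # Probability, across the ensemble, that the threshold is exceeded.
    p_exceed = (ensemble_conc > threshold).mean(axis=0)
    # Map each probability to an integer code 1 (low risk) .. 5 (high risk).
    return np.digitize(p_exceed, bins) + 1
```

In the real analyses the likelihoods came from probability distribution functions derived from the NCOM ensemble runs, and for the oil spill the thresholds followed the Bonn Agreement Oil Appearance Code.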
Facing page test for the astronaut science advisor presentation
NASA Technical Reports Server (NTRS)
Compton, Michael M.
1991-01-01
The goal of the Astronaut Science Advisor (ASA) project is to improve the scientific return of experiments performed in space by providing astronaut experimenters with an 'intelligent assistant' that encapsulates much of the domain- and experiment-related knowledge commanded by the Principal Investigator (PI) on the ground. With expert systems technology and the availability of flight-qualified personal computers, it is possible to encode the requisite knowledge and make it available to astronauts as they perform experiments in space. The system performs four major functions: diagnosis and troubleshooting of experiment apparatus, data collection, protocol management, and detection of interesting data. The experiment used for development of the system measures human adaptation to weightlessness in the context of the neurovestibular system. This so-called 'Rotating Dome' experiment was flown on the recent Spacelab Life Sciences One (SLS-1) mission, which was used as an opportunity to test some of the system's functionality. Experiment data were downlinked from the orbiter, and the system captured and analyzed the data in real time. The system kept track of the time used by the experiment, recognized occurrences of interesting data, summarized data statistically, and generated potential new protocols that could be used to optimize the course of the experiment.
ERIC Educational Resources Information Center
Jackson, Simon A.; Martin, Gregory D.; Aidman, Eugene; Kleitman, Sabina
2018-01-01
This article presents the results of a systematic review of the literature surrounding the effects that acute sleep deprivation has on metacognitive monitoring. Metacognitive monitoring refers to the ability to accurately assess one's own performance and state of knowledge. The mechanism behind this assessment is captured by subjective feelings of…
NASA Technical Reports Server (NTRS)
Fernandez, Becerra
2003-01-01
Expert Seeker is a computer program of the knowledge-management-system (KMS) type that falls within the category of expertise-locator systems. The main goal of the KMS implemented by Expert Seeker is to organize and distribute knowledge of who the domain experts are within and outside a given institution, company, or other organization. The intent in developing this KMS was to enable the re-use of organizational knowledge and provide a methodology for querying existing information (including structured, semistructured, and unstructured information) in a way that could help identify organizational experts. More specifically, Expert Seeker was developed to make it possible, by use of an intranet, to do any or all of the following: assist an employee in identifying who has the skills needed for specific projects and determine whether the experts so identified are available; assist managers in identifying employees who may need training opportunities; assist managers in determining what expertise is lost when employees retire or otherwise leave; facilitate the development of new ways of identifying opportunities for innovation and minimizing duplicated effort; assist employees in achieving competitive advantages through the application of knowledge-management concepts and related systems; assist external organizations in requesting speakers for specific engagements or determining from whom they might be able to request help via electronic mail; help foster an environment of collaboration for rapid development in today's environment, in which it is increasingly necessary to assemble teams of experts from government, universities, research laboratories, and industry to quickly solve problems anytime, anywhere; make experts more visible; provide a central repository of information about employees, including information that heretofore has typically not been captured by human-resources systems (e.g., information about past projects, patents, or hobbies); and unify myriad collections of data into a Web-enabled repository that can easily be searched for relevant data.
NASA Astrophysics Data System (ADS)
Martin, Andreas; Emmenegger, Sandro; Hinkelmann, Knut; Thönssen, Barbara
2017-04-01
The accessibility of project knowledge obtained from experience is an important and crucial issue in enterprises. The need for information about project knowledge can differ from one person to another, depending on the roles he or she has. Therefore, a new ontology-based case-based reasoning (OBCBR) approach that utilises an enterprise ontology is introduced in this article to improve the accessibility of this project knowledge. Utilising an enterprise ontology improves the case-based reasoning (CBR) system through the systematic inclusion of enterprise-specific knowledge. This enterprise-specific knowledge is captured using the overall structure given by the enterprise ontology named ArchiMEO, which is a partial ontological realisation of the enterprise architecture framework (EAF) ArchiMate. This ontological representation, containing historical cases and specific enterprise domain knowledge, is applied in a new OBCBR approach. To support the different information needs of different stakeholders, this OBCBR approach has been built in such a way that different views, viewpoints, concerns and stakeholders can be considered. This is realised using a case viewpoint model derived from the ISO/IEC/IEEE 42010 standard. The introduced approach was implemented as a demonstrator and evaluated using an application case elicited from a business partner in a Swiss research project.
Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.
Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner
2016-01-01
Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.
Solvent Effects on the Photothermal Regeneration of CO 2 in Monoethanolamine Nanofluids
Nguyen, Du; Stolaroff, Joshuah; Esser-Kahn, Aaron
2015-11-02
A potential approach to reduce the energy costs associated with carbon capture is to use external and renewable energy sources. The photothermal release of CO 2 from monoethanolamine mediated by nanoparticles is a unique solution to this problem. When combined with light-absorbing nanoparticles, vapor bubbles form inside the capture solution and release the CO 2 without heating the bulk solvent. The mechanism by which CO 2 is released remained unclear, and understanding this process would improve the efficiency of photothermal CO 2 release. Here we report the use of different cosolvents to improve or reduce the photothermal regeneration of CO 2 captured by monoethanolamine. We found that properties that reduce the residence time of the gas bubbles (viscosity, boiling point, and convection direction) can enhance the regeneration efficiencies. The reduction of bubble residence times minimizes the reabsorption of CO 2 back into the capture solvent, where bulk temperatures remain lower than the localized area surrounding the nanoparticle. These properties shed light on the mechanism of release and indicated methods for improving the efficiency of the process. We used this knowledge to develop an improved photothermal CO 2 regeneration system in a continuously flowing setup. Finally, using techniques to reduce residence time in the continuously flowing setup, such as alternative cosolvents and smaller fluid volumes, resulted in regeneration efficiency enhancements of over 200%.
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific, with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge, which in turn improves the conservation of migratory birds. The advances in capture-recapture methodology have primarily benefited gamebird studies, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged and, to maximize the benefits of the methodology, work on practical applications should be increased.
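The abstract does not spell out any estimator, but the classic two-sample case it builds on can be illustrated with Chapman's bias-corrected form of the Lincoln-Petersen estimator of population size (a standard textbook method, shown here as a sketch with made-up numbers):

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimator.

    n1: animals captured and marked in the first sample
    n2: animals captured in the second sample
    m2: marked animals recaptured in the second sample
    Returns the estimated population size.
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical banding study: 200 birds banded, 150 caught later,
# 30 of which carried bands.
n_hat = chapman_estimate(n1=200, n2=150, m2=30)
```

With these numbers the estimate is roughly 978 birds; the Chapman correction avoids the division-by-zero and small-sample bias of the raw n1*n2/m2 form.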
41 CFR 301-71.303 - What data must we capture in our travel advance accounting system?
Code of Federal Regulations, 2010 CFR
2010-07-01
... capture in our travel advance accounting system? 301-71.303 Section 301-71.303 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES AGENCY... must we capture in our travel advance accounting system? You must capture the following data: (a) The...
CI Controls for Energy and Environment
NASA Technical Reports Server (NTRS)
Biondo, Samuel J.
1996-01-01
Computational intelligence (CI) is a rapidly evolving field that utilizes life-imitating metaphors to guide model building, including, but not limited to, neural networks, fuzzy logic, genetic algorithms, artificial life, and hybrid CI paradigms. Although the boundaries between artificial intelligence (AI) and CI are not distinct, their research communities are separate and distinct. CI researchers tend to focus on processing numerical data from sensors, while the AI community generally relies on symbolic computing to capture human knowledge. In both areas, there is a great deal of interest and activity in hybrid systems that can offset the limitations of individual methods, extend their capabilities, and create new capabilities. Examples of the benefits that can accrue from hybrid systems are included.
Capturing domain knowledge from multiple sources: the rare bone disorders use case.
Groza, Tudor; Tudorache, Tania; Robinson, Peter N; Zankl, Andreas
2015-01-01
Lately, ontologies have become a fundamental building block in the process of formalising and storing complex biomedical information. The community-driven ontology curation process, however, ignores the possibility of multiple communities building, in parallel, conceptualisations of the same domain, and thus providing slightly different perspectives on the same knowledge. The individual nature of this effort leads to the need for a mechanism that enables the creation of an overarching and comprehensive overview of the different perspectives on the domain knowledge. We introduce an approach that enables the loose integration of knowledge emerging from diverse sources under a single, coherent, interoperable resource. To accurately track the original knowledge statements, we record provenance at very granular levels. We exemplify the approach in the rare bone disorders domain by proposing the Rare Bone Disorders Ontology (RBDO). Using RBDO, researchers are able to answer queries such as "What phenotypes describe a particular disorder and are common to all sources?" or to understand similarities between disorders based on the divergent groupings (classifications) provided by the underlying sources. RBDO is available at http://purl.org/skeletome/rbdo. In order to support lightweight query and integration, the knowledge captured by RBDO has also been made available as a SPARQL endpoint at http://bio-lark.org/se_skeldys.html.
Important ingredients for health adaptive information systems.
Senathirajah, Yalini; Bakken, Suzanne
2011-01-01
Healthcare information systems frequently do not truly meet clinician needs, due to the complexity, variability, and rapid change in medical contexts. Recently the internet world has been transformed by approaches commonly termed 'Web 2.0'. This paper proposes a Web 2.0 model for a healthcare adaptive architecture. The vision includes creating modular, user-composable systems which aim to make all necessary information from multiple internal and external sources available via a platform, for the user to use, arrange, recombine, author, and share at will, using rich interfaces where advisable. Clinicians can create a set of 'widgets' and 'views' which can transform data, reflect their domain knowledge and cater to their needs, using simple drag and drop interfaces without the intervention of programmers. We have built an example system, MedWISE, embodying the user-facing parts of the model. This approach to HIS is expected to have several advantages, including greater suitability to user needs (reflecting clinician rather than programmer concepts and priorities), incorporation of multiple information sources, agile reconfiguration to meet emerging situations and new treatment deployment, capture of user domain expertise and tacit knowledge, efficiencies due to workflow and human-computer interaction improvements, and greater user acceptance.
Soares, Cássia Baldini; Santos, Vilmar Ezequiel Dos; Campos, Célia Maria Sivalli; Lachtim, Sheila Aparecida Ferreira; Campos, Fernanda Cristina
2011-12-01
From the Marxist perspective on the construction of knowledge, we propose a theoretical and methodological framework for understanding social values by capturing everyday representations. We assume that scientific research brings together different dimensions, epistemological, theoretical, and methodological, which, consistent with one another, propose a set of operating procedures and techniques for capturing and analyzing the reality under study in order to expose the investigated object. The study of values reveals their essential role in the formation of judgments and choices: some values reflect the dominant ideology, spanning all social classes, while others reflect class interests; the latter are not universal but are formed in relationships and social activities. Based on the Marxist theory of consciousness, representations are discursive formulations of everyday life (opinion or conviction) issued by subjects about their reality, offering a coherent way of understanding and exposing social values: focus groups prove suitable for grasping opinions, while interviews show potential for exposing convictions.
Design knowledge capture for a corporate memory facility
NASA Technical Reports Server (NTRS)
Boose, John H.; Shema, David B.; Bradshaw, Jeffrey M.
1990-01-01
Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information results in problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, we studied the problem of building an intelligent, active corporate memory facility that would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for the Space Station Freedom. A tool that helps automate and document engineering trade studies has been demonstrated and extended, and another tool is being developed to help designers interactively explore design alternatives and constraints.
WE-F-BRB-02: Setting the Stage for Incorporation of Toxicity Measures in Treatment Plan Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, C.
2015-06-15
Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models for individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to the radiobiology models that have driven intensity-modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how clinical information can be captured, structured, and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose-volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of the ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize that knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved patient care. Learning Objectives: Understand the role of standard terminologies, ontologies and data organization in oncology. Understand methods to capture clinical toxicity and outcomes in a clinical setting. Understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta and Toshiba for some of the work presented.
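The dose-volume-histogram evaluation mentioned above can be sketched in a few lines. This is a generic cumulative-DVH computation, not the session's own software; the uniform-voxel assumption and the bin width are arbitrary choices for illustration:

```python
import numpy as np

def cumulative_dvh(dose, struct_mask, bin_width=0.1):
    """Cumulative dose-volume histogram for one structure.

    dose: array of voxel doses in Gy (any shape); struct_mask: boolean
    mask of the structure's voxels (uniform voxel volume assumed).
    Returns (dose_bins, volume_fraction), where volume_fraction[i] is
    the fraction of the structure receiving at least dose_bins[i] Gy.
    """
    d = dose[struct_mask]
    bins = np.arange(0.0, d.max() + bin_width, bin_width)
    # Fraction of structure voxels with dose >= each bin edge.
    frac = np.array([(d >= b).mean() for b in bins])
    return bins, frac
```

Metrics such as V20 (fraction of a lung receiving at least 20 Gy) are then simple lookups into this curve, which is the kind of quantity a toxicity model would compare against prior-patient outcomes.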
NASA Technical Reports Server (NTRS)
Topousis, Daria E.; Murphy, Keri; Robinson, Greg
2008-01-01
In 2004, NASA faced major knowledge-sharing challenges due to geographically isolated field centers that inhibited personnel from sharing experiences and ideas. Mission failures and new directions for the agency demanded better collaborative tools. In addition, with the push to send astronauts back to the Moon and on to Mars, NASA recognized that systems engineering would have to improve across the agency. Of the ten field centers, seven had not built a spacecraft in over 30 years and had lost systems engineering expertise. The Systems Engineering Community of Practice came together to capture the knowledge of its members using the suite of collaborative tools provided by the NASA Engineering Network (NEN). The NEN provided a secure collaboration space for over 60 practitioners across the agency to assemble and review a NASA systems engineering handbook. Once the handbook was complete, they used the open community area to disseminate it. This case study explores both the technology and the social networking that made the community possible, describes technological approaches that facilitated rapid setup and low maintenance, provides best practices that other organizations could adopt, and discusses the vision for how this community will continue to collaborate across the field centers to benefit the agency as it continues exploring the solar system.
Code of Federal Regulations, 2014 CFR
2014-07-01
... capture system and add-on control device operating limits during the performance test? 63.3546 Section 63... of key parameters of the valve operating system (e.g., solenoid valve operation, air pressure... minimum operating limit for that specific capture device or system of multiple capture devices. The...
Code of Federal Regulations, 2014 CFR
2014-07-01
... to capture emissions; (3) If a mobile scrubber car that does not capture emissions during travel is... each capture system that uses an electric motor to drive the fan, you must maintain the daily average... (ii) For each capture system that does not use a fan driven by an electric motor, you must maintain...
Can Moral Hazard Be Resolved by Common-Knowledge in S4n-Knowledge?
NASA Astrophysics Data System (ADS)
Matsuhisa, Takashi
This article investigates the relationship between common knowledge and agreement in multi-agent systems, and applies the agreement result based on common knowledge to the principal-agent model under non-partition information. We treat two problems: (1) how we capture, from an epistemic point of view, the fact that the agents agree on an event or reach consensus on it, and (2) how the agreement theorem can make progress toward settling a moral hazard problem in the principal-agent model under non-partition information. We propose a solution program for the moral hazard in the principal-agent model under non-partition information by common knowledge. We assume that the agents have the knowledge structure induced from a reflexive and transitive relation associated with the multi-modal logic S4n. Each agent obtains the membership value of an event under his/her private information, so he/she considers the event as a fuzzy set. Specifically, we consider the situation in which the agents commonly know all membership values of the other agents. In this circumstance we show the agreement theorem that consensus on the membership values among all agents can still be guaranteed. Furthermore, under certain assumptions we show that the moral hazard can be resolved in the principal-agent model when all the expected marginal costs are common knowledge among the principal and agents.
Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza
2017-01-01
Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset).
The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% of missing values. This expansion in the KB coverage allowed solving complex disease diagnostic queries that were previously unresolvable, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms yield a significant performance overhead. We observed that plausible reasoning approaches, by generating tentative inferences and leveraging domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.
NSF's Perspective on Space Weather Research for Building Forecasting Capabilities
NASA Astrophysics Data System (ADS)
Bisi, M. M.; Pulkkinen, A. A.; Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.
2017-12-01
Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. Maturing this knowledge base is a requirement for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advancements in understanding in large-scale physics-based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned programs pertaining to space weather research at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.
ERIC Educational Resources Information Center
Turvey, Keith
2010-01-01
This paper argues that if new communications technologies and online spaces are to yield "new relationship[s] with learners" then research that is tuned to recognize, capture and explain the pedagogical processes at the center of such interactions is vital. This has implications for the design of pedagogical activities within Initial…
Connecting Provenance with Semantic Descriptions in the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Nemani, R. R.
2012-12-01
NASA Earth Exchange (NEX) is a data, modeling and knowledge collaboratory that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform. Among the main goals of NEX are transparency and repeatability, and to that end we have been adding components that enable tracking of provenance of both scientific processes and the datasets produced by these processes. As scientific processes become more complex, they are often developed collaboratively, and it becomes increasingly important for the research team to be able to track the development of the process and the datasets that are produced along the way. Additionally, we want to be able to link the processes and datasets developed on NEX to existing information and knowledge, so that users can query and compare the provenance of any dataset or process with regard to component-specific attributes such as data quality, geographic location, related publications, user comments and annotations, etc. We have developed several ontologies that describe datasets and workflow components available on NEX using the OWL ontology language, as well as a simple ontology that provides a linking mechanism to the collected provenance information. The provenance is captured in two ways: we utilize the existing provenance infrastructure of VisTrails, which is used as a workflow engine on NEX, and we extend the captured provenance using the PROV data model expressed through the PROV-O ontology. We do this in order to link and query the provenance more easily in the context of the existing NEX information and knowledge. The captured provenance graph is processed and stored using RDFlib with a MySQL backend that can be queried using either RDFLib or SPARQL. As a concrete example, we show how this information is captured during an anomaly detection process in large satellite datasets.
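The kind of PROV-style lineage linking described above can be sketched minimally. The example below uses plain Python triples rather than RDFLib so it stays dependency-free; the entity and activity names (nex:ndvi_2012, nex:anomaly_run, nex:anomaly_map) are hypothetical, not from the NEX system itself.

```python
# Minimal, dependency-free sketch of PROV-style provenance linking.
# Names prefixed nex: are hypothetical examples, not real NEX identifiers.
PROV = "prov:"
triples = [
    ("nex:ndvi_2012",   "rdf:type",              PROV + "Entity"),
    ("nex:anomaly_run", "rdf:type",              PROV + "Activity"),
    ("nex:anomaly_map", "rdf:type",              PROV + "Entity"),
    ("nex:anomaly_map", PROV + "wasGeneratedBy", "nex:anomaly_run"),
    ("nex:anomaly_run", PROV + "used",           "nex:ndvi_2012"),
    ("nex:ndvi_2012",   "nex:quality",           "validated"),
]

def lineage(entity):
    """Walk wasGeneratedBy/used edges to find the inputs behind an entity."""
    runs = [o for s, p, o in triples
            if s == entity and p == PROV + "wasGeneratedBy"]
    return [o for s, p, o in triples
            if s in runs and p == PROV + "used"]

print(lineage("nex:anomaly_map"))  # -> ['nex:ndvi_2012']
```

In the actual NEX stack the same graph would live in an RDFLib store and the lineage walk would be a SPARQL query over the PROV-O properties.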
Electronic Portfolios as Capstone Experiences in a Graduate Program in Organizational Leadership
ERIC Educational Resources Information Center
Goertzen, Brent J.; McRay, Jeni; Klaus, Kaley
2016-01-01
Assessment of student learning in graduate education often takes the form of a summative measure by way of written comprehensive exams. However, written examinations, while suitable for evaluating cognitive knowledge, may not fully capture students' abilities to transfer and apply leadership related knowledge and skills into real-world practice.…
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Norvig, Peter (Technical Monitor)
2000-01-01
NASA's ScienceDesk Project at the Ames Research Center is responsible for scientific knowledge management, which includes ensuring the capture, preservation, and traceability of scientific knowledge. Other responsibilities include: 1) maintaining uniform information access, achieved through intelligent indexing and visualization; 2) supporting both asynchronous and synchronous science teamwork; and 3) monitoring and controlling semi-autonomous remote experimentation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.
2010-10-29
The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center, resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image-based searching fused with text-based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.
A top-level ontology of functions and its application in the Open Biomedical Ontologies.
Burek, Patryk; Hoehndorf, Robert; Loebe, Frank; Visagie, Johann; Herre, Heinrich; Kelso, Janet
2006-07-15
A clear understanding of functions in biology is a key component in accurate modelling of molecular, cellular and organismal biology. Using the existing biomedical ontologies it has been impossible to capture the complexity of the community's knowledge about biological functions. We present here a top-level ontological framework for representing knowledge about biological functions. This framework lends greater accuracy, power and expressiveness to biomedical ontologies by providing a means to capture existing functional knowledge in a more formal manner. An initial major application of the ontology of functions is the provision of a principled way in which to curate functional knowledge and annotations in biomedical ontologies. Further potential applications include the facilitation of ontology interoperability and automated reasoning. A major advantage of the proposed implementation is that it is an extension to existing biomedical ontologies, and can be applied without substantial changes to these domain ontologies. The Ontology of Functions (OF) can be downloaded in OWL format from http://onto.eva.mpg.de/. Additionally, a UML profile and supplementary information and guides for using the OF can be accessed from the same website.
The Effectiveness of Classroom Capture Technology
ERIC Educational Resources Information Center
Ford, Maire B.; Burns, Colleen E.; Mitch, Nathan; Gomez, Melissa M.
2012-01-01
The use of classroom capture systems (systems that capture audio and video footage of a lecture and attempt to replicate a classroom experience) is becoming increasingly popular at the university level. However, research on the effectiveness of classroom capture systems in the university classroom has been limited due to the recent development and…
ANCA-associated vasculitis in Greek siblings with chronic exposure to silica.
Brener, Z; Cohen, L; Goldberg, S J; Kaufman, A M
2001-11-01
We present the case of two siblings with similar environmental exposure to silica. Both of them developed perinuclear antineutrophil cytoplasmic antibody (p-ANCA)-associated vasculitis with pulmonary-renal syndrome. p-ANCAs were present with antimyeloperoxidase specificity on capture enzyme-linked immunosorbent assay. Treatment with corticosteroids and cyclophosphamide resulted in resolution of the clinical picture. Chronic exposure to silica is the leading environmental factor associated with ANCA-positive vasculitis. Several clusters of systemic vasculitis have been described. Positive and negative human leukocyte antigens (HLA) have been reported in systemic vasculitis. Affected brothers in our case shared one parental HLA haplotype. To the best of our knowledge, this is the first report of a family cluster of silica-induced, ANCA-associated systemic vasculitis with members sharing some of their HLA antigens.
Distilling free-form natural laws from experimental data.
Schmidt, Michael; Lipson, Hod
2009-04-03
For centuries, scientists have attempted to identify and document analytical laws that underlie physical phenomena in nature. Despite the prevalence of computing power, the process of finding natural laws and their corresponding equations has resisted automation. A key challenge to finding analytic relations automatically is defining algorithmically what makes a correlation in observed data important and insightful. We propose a principle for the identification of nontriviality. We demonstrated this approach by automatically searching motion-tracking data captured from various physical systems, ranging from simple harmonic oscillators to chaotic double-pendula. Without any prior knowledge about physics, kinematics, or geometry, the algorithm discovered Hamiltonians, Lagrangians, and other laws of geometric and momentum conservation. The discovery rate accelerated as laws found for simpler systems were used to bootstrap explanations for more complex systems, gradually uncovering the "alphabet" used to describe those systems.
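The core idea above, scoring candidate expressions by how nearly they are conserved over motion-tracking data, can be illustrated with a toy example. This is a sketch of the invariance criterion only, not the authors' full symbolic-regression search; the oscillator data are simulated.

```python
import numpy as np

# Toy invariance test: given trajectory data, a candidate conservation law
# should be near-constant along the trajectory. Simulated simple harmonic
# oscillator, x'' = -x (unit mass and stiffness).
t = np.linspace(0, 20, 2001)
x, v = np.cos(t), -np.sin(t)

def invariance_score(values):
    """Lower is better: relative variation of a candidate conserved quantity."""
    return np.std(values) / (np.abs(np.mean(values)) + 1e-12)

candidates = {
    "v**2/2 + x**2/2": v**2 / 2 + x**2 / 2,  # true energy: conserved
    "x*v":             x * v,                 # not conserved
}
best = min(candidates, key=lambda k: invariance_score(candidates[k]))
print("best candidate law:", best)  # -> v**2/2 + x**2/2
```

The full method searches the space of expressions automatically; here the two candidates are supplied by hand to show how the conserved one wins.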
NASA Astrophysics Data System (ADS)
Zhou, Yuzhi; Wang, Han; Liu, Yu; Gao, Xingyu; Song, Haifeng
2018-03-01
The Kerker preconditioner, based on the dielectric function of homogeneous electron gas, is designed to accelerate the self-consistent field (SCF) iteration in the density functional theory calculations. However, a question still remains regarding its applicability to the inhomogeneous systems. We develop a modified Kerker preconditioning scheme which captures the long-range screening behavior of inhomogeneous systems and thus improves the SCF convergence. The effectiveness and efficiency is shown by the tests on long-z slabs of metals, insulators, and metal-insulator contacts. For situations without a priori knowledge of the system, we design the a posteriori indicator to monitor if the preconditioner has suppressed charge sloshing during the iterations. Based on the a posteriori indicator, we demonstrate two schemes of the self-adaptive configuration for the SCF iteration.
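The standard Kerker idea that the modified scheme builds on can be sketched in a few lines: damp the long-wavelength (small-q) part of the density residual in reciprocal space by the factor q²/(q² + q₀²), which is what suppresses charge sloshing. Grid size, box length, and q₀ below are illustrative choices, not values from the paper.

```python
import numpy as np

# Sketch of standard Kerker preconditioning on a 1-D residual via FFT.
n = 256
L = 50.0                      # box length (arbitrary units, illustrative)
q = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
q0 = 1.0                      # Kerker screening wavevector (illustrative)

def kerker_precondition(residual):
    r_hat = np.fft.fft(residual)
    factor = q**2 / (q**2 + q0**2)   # -> 0 as q -> 0: damps sloshing modes
    return np.real(np.fft.ifft(factor * r_hat))

x = np.linspace(0, L, n, endpoint=False)
residual = np.sin(2 * np.pi * x / L) + 0.1 * np.sin(40 * np.pi * x / L)
damped = kerker_precondition(residual)
# the long-wavelength component is strongly damped, the short one kept
```

The modified scheme in the paper replaces this homogeneous-gas dielectric factor with one that captures the long-range screening of inhomogeneous systems such as slabs and interfaces.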
XML-Based SHINE Knowledge Base Interchange Language
NASA Technical Reports Server (NTRS)
James, Mark; Mackey, Ryan; Tikidjian, Raffi
2008-01-01
The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.
Do different attention capture paradigms measure different types of capture?
Roque, Nelson A; Wright, Timothy J; Boot, Walter R
2016-10-01
When something captures our attention, why does it do so? This topic has been hotly debated, with some arguing that attention is captured only by salient stimuli (bottom-up view) and others arguing capture is always due to a match between a stimulus and our goals (top-down view). Many different paradigms have provided evidence for 1 view or the other. If either of these strong views are correct, then capture represents a unitary phenomenon, and there should be a high correlation between capture in these paradigms. But if there are different types of capture (top-down, bottom-up), then some attention capture effects should be correlated and some should not. In 2 studies, we collected data from several paradigms used in support of claims of top-down and bottom-up capture in relatively large samples of participants. Contrary to either prediction, measures of capture were not strongly correlated. Results suggest that capture may in fact be strongly determined by idiosyncratic task demands and strategies. Relevant to this lack of relations among tasks, we observed that classic measures of attention capture demonstrated low reliability, especially among measures used to support bottom-up capture. Implications for the low reliability of capture measures are discussed. We also observed that the proportion of participants demonstrating a pattern of responses consistent with capture varied widely among classic measures of capture. Overall, results demonstrate that, even for relatively simple laboratory measures of attention, there are still important gaps in knowledge regarding what these paradigms measure and how they are related.
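The reliability problem raised above can be made concrete with a minimal sketch: split-half reliability of a per-subject capture effect (e.g. an RT difference score), corrected with the Spearman-Brown formula. The subject and trial counts and effect sizes below are simulated, not the study's data.

```python
import numpy as np

# Split-half reliability of a simulated capture effect (ms), with
# Spearman-Brown correction. All numbers are illustrative.
rng = np.random.default_rng(1)
n_subj, n_trials = 60, 100
true_capture = rng.normal(30, 15, n_subj)             # per-subject effect, ms
trials = true_capture[:, None] + rng.normal(0, 120, (n_subj, n_trials))

odd = trials[:, 0::2].mean(axis=1)
even = trials[:, 1::2].mean(axis=1)
r_half = np.corrcoef(odd, even)[0, 1]
r_sb = 2 * r_half / (1 + r_half)                      # Spearman-Brown
print(f"split-half r = {r_half:.2f}, corrected = {r_sb:.2f}")
```

Low reliability of such difference scores attenuates correlations between paradigms, which is one reason the cross-task correlations reported above may be weak even if capture were partly unitary.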
The genetics of anxiety-related negative valence system traits.
Savage, Jeanne E; Sawyers, Chelsea; Roberson-Nay, Roxann; Hettema, John M
2017-03-01
NIMH's Research Domain Criteria (RDoC) domain of negative valence systems (NVS) captures constructs of negative affect such as fear and distress traditionally subsumed under the various internalizing disorders. Through its aims to capture dimensional measures that cut across diagnostic categories and are linked to underlying neurobiological systems, a large number of phenotypic constructs have been proposed as potential research targets. Since "genes" represent a central "unit of analysis" in the RDoC matrix, it is important for studies going forward to apply what is known about the genetics of these phenotypes as well as fill in the gaps of existing knowledge. This article reviews the extant genetic epidemiological data (twin studies, heritability) and molecular genetic association findings for a broad range of putative NVS phenotypic measures. We find that scant genetic epidemiological data is available for experimentally derived measures such as attentional bias, peripheral physiology, or brain-based measures of threat response. The molecular genetic basis of NVS phenotypes is in its infancy, since most studies have focused on a small number of candidate genes selected for putative association to anxiety disorders (ADs). Thus, more research is required to provide a firm understanding of the genetic aspects of anxiety-related NVS constructs. © 2016 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Buquo, Lynn; Johnson-Throop, Kathy
2010-01-01
NASA's Human Research Program (HRP) and Space Life Sciences Directorate (SLSD), not unlike many NASA organizations today, struggle with the inherent inefficiencies caused by dependencies on heterogeneous data systems and silos of data and information spread across decentralized discipline domains. The capture of operational and research-based data/information (both in-flight and ground-based) in disparate IT systems impedes the extent to which that data/information can be efficiently and securely shared, analyzed, and enriched into knowledge that directly and more rapidly supports HRP's research-focused human system risk mitigation efforts and SLSD's operationally oriented risk management efforts. As a result, an integrated effort is underway to more fully understand and document how specific sets of risk-related data/information are generated and used and in what IT systems that data/information currently resides. By mapping the risk-related data flow from raw data to useable information and knowledge (think of it as the data supply chain), HRP and SLSD are building an information architecture plan to leverage their existing, shared IT infrastructure. In addition, it is important to create a centralized structured tool to represent risks including attributes such as likelihood, consequence, contributing factors, and the evidence supporting the information in all these fields. Representing the risks in this way enables reasoning about the risks, e.g. revisiting a risk assessment when a mitigation strategy is unavailable, updating a risk assessment when new information becomes available, etc. Such a system also provides a concise way to communicate the risks both within the organization as well as with collaborators. Understanding and, hence, harnessing the human system risk-related data supply chain enhances both organizations' abilities to securely collect, integrate, and share data assets that improve human system research and operations.
Economic and energetic analysis of capturing CO2 from ambient air
House, Kurt Zenz; Baclig, Antonio C.; Ranjan, Manya; van Nierop, Ernst A.; Wilcox, Jennifer; Herzog, Howard J.
2011-01-01
Capturing carbon dioxide from the atmosphere ("air capture") in an industrial process has been proposed as an option for stabilizing global CO2 concentrations. Published analyses suggest these air capture systems may cost a few hundred dollars per tonne of CO2, making them cost-competitive with mainstream CO2 mitigation options like renewable energy, nuclear power, and carbon dioxide capture and storage from large CO2-emitting point sources. We investigate the thermodynamic efficiencies of commercial separation systems as well as trace gas removal systems to better understand and constrain the energy requirements and costs of these air capture systems. Our empirical analyses of operating commercial processes suggest that the energetic and financial costs of capturing CO2 from the air are likely to have been underestimated. Specifically, our analysis of existing gas separation systems suggests that, unless air capture significantly outperforms these systems, it is likely to require more than 400 kJ of work per mole of CO2, requiring it to be powered by CO2-neutral power sources in order to be CO2 negative. We estimate that total system costs of an air capture system will be on the order of $1,000 per tonne of CO2, based on experience with as-built large-scale trace gas removal systems. PMID:22143760
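The scale of the 400 kJ/mol figure above can be checked with a back-of-envelope calculation: the idealized (reversible) separation work for a dilute gas at mole fraction x is roughly RT ln(1/x) per mole captured. This is a simplified thermodynamic floor, not the authors' full analysis.

```python
import math

# Idealized minimum work to extract CO2 from air at ~400 ppm, versus the
# >400 kJ/mol the authors estimate real systems are likely to require.
R = 8.314          # gas constant, J/(mol K)
T = 298.15         # ambient temperature, K
x_co2 = 400e-6     # ambient CO2 mole fraction

w_min = R * T * math.log(1 / x_co2) / 1000.0   # kJ per mol CO2 (ideal floor)
print(f"thermodynamic minimum: {w_min:.1f} kJ/mol")
print(f"estimated real requirement / minimum: {400 / w_min:.0f}x")
```

The ideal floor comes out near 20 kJ/mol, so the paper's empirical estimate corresponds to roughly twenty times the reversible limit, which is consistent with second-law efficiencies observed in as-built trace gas removal systems.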
40 CFR 63.3965 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2010 CFR
2010-07-01
...; coating solvent flash-off, curing, and drying occurs within the capture system; and the removal or... spray booth and a curing oven. (b) Measuring capture efficiency. If the capture system does not meet... surface preparation activities and drying and curing time. (c) Liquid-to-uncaptured-gas protocol using a...
Knowledge Management in healthcare libraries: the current picture.
Hopkins, Emily
2017-06-01
Knowledge management has seen something of a resurgence in attention amongst health librarians recently. Of course it has never ceased to exist, but now many library staff are becoming more involved in organisational knowledge management and positioning themselves as key players in the sphere. No single model of knowledge management predominates; instead, staff adopt approaches that best fit their organisation's size, structure and culture, blending evidence-based practice with knowledge sharing. Whatever it is called and whatever models are used, it is clear that for librarians and information professionals, putting knowledge and evidence into practice, sharing knowledge well and capturing it effectively remain central to what we will continue to do. © 2017 Health Libraries Group.
Intelligent nursing: accounting for knowledge as action in practice.
Purkis, Mary E; Bjornsdottir, Kristin
2006-10-01
This paper provides an analysis of nursing as a knowledgeable discipline. We examined ways in which knowledge operates in the practice of home care nursing and explored how knowledge might be fruitfully understood within the ambiguous spaces and competing temporalities characterizing contemporary healthcare services. Two popular metaphors of knowledge in nursing practice were identified and critically examined; evidence-based practice and the nurse as an intuitive worker. Pointing to faults in these conceptualizations, we suggest a different way of conceptualizing the relationship between knowledge and practice, namely practice as being activated by contextualized knowledge. This conceptualization is captured in an understanding of the intelligent creation of context by the nurse for nursing practice to be ethical and effective.
Shoo, Rumishael; Matuku, Willy; Ireri, Jane; Nyagero, Josephat; Gatonga, Patrick
2012-01-01
Introduction AMREF (African Medical and Research Foundation) developed a Knowledge Management Strategy that focused on creating, capturing and applying health knowledge to close the gap between communities and health systems in Africa. There was need to identify AMREF's current Knowledge Management implementation status, problems and constraints encountered after two years of enforcement of the strategy and suggest the way forward. Methods This study was conducted between October 2011 and February 2012. Quantitative data on number and foci of AMREF research publications were collected using a questionnaire. Focus group discussions and in-depth interviews were used to gather data on explanations for the trend of publications and the status of the implementation of the 2010-2014 Knowledge Management Strategy. Quantitative data was analysed using SPSS computer software whereas content analysis of themes was employed on qualitative data. Results Between 1960 and 2011, AMREF produced 257 peer reviewed publications, 158 books and manuals and about 1,188 technical publications including evaluations, guidelines and technical reports. However, the numbers of publications declined from around the year 2000. Large quantities of unpublished and unclassified materials are also in the custody of Heritage. Barriers to Knowledge Management included: lack of incentives for documentation and dissemination; limited documentation and use of good practices in programming; and superficial attention to results or use of evidence. Conclusion Alternative ways of reorganizing Knowledge Management will enable AMREF to use evidence-based knowledge to advocate for appropriate changes in African health policies and practices. PMID:23467647
A statistical approach to root system classification
Bodner, Gernot; Leitner, Daniel; Nakhforoosh, Alireza; Sobotik, Monika; Moder, Karl; Kaul, Hans-Peter
2013-01-01
Plant root systems have a key role in ecology and agronomy. In spite of a fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for "plant functional type" identification in ecology can be applied to the classification of root systems. The classification method presented is based on a data-defined statistical procedure without a priori decisions on the classifiers. The study demonstrates that principal-component-based rooting types provide efficient and meaningful multi-trait classifiers. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by field data. Rooting types emerging from measured data were mainly distinguished into diameter/weight-dominated and density-dominated types. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We conclude that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity, efforts in architectural measurement techniques are essential. PMID:23914200
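The principal-component procedure the abstract describes can be sketched in a few lines. Everything below (trait names, group means, cluster structure) is invented for illustration, not data from the study:

```python
# Hypothetical sketch of data-defined rooting-type classification:
# standardize a trait matrix, project onto principal components, and
# check that the leading component separates simulated rooting types.
import numpy as np

rng = np.random.default_rng(0)

# Rows = root systems, columns = morphological traits
# (e.g. mean diameter, total length, branching density, rooting depth).
traits = np.vstack([
    rng.normal([0.8, 120, 5, 30], [0.05, 10, 0.5, 3], size=(10, 4)),  # "thick/sparse" type
    rng.normal([0.3, 400, 20, 60], [0.05, 30, 2, 5], size=(10, 4)),   # "fine/dense" type
])

# Standardize, then PCA via SVD of the centered/scaled matrix.
z = (traits - traits.mean(0)) / traits.std(0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
scores = z @ vt.T  # principal-component scores per root system

# The first component separates the two simulated rooting types:
pc1 = scores[:, 0]
print(pc1[:10].mean() * pc1[10:].mean() < 0)  # opposite-signed group means → True
```

In the paper's setting the component scores would then feed a clustering step to delimit the rooting-type groups.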
The Climate Services Partnership (CSP): Working Together to Improve Climate Services Worldwide
NASA Astrophysics Data System (ADS)
Zebiak, S.; Brasseur, G.; Members of the CSP Coordinating Group
2012-04-01
Throughout the world, climate services are required to address urgent needs for climate-informed decision-making, policy and planning. These needs were explored in detail at the first International Conference on Climate Services (ICCS), held in New York in October 2011. After lengthy discussions of needs and capabilities, the conference culminated in the creation of the Climate Services Partnership (CSP). The CSP is an informal interdisciplinary network of climate information users, providers, donors and researchers interested in improving the provision and development of climate services worldwide. Members of the Climate Services Partnership work together to share knowledge, accelerate learning, develop new capacities, and establish good practices. These collaborative efforts will inform and support the evolution and implementation of the Global Framework for Climate Services. The Climate Services Partnership focuses its efforts on three levels. These include: 1. encouraging and sustaining connections between climate information providers, users, donors, and researchers 2. gathering, synthesizing and disseminating current knowledge on climate services by way of an online knowledge management platform 3. generating new knowledge on critical topics in climate service development and provision, through the creation of focused working groups on specific topics To date, the Climate Services Partnership has made progress on all three fronts. Connections have been fostered through outreach at major international conferences and professional societies. The CSP also maintains a website and a monthly newsletter, which serves as a resource for those interested in climate services. The second International Conference on Climate Services (ICCS2) will be held in Berlin in September. The CSP has also created a knowledge capture system that gathers and disseminates a wide range of information related to the development and provision of climate services. 
This includes an online searchable database that allows users to see what climate services activities are underway in what locations, and to gather and analyze information. As part of the knowledge capture system, more than 10 CSP members are currently developing case studies to describe specific climate services activities; in a few cases, this involves in-depth evaluations of the service in question. Finally, the Economics Working Group of the Climate Services Partnership is analyzing previous methods of economically valuing climate services, in hopes of generating new knowledge regarding which methods are best suited to assessing the benefits associated with various climate services. Other groups are working to develop guidance materials for the development and use of climate information to support decision- and policy-making. The Climate Services Partnership is an open, informal network that builds on activities that are already underway and works to create synergies to improve the provision and development of climate services. Its members currently number more than 50 organizations; it seeks new participants and new initiatives.
Surveillance Systems to Track and Evaluate Obesity Prevention Efforts.
Hoelscher, Deanna M; Ranjit, Nalini; Pérez, Adriana
2017-03-20
To address the obesity epidemic, the public health community must develop surveillance systems that capture data at levels through which obesity prevention efforts are conducted. Current systems assess body mass index (BMI), diet, and physical activity behaviors at the individual level, but environmental and policy-related data are often lacking. The goal of this review is to describe US surveillance systems that evaluate obesity prevention efforts within the context of international trends in obesity monitoring, to identify potential data gaps, and to present recommendations to improve the evaluation of population-level initiatives. Our recommendations include adding environmental and policy measures to surveillance efforts with a focus on addressing underserved populations, harmonizing existing surveillance systems, including more sensitive measures of obesity outcomes, and developing a knowledgeable workforce. In addition, the widespread use of electronic health records and new technologies that allow self-quantification of behaviors offers opportunities for innovative surveillance methods.
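The individual-level measure at the center of these surveillance systems, body mass index, is a simple ratio; a minimal sketch using the standard WHO adult cut-points (the example values are illustrative):

```python
# BMI = weight (kg) / height (m) squared, with WHO adult categories.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

print(round(bmi(70, 1.75), 1))      # 22.9
print(bmi_category(bmi(70, 1.75)))  # normal
```

The review's point is that data like this exist at the individual level, while the environmental and policy measures surrounding it are what current systems lack.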
An Informatics Blueprint for Healthcare Quality Information Systems
Niland, Joyce C.; Rouse, Layla; Stahl, Douglas C.
2006-01-01
There is a critical gap in our nation's ability to accurately measure and manage the quality of medical care. A robust healthcare quality information system (HQIS) has the potential to address this deficiency through the capture, codification, and analysis of information about patient treatments and related outcomes. Because non-technical issues often present the greatest challenges, this paper provides an overview of these socio-technical issues in building a successful HQIS, including the human, organizational, and knowledge management (KM) perspectives. Through an extensive literature review and direct experience in building a practical HQIS (the National Comprehensive Cancer Network Outcomes Research Database system), we have formulated an “informatics blueprint” to guide the development of such systems. While the blueprint was developed to facilitate healthcare quality information collection, management, analysis, and reporting, the concepts and advice provided may be extensible to the development of other types of clinical research information systems. PMID:16622161
Ahmed, S. Sohail; Oviedo-Orta, Ernesto; Mekaru, Sumiko R.; Freifeld, Clark C.; Tougas, Gervais; Brownstein, John S.
2015-01-01
Background While formal reporting, surveillance, and response structures remain essential to protecting public health, a new generation of freely accessible, online, real-time informatics tools for disease tracking is expanding the ability to raise earlier public awareness of emerging disease threats. The rationale for this study is to test the hypothesis that the HealthMap informatics tools can complement epidemiological data captured by traditional surveillance monitoring systems for meningitis due to Neisseria meningitidis (N. meningitidis) by highlighting severe transmissible disease activity and outbreaks in the United States. Methods Annual analyses of N. meningitidis disease alerts captured by HealthMap were compared to epidemiological data captured by the Centers for Disease Control's Active Bacterial Core surveillance (ABCs) for N. meningitidis. Morbidity and mortality case reports were measured annually from 2010 to 2013 (HealthMap) and 2005 to 2012 (ABCs). Findings HealthMap N. meningitidis monitoring captured 80-90% of alerts as diagnosed N. meningitidis, 5-20% of alerts as suspected cases, and 5-10% of alerts as related news articles. HealthMap disease alert activity for emerging disease threats related to N. meningitidis was in agreement with patterns identified historically using traditional surveillance systems. HealthMap's strength lies in its ability to provide a cumulative "snapshot" of weak signals that allows for rapid dissemination of knowledge and earlier public awareness of potential outbreak status while formal testing and confirmation for specific serotypes is ongoing by public health authorities. Conclusions The underreporting of disease cases in internet-based data streams makes any comparison to the epidemiological trends illustrated by the more comprehensive ABCs network published by the Centers for Disease Control inadequate.
However, the expected delays in compiling confirmatory reports by traditional surveillance systems (at the time of writing, ABCs data for 2013 were listed as provisional) emphasize the usefulness of real-time internet-based data streaming to quickly fill gaps, including the visualization of modes of disease transmission in outbreaks for better resource and action planning. HealthMap can also contribute as an internet-based monitoring system by providing a real-time channel for patients to report intervention-related failures. PMID:25992552
Typhoon Usagi approaching China
2017-12-08
On Saturday, Sept. 21, TRMM captured rainfall data on Typhoon Usagi as it passed between the northern Philippines and southern Taiwan. TRMM found rain falling at a rate of over 134 mm/hr (~5.2 inches) in Usagi's eye wall. Credit: SSAI/NASA, Hal Pierce NASA image use policy. NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.
Typhoon Usagi approaching China
2013-09-23
The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite captured this image of Typhoon Usagi on Sept. 22 at 02:45 UTC/Sept. 21 at 10:45 p.m. EDT on its approach to a landfall in China. Credit: NASA Goddard MODIS Rapid Response Team
ScienceOrganizer System and Interface Summary
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Norvig, Peter (Technical Monitor)
2001-01-01
ScienceOrganizer is a specialized knowledge management tool designed to enhance the information storage, organization, and access capabilities of distributed NASA science teams. Users access ScienceOrganizer through an intuitive Web-based interface that enables them to upload, download, and organize project information - including data, documents, images, and scientific records associated with laboratory and field experiments. Information in ScienceOrganizer is "threaded", or interlinked, to enable users to locate, track, and organize interrelated pieces of scientific data. Linkages capture important semantic relationships among information resources in the repository, and these assist users in navigating through the information related to their projects.
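The "threading" idea, typed links among project information resources, can be illustrated with a toy structure. This is a sketch of the concept only; it is not ScienceOrganizer's actual data model, and all resource and relation names are invented:

```python
# Toy "threaded repository": typed links between resources let users
# navigate from one piece of scientific data to its related pieces.
links = {}  # (source, relation) -> list of target resources

def link(src: str, relation: str, dst: str) -> None:
    links.setdefault((src, relation), []).append(dst)

# A few interlinked records for a hypothetical field experiment:
link("image42.png", "produced_by", "experiment7")
link("experiment7", "described_in", "protocol.pdf")
link("experiment7", "has_data", "run7.csv")

def related(resource: str, relation: str) -> list:
    """Follow one semantic relationship out of a resource."""
    return links.get((resource, relation), [])

print(related("experiment7", "has_data"))  # ['run7.csv']
```

The point of such semantic links is navigation: a user who finds one record can traverse the relations to locate everything connected to it.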
Happy Summer Solstice Northern Hemisphere
2017-12-08
This full-disk image from NOAA's GOES-13 satellite was captured at 11:45 UTC (7:45 a.m. EDT) and shows the Americas on June 21, 2012. This date marks the start of astronomical summer in the northern hemisphere, making it the longest day of the year!
Low clouds over the Yellow Sea and the East China Sea
2017-12-08
Low clouds over the Yellow Sea and the East China Sea were captured by the MODIS instrument on the Aqua satellite on April 1, 2016 at 4:55 UTC. Credit: NASA/Goddard/Jeff Schmaltz/MODIS Land Rapid Response Team
NASA Sees Severe Weather from Central to Eastern US
2017-12-08
Suomi NPP captured this true-color image of the storms over the Midwest and US South on April 30, 2017. This image comes from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument on the Suomi NPP satellite. Credit: NASA/NOAA/NPP/VIIRS
2017-12-08
Cloud vortices off Heard Island, south Indian Ocean. The Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Aqua satellite captured this true-color image of sea ice off Heard Island on Nov 2, 2015 at 5:02 AM EST (09:20 UTC). Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team
Application of future remote sensing systems to irrigation
NASA Technical Reports Server (NTRS)
Miller, L. D.
1982-01-01
Area estimates of irrigated crops and knowledge of crop type are required for modeling water consumption to assist farmers, rangers, and agricultural consultants in scheduling irrigation for distributed management of crop yields. Information on canopy physiology and soil moisture status on a spatial basis is potentially available from remote sensors, so the questions to be addressed relate to: (1) timing (data frequency, instantaneous and integrated measurement) and scheduling (widely distributed spatial demands); (2) spatial resolution; (3) radiometric and geometric accuracy and geoencoding; and (4) information/data distribution. The latter should provide overnight turnaround, with no central storage, onsite capture, and low cost.
2017-12-08
NOAA's GOES-East satellite captured a visible image of the storm on Sunday, Oct. 18 at 1145 UTC (7:45 a.m. EDT) that showed it in the North Atlantic, blanketing eastern Canada and stretching east over open waters. Credit: NOAA/NASA GOES Project
NASA Astrophysics Data System (ADS)
Akram, Muhammad Farooq Bin
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is very critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in the case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, when experts are otherwise forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process.
The proposed technique seeks to offset some of the problems faced when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher order technology interactions. A test case for quantification of epistemic uncertainty on a large-scale problem, a combined cycle power generation system, was selected, and a detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge as compared to deterministic or probability theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher order technology interactions and improved the predicted system performance.
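Dempster-Shafer combination, one of the evidence-theory tools the dissertation proposes for epistemic uncertainty, can be sketched as follows. The frame of discernment and the two experts' mass assignments below are invented for illustration:

```python
# Dempster's rule of combination over a toy frame {"low", "high"}:
# intersect focal elements, pool the products of masses, and
# renormalize away the conflicting (empty-intersection) mass.
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Combine two basic mass assignments via Dempster's rule."""
    raw = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in raw.items()}

A, B = frozenset({"low"}), frozenset({"high"})
m1 = {A: 0.6, A | B: 0.4}  # expert 1: 0.6 on "low", 0.4 undecided
m2 = {B: 0.5, A | B: 0.5}  # expert 2: 0.5 on "high", 0.5 undecided
m = combine(m1, m2)
print(round(m[A], 3))  # 0.429
```

Unlike forcing each expert to commit to a probability distribution, mass left on the whole frame ({"low", "high"}) explicitly represents "don't know", which is the appeal of the evidence-theory formulation.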
Study on data acquisition system based on reconfigurable cache technology
NASA Astrophysics Data System (ADS)
Zhang, Qinchuan; Li, Min; Jiang, Jun
2018-03-01
Waveform capture rate is one of the key features of digital acquisition systems, representing the waveform processing capability of the system per unit time. The higher the waveform capture rate, the greater the chance of capturing elusive events and the more reliable the test result. First, this paper analyzes the impact of several factors on the waveform capture rate of the system; a novel technology based on a reconfigurable cache is then proposed to optimize the system architecture. The simulation results show that the signal-to-noise ratio of the signal and the capacity and structure of the cache have significant effects on the waveform capture rate. Finally, the technology is demonstrated in engineering practice, and the results show that the waveform capture rate of the system is improved substantially without a significant increase in the system's cost; the proposed technology has broad application prospects.
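The figure of merit here, waveform capture rate, can be modeled crudely as waveforms processed per second, limited by the dead time between acquisitions that a faster cache shortens. The numbers below are illustrative assumptions, not values from the paper:

```python
# Toy model: for back-to-back triggering, the capture rate is bounded by
# the acquisition window plus the per-waveform dead time (readout,
# processing).  Shrinking dead time is what a better cache buys you.
def capture_rate(window_s: float, dead_time_s: float) -> float:
    """Waveforms captured per second."""
    return 1.0 / (window_s + dead_time_s)

# Same 1 us acquisition window; the cache-optimized path has less dead time:
slow = capture_rate(1e-6, 9e-6)  # ~100,000 wfm/s
fast = capture_rate(1e-6, 1e-6)  # ~500,000 wfm/s
print(fast > slow)  # True
```

The model makes the paper's framing concrete: the capture rate improves without touching the acquisition window itself, only the architecture around it.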
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagos, L.; Upadhyay, H.; Shoffner, P.
2013-07-01
Deactivation and decommissioning (D and D) work is a high risk and technically challenging enterprise within the U.S. Department of Energy complex. During the past three decades, the DOE's Office of Environmental Management has been in charge of carrying out one of the largest environmental restoration efforts in the world: the cleanup of the Manhattan Project legacy. In today's corporate world, worker experiences and knowledge that have developed over time represent a valuable corporate asset. The ever-dynamic workplace, coupled with an aging workforce, presents corporations with the ongoing challenge of preserving work-related experiences and knowledge for cross-generational knowledge transfer to the future workforce [5]. To prevent the D and D knowledge base and expertise from being lost over time, the DOE and the Applied Research Center at Florida International University (FIU) have developed the web-based Knowledge Management Information Tool (KM-IT) to capture and maintain this valuable information in a universally available, easily accessible and usable system. The D and D KM-IT was developed in collaboration with DOE Headquarters (HQ), the Energy Facility Contractors Group (EFCOG), and the ALARA [as low as reasonably achievable] Centers at Savannah River Sites to preserve the D and D information generated and collected by the D and D community. This is an open, secured system that can be accessed from https://www.dndkm.org over the web and through mobile devices at https://m.dndkm.org. This knowledge system serves as a centralized repository and provides a common interface for D and D-related activities. It also improves efficiency by reducing the need to rediscover knowledge and promotes the reuse of existing knowledge. It is a community-driven system that facilitates the gathering, analyzing, storing, and sharing of knowledge and information within the D and D community.
It assists the DOE D and D community in identifying potential solutions to their problem areas by using the vast resources and knowledge base available throughout the global D and D community. The D and D KM-IT offers a mechanism to the global D and D community for searching relevant D and D information and is focused on providing a single point of access into the collective knowledge base of the D and D community within and outside of the DOE. Collecting information from subject matter specialists, it builds a knowledge repository for future reference, archiving Lessons Learned, Best Practices, ALARA reports, and other relevant documents, and maintains a secured collaboration platform for the global D and D community to share knowledge. With the dynamic nature and evolution of the D and D knowledge base due to multiple factors such as changes in the workforce, new technologies and methodologies, economics, and regulations, the D and D KM-IT is being developed in a phased and modular fashion.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2014-01-01
The NASA U.S. spacesuit knowledge capture (KC) program has been in operation since the beginning of 2008. The program was designed to provide engineers and others with historical information about spacesuits. A multitude of seminars have captured spacesuit history and knowledge over the last six years of the program's existence. Subject matter experts have provided lectures and were interviewed to help bring the spacesuit to life so that lessons learned will never be lost. As well, the program concentrated on reaching out to the public and industry by making the recorded events part of the public domain through the NASA technical library via YouTube media. The U.S. spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle and International Space Station Programs. As well, expert engineers and scientists have shared their challenges and successes to be remembered. The last few years have been some of the most successful of the KC program's life, with numerous recordings and releases to the public, as evidenced by the thousands who have viewed the recordings online. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. This paper also communicates ways to access the events, which are available internally to NASA as well as in the public domain.
Enabling the Capture and Sharing of NASA Technical Expertise Through Communities of Practice
NASA Technical Reports Server (NTRS)
Topousis, Daria E.; Dennehy, Cornelius J.; Lebsock, Kenneth L.
2011-01-01
Historically, engineers at the National Aeronautics and Space Administration (NASA) had few opportunities or incentives to share their technical expertise across the Agency. Its center- and project-focused culture often meant that knowledge never left organizational and geographic boundaries. With increasingly complex missions, the closeout of the Shuttle Program, and a new generation entering the workforce, developing a knowledge-sharing culture became critical. To address this need, the Office of the Chief Engineer established communities of practice on the NASA Engineering Network. These communities were strategically aligned with NASA's core competencies in such disciplines as avionics, flight mechanics, life support, propulsion, structures, loads and dynamics, human factors, and guidance, navigation, and control. This paper describes the process used to identify and develop communities, from establishing simple websites that compiled discipline-specific resources to fostering a knowledge-sharing environment through collaborative and interactive technologies. It includes qualitative evidence of improved availability and transfer of knowledge. It focuses on pivotal capabilities that increased knowledge exchange such as a custom-made Ask An Expert system, community contact lists, publication of key resources, and submission forms that allowed any user to propose content for the sites. It discusses the peer relationships that developed through the communities and the leadership and infrastructure that made them possible.
Interactions between Knowledge and Testimony in Children's Reality-Status Judgments
ERIC Educational Resources Information Center
Lopez-Mobilia, Gabriel; Woolley, Jacqueline D.
2016-01-01
In 2 studies, we attempted to capture the information-processing abilities underlying children's reality-status judgments. Forty 5- to 6-year-olds and 53 7- to 8-year-olds heard about novel entities (animals) that varied in their fit with children's world knowledge. After hearing about each entity, children could either guess reality status…
ERIC Educational Resources Information Center
Mooss, Angela; Brock-Getz, Petra; Ladner, Robert; Fiano, Theresa
2013-01-01
Objective: The aim of this study was to examine the relationships between health literacy, knowledge of health status, and human immunodeficiency virus/acquired immune deficiency syndrome (HIV/AIDS) transmission beliefs among recipients of Ryan White care. Design: Quota and convenience sampled, quantitative analysis captured with closed and…
Performance Factors Analysis -- A New Alternative to Knowledge Tracing
ERIC Educational Resources Information Center
Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.
2009-01-01
Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…
ERIC Educational Resources Information Center
Becher, Ayelet; Orland-Barak, Lily
2016-01-01
This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…
Neutron capture cross sections of Kr
NASA Astrophysics Data System (ADS)
Fiebiger, Stefan; Baramsai, Bayarbadrakh; Couture, Aaron; Krtička, Milan; Mosby, Shea; Reifarth, René; O'Donnell, John; Rusev, Gencho; Ullmann, John; Weigand, Mario; Wolf, Clemens
2018-01-01
Neutron capture and β⁻ decay are competing branches of the s-process nucleosynthesis path at 85Kr [1], which makes it an important branching point. Knowledge of its neutron capture cross section is therefore essential to constrain stellar models of nucleosynthesis. Despite its importance for different fields, no direct measurement of the cross section of 85Kr in the keV regime has been performed, and the currently reported uncertainties are still on the order of 50% [2, 3]. Neutron capture cross section measurements on a 4% enriched 85Kr gas enclosed in a stainless steel cylinder were performed at Los Alamos National Laboratory (LANL) using the Detector for Advanced Neutron Capture Experiments (DANCE). 85Kr is a radioactive isotope with a half-life of 10.8 years. Because this was a low-enrichment sample, the main contaminants, the stable krypton isotopes 83Kr and 86Kr, were also investigated; for these measurements the material was highly enriched and contained in pressurized stainless steel spheres.
Applying Knowledge Management to an Organization's Transformation
NASA Technical Reports Server (NTRS)
Potter, Shannon; Gill, Tracy; Fritsche, Ralph
2008-01-01
Although workers in the information age have more information at their fingertips than ever before, effectively capturing and reusing actual knowledge remains a daunting challenge for many organizations. As high-tech organizations move from providing complex products and services in an established domain to providing them in new domains, knowledge becomes an increasingly valuable commodity. This paper explores the supply and demand elements of the "knowledge market" within the International Space Station and Spacecraft Processing Directorate (ISSSPD) of NASA's Kennedy Space Center (KSC). It examines how knowledge supply and knowledge demand determine the success of an organization's knowledge management (KM) activities, and how the elements of a KM infrastructure (tools, culture, and training) can be used to create and sustain knowledge supply and demand.
Comparison of three artificial intelligence techniques for discharge routing
NASA Astrophysics Data System (ADS)
Khatibi, Rahman; Ghorbani, Mohammad Ali; Kashani, Mahsa Hasanpour; Kisi, Ozgur
2011-06-01
An inter-comparison of three artificial intelligence (AI) techniques is presented using river flow/stage time series that are otherwise handled by traditional discharge routing techniques. The models comprise an Artificial Neural Network (ANN), an Adaptive Neuro-Fuzzy Inference System (ANFIS), and Genetic Programming (GP), applied to discharge routing of the Kizilirmak River, Turkey. Daily mean river discharge data from 1999 to 2003 were used for training and testing the models. The comparison includes both visual and parametric approaches, using such statistics as the Coefficient of Correlation (CC), Mean Absolute Error (MAE), and Mean Square Relative Error (MSRE), as well as a basic scoring system. Overall, the results indicate that ANN and ANFIS have mixed fortunes in discharge routing, with differing abilities to capture and reproduce some of the observed information, whereas GP outperforms the other two modelling approaches in most respects. Attention is given to the information content of the recorded time series in terms of peak values and timings, where one performance measure may capture some of the information content but be ineffective for others. This makes a case for compiling a knowledge base for various modelling techniques.
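The three parametric statistics named in the abstract can be sketched as follows. This is an illustrative implementation under common textbook definitions (MSRE here normalizes each error by the observed value), not the authors' code:

```python
import math

def routing_metrics(observed, simulated):
    """Compare observed and simulated discharge series using the three
    statistics named in the study: Coefficient of Correlation (CC),
    Mean Absolute Error (MAE), and Mean Square Relative Error (MSRE)."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    var_o = sum((o - mo) ** 2 for o in observed)
    var_s = sum((s - ms) ** 2 for s in simulated)
    cc = cov / math.sqrt(var_o * var_s)          # linear correlation
    mae = sum(abs(o - s) for o, s in zip(observed, simulated)) / n
    msre = sum(((o - s) / o) ** 2                # relative error, squared
               for o, s in zip(observed, simulated)) / n
    return cc, mae, msre
```

As the abstract notes, each measure weights the record differently: MSRE emphasizes relative errors at low flows, while MAE and CC are dominated by absolute errors around peaks, which is why a model can score well on one and poorly on another.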
Brown, Jessi L.; Bedrosian, Bryan; Bell, Douglas A.; Braham, Melissa A.; Cooper, Jeff; Crandall, Ross H.; DiDonato, Joe; Domenech, Robert; Duerr, Adam E.; Katzner, Todd; Lanzone, Michael J.; LaPlante, David W.; McIntyre, Carol L.; Miller, Tricia A.; Murphy, Robert K.; Shreading, Adam; Slater, Steven J.; Smith, Jeff P.; Smith, Brian W.; Watson, James W.; Woodbridge, Brian
2017-01-01
Conserving wide-ranging animals requires knowledge about their year-round movements and resource use. Golden Eagles (Aquila chrysaetos) exhibit a wide range of movement patterns across North America. We combined tracking data from 571 Golden Eagles from multiple independent satellite-telemetry projects from North America to provide a comprehensive look at the magnitude and extent of these movements on a continental scale. We compared patterns of use relative to four alternative administrative and ecological mapping systems, namely Bird Conservation Regions (BCRs), U.S. administrative migratory bird flyways, Migratory Bird Joint Ventures, and Landscape Conservation Cooperatives. Our analyses suggested that eagles initially captured in eastern North America used space differently than those captured in western North America. Other groups of eagles that exhibited distinct patterns in space use included long-distance migrants from northern latitudes, and southwestern and Californian desert residents. There were also several groupings of eagles in the Intermountain West. Using this collaborative approach, we have identified large-scale movement patterns that may not have been possible with individual studies. These results will support landscape-scale conservation measures for Golden Eagles across North America.
NASA Astrophysics Data System (ADS)
Martin, Deloris
Purpose. The purpose of this study was to describe the existing knowledge transfer practices in selected aerospace companies as perceived by highly experienced engineers retiring from the company. Specifically, it was designed to investigate and describe (a) the processes and procedures used to transfer knowledge, (b) the systems that encourage knowledge transfer, (c) the impact of management actions on knowledge transfer, and (d) constraining factors that might impede knowledge transfer. Methodology. A descriptive case study methodology was applied. Qualitative data were gathered from highly experienced engineers from 3 large aerospace companies in Southern California. A semistructured interview was conducted face-to-face with each participant in a private or semiprivate, non-workplace setting to obtain each engineer's perspectives on his or her company's current knowledge transfer practices. Findings. The participants in this study preferred to transfer knowledge face-to-face, one-on-one, through actual troubleshooting and problem-solving scenarios. Managers in these aerospace companies were observed to treat knowledge transfer as a low priority and tended not to promote it among their employees. While mentoring is the most common knowledge transfer system these companies offer, it is not the preferred method of knowledge transfer among the highly experienced engineers. Job security and schedule pressures are the top constraints that impede knowledge transfer between the highly experienced engineers and their coworkers. Conclusions. The study data support the conclusion that the highly experienced engineers in the study's aerospace companies would be more likely to transfer their knowledge to those remaining in the industry if the transfer could occur face-to-face, with management support and acknowledgement of their expertise, and if their job security were not threatened.
The study also supports the conclusion that managers should be responsible for the leadership in developing a knowledge-sharing culture and rewarding those who do share. Recommendations. It is recommended that a quantitative study of highly experienced engineers in aerospace be conducted to determine the degree to which knowledge-sharing methods, processes, and procedures may be effective in capturing their knowledge. It is also recommended that a replication of this study be undertaken to include the perspectives of first-line managers on developing a knowledge-sharing culture for the aerospace industry.
NASA's SDO Satellite Captures Venus Transit Approach -- Bigger, Better!
2017-12-08
NASA image captured June 5, 2012. On June 5-6 2012, SDO is collecting images of one of the rarest predictable solar events: the transit of Venus across the face of the sun. This event happens in pairs eight years apart that are separated from each other by 105 or 121 years. The last transit was in 2004 and the next will not happen until 2117. Credit: NASA/SDO, AIA To read more about the 2012 Venus Transit go to: sunearthday.nasa.gov/transitofvenus Add your photos of the Transit of Venus to our Flickr Group here: www.flickr.com/groups/venustransit/ NASA image use policy. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission. Follow us on Twitter Like us on Facebook Find us on Instagram
NASA's SDO Satellite Captures 2012 Venus Transit [Close-Up]
2017-12-08
NASA image captured June 5, 2012. On June 5-6 2012, SDO is collecting images of one of the rarest predictable solar events: the transit of Venus across the face of the sun. This event happens in pairs eight years apart that are separated from each other by 105 or 121 years. The last transit was in 2004 and the next will not happen until 2117. Credit: NASA/SDO, HMI To read more about the 2012 Venus Transit go to: sunearthday.nasa.gov/transitofvenus Add your photos of the Transit of Venus to our Flickr Group here: www.flickr.com/groups/venustransit/
In defense of abstract conceptual representations.
Binder, Jeffrey R
2016-08-01
An extensive program of research in the past 2 decades has focused on the role of modal sensory, motor, and affective brain systems in storing and retrieving concept knowledge. This focus has led in some circles to an underestimation of the need for more abstract, supramodal conceptual representations in semantic cognition. Evidence for supramodal processing comes from neuroimaging work documenting a large, well-defined cortical network that responds to meaningful stimuli regardless of modal content. The nodes in this network correspond to high-level "convergence zones" that receive broadly crossmodal input and presumably process crossmodal conjunctions. It is proposed that highly conjunctive representations are needed for several critical functions, including capturing conceptual similarity structure, enabling thematic associative relationships independent of conceptual similarity, and providing efficient "chunking" of concept representations for a range of higher order tasks that require concepts to be configured as situations. These hypothesized functions account for a wide range of neuroimaging results showing modulation of the supramodal convergence zone network by associative strength, lexicality, familiarity, imageability, frequency, and semantic compositionality. The evidence supports a hierarchical model of knowledge representation in which modal systems provide a mechanism for concept acquisition and serve to ground individual concepts in external reality, whereas broadly conjunctive, supramodal representations play an equally important role in concept association and situation knowledge.
ERIC Educational Resources Information Center
Galagan, Patricia A.
1997-01-01
Capturing and leveraging knowledge is an important new management trend that is as yet undefined. Some companies are accounting for their intellectual capital and applying it to the company balance sheets. (JOW)
Virtual Neurorobotics (VNR) to Accelerate Development of Plausible Neuromorphic Brain Architectures.
Goodman, Philip H; Buntha, Sermsak; Zou, Quan; Dascalu, Sergiu-Mihai
2007-01-01
Traditional research in artificial intelligence and machine learning has viewed the brain as a specially adapted information-processing system. More recently the field of social robotics has been advanced to capture the important dynamics of human cognition and interaction. An overarching societal goal of this research is to incorporate the resultant knowledge about intelligence into technology for prosthetic, assistive, security, and decision support applications. However, despite many decades of investment in learning and classification systems, this paradigm has yet to yield truly "intelligent" systems. For this reason, many investigators are now attempting to incorporate more realistic neuromorphic properties into machine learning systems, encouraged by over two decades of neuroscience research that has provided parameters that characterize the brain's interdependent genomic, proteomic, metabolomic, anatomic, and electrophysiological networks. Given the complexity of neural systems, developing tenable models to capture the essence of natural intelligence for real-time application requires that we discriminate features underlying information processing and intrinsic motivation from those reflecting biological constraints (such as maintaining structural integrity and transporting metabolic products). We propose herein a conceptual framework and an iterative method of virtual neurorobotics (VNR) intended to rapidly forward-engineer and test progressively more complex putative neuromorphic brain prototypes for their ability to support intrinsically intelligent, intentional interaction with humans. The VNR system is based on the viewpoint that a truly intelligent system must be driven by emotion rather than programmed tasking, incorporating intrinsic motivation and intentionality. We report pilot results of a closed-loop, real-time interactive VNR system with a spiking neural brain, and provide a video demonstration as online supplemental material.
U.S. Spacesuit Knowledge Capture Status and Initiatives in Fiscal Year 2014
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2015-01-01
Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2014
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2015-01-01
Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.
Systems Analysis of Physical Absorption of CO2 in Ionic Liquids for Pre-Combustion Carbon Capture.
Zhai, Haibo; Rubin, Edward S
2018-04-17
This study develops an integrated technical and economic modeling framework to investigate the feasibility of ionic liquids (ILs) for precombustion carbon capture. The IL 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide is modeled as a potential physical solvent for CO2 capture at integrated gasification combined cycle (IGCC) power plants. The analysis reveals that the energy penalty of the IL-based capture system comes mainly from compression of the process and product streams and from solvent pumping, while the major capital cost components are the compressors and absorbers. On the basis of the plant-level analysis, the cost of CO2 avoided by the IL-based capture and storage system is estimated to be $63 per tonne of CO2. Technical and economic comparisons between IL- and Selexol-based capture systems at the plant level show that an IL-based system could be a feasible option for CO2 capture. Improving the CO2 solubility of ILs can simplify the capture process configuration and lower the process energy and cost penalties, further enhancing the viability of this technology.
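The "$63 per tonne of CO2 avoided" figure follows the standard cost-of-CO2-avoided metric, which a short sketch can make concrete. The numbers below are hypothetical placeholders for illustration, not values from the study:

```python
def cost_of_co2_avoided(lcoe_capture, lcoe_ref, em_ref, em_capture):
    """Standard cost-of-CO2-avoided metric ($/tonne): the increase in
    levelized cost of electricity ($/MWh) divided by the reduction in
    CO2 emitted per MWh (tonnes/MWh) relative to a reference plant
    without capture."""
    return (lcoe_capture - lcoe_ref) / (em_ref - em_capture)

# Hypothetical IGCC numbers for illustration only (not from the study)
cost = cost_of_co2_avoided(lcoe_capture=120.0, lcoe_ref=95.0,
                           em_ref=0.80, em_capture=0.10)
```

Because the denominator is emissions avoided rather than emissions captured, this metric charges the capture system for its own energy penalty, which is why it exceeds the simpler cost per tonne captured.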
Hunter, Lawrence; Lu, Zhiyong; Firby, James; Baumgartner, William A; Johnson, Helen L; Ogren, Philip V; Cohen, K Bretonnel
2008-01-01
Background Information extraction (IE) efforts are widely acknowledged to be important in harnessing the rapid advance of biomedical knowledge, particularly in areas where important factual information is published in a diverse literature. Here we report on the design, implementation and several evaluations of OpenDMAP, an ontology-driven, integrated concept analysis system. It significantly advances the state of the art in information extraction by leveraging knowledge in ontological resources, integrating diverse text processing applications, and using an expanded pattern language that allows the mixing of syntactic and semantic elements and variable ordering. Results OpenDMAP information extraction systems were produced for extracting protein transport assertions (transport), protein-protein interaction assertions (interaction) and assertions that a gene is expressed in a cell type (expression). Evaluations were performed on each system, resulting in F-scores ranging from .26 – .72 (precision .39 – .85, recall .16 – .85). Additionally, each of these systems was run over all abstracts in MEDLINE, producing a total of 72,460 transport instances, 265,795 interaction instances and 176,153 expression instances. Conclusion OpenDMAP advances the performance standards for extracting protein-protein interaction predications from the full texts of biomedical research articles. Furthermore, this level of performance appears to generalize to other information extraction tasks, including extracting information about predicates of more than two arguments. The output of the information extraction system is always constructed from elements of an ontology, ensuring that the knowledge representation is grounded with respect to a carefully constructed model of reality. The results of these efforts can be used to increase the efficiency of manual curation efforts and to provide additional features in systems that integrate multiple sources for information extraction. 
The open source OpenDMAP code library is freely available. PMID:18237434
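The reported F-scores relate to the reported precision and recall ranges through the usual balanced F-measure; a minimal sketch:

```python
def f_score(precision, recall):
    """Balanced F-measure (F1): the harmonic mean of precision and
    recall, the summary metric reported for the OpenDMAP systems."""
    return 2 * precision * recall / (precision + recall)

# Endpoints of the reported ranges (precision .39-.85, recall .16-.85)
low = f_score(0.39, 0.16)
high = f_score(0.85, 0.85)
```

Because the harmonic mean is pulled toward the smaller of the two inputs, a system with recall of .16 scores a low F even with moderate precision, consistent with the bottom of the reported .26 to .72 range.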
Overview of NRC Proactive Management of Materials Degradation (PMMD) Program
NASA Astrophysics Data System (ADS)
Carpenter, C. E. Gene; Hull, Amy; Oberson, Greg
Materials degradation phenomena, if not appropriately managed, have the potential to adversely impact the design functionality and safety margins of nuclear power plant (NPP) systems, structures, and components (SSCs). The U.S. Nuclear Regulatory Commission (NRC) has therefore initiated a multi-year, over-the-horizon research program, Proactive Management of Materials Degradation (PMMD), which is presently evaluating longer time frames (i.e., 80 or more years) and including passive long-lived SSCs beyond the primary piping and core internals, such as concrete containment and cable insulation. This will allow the NRC to (1) identify significant knowledge gaps and new forms of degradation; (2) capture the current knowledge base; and (3) prioritize materials degradation research needs and directions for future efforts. This effort is being accomplished in collaboration with the U.S. Department of Energy's (DOE) LWR Sustainability (LWRS) program. This presentation will discuss the activities to date, including results, and the path forward.
Theoretical Commitment and Implicit Knowledge: Why Anomalies do not Trigger Learning
NASA Astrophysics Data System (ADS)
Ohlsson, Stellan
A theory consists of a mental model, laws that specify parameters of the model and one or more explanatory schemas. Models represent by being isomorphic to real systems. To explain an event is to reenact its genesis by executing the relevant model in the mind's eye. Schemas capture recurring structural features of explanations. To subscribe to a theory is to be committed to explaining a particular class of events with that theory (and nothing else). Given theoretical commitment, an anomaly, i.e., an event that cannot be explained, is an occasion for theory change, but in the absence of commitment, the response is instead to exclude the anomalous event from the domain of application of the theory. Lay people and children hold their theories implicitly and hence without commitment. These observations imply that the analogy between scientist's theories and children's knowledge is valid, but that the analogy between theory change and learning is not.
A Bayesian Model of the Memory Colour Effect.
Witzel, Christoph; Olkkonen, Maria; Gegenfurtner, Karl R
2018-01-01
According to the memory colour effect, the colour of a colour-diagnostic object is not perceived independently of the object itself. Instead, it has been shown through an achromatic adjustment method that colour-diagnostic objects still appear slightly in their typical colour, even when they are colourimetrically grey. Bayesian models provide a promising approach to capture the effect of prior knowledge on colour perception and to link these effects to more general effects of cue integration. Here, we model memory colour effects using prior knowledge about typical colours as priors for the grey adjustments in a Bayesian model. This simple model does not involve any fitting of free parameters. The Bayesian model roughly captured the magnitude of the measured memory colour effect for photographs of objects. To some extent, the model predicted observed differences in memory colour effects across objects. The model could not account for the differences in memory colour effects across different levels of realism in the object images. The Bayesian model provides a particularly simple account of memory colour effects, capturing some of the multiple sources of variation of these effects.
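A minimal sketch of the kind of parameter-free Gaussian cue integration the abstract describes: the perceived chromaticity is a reliability-weighted average of the sensory (grey) estimate and the memory-colour prior, with weights given by inverse variances. The specific numbers below are hypothetical, not taken from the study:

```python
def bayes_combine(sensory_mean, sensory_var, prior_mean, prior_var):
    """Gaussian cue integration: the posterior mean over perceived
    colour is a reliability-weighted average of the sensory estimate
    and the memory-colour prior. Weights are inverse variances, so no
    free parameters are fitted once both distributions are measured."""
    w_sensory = 1.0 / sensory_var
    w_prior = 1.0 / prior_var
    post_mean = (w_sensory * sensory_mean + w_prior * prior_mean) / (w_sensory + w_prior)
    post_var = 1.0 / (w_sensory + w_prior)
    return post_mean, post_var

# A colourimetrically grey object (chromaticity 0) pulled slightly
# toward its typical colour (+1), as in the memory colour effect
mean, var = bayes_combine(sensory_mean=0.0, sensory_var=0.04,
                          prior_mean=1.0, prior_var=0.36)
```

The small shift of the posterior mean away from grey toward the typical colour is exactly the memory colour effect the achromatic adjustment method measures; a broader (less reliable) prior would predict a smaller shift.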
A Bayesian Model of the Memory Colour Effect
Olkkonen, Maria; Gegenfurtner, Karl R.
2018-01-01
According to the memory colour effect, the colour of a colour-diagnostic object is not perceived independently of the object itself. Instead, it has been shown through an achromatic adjustment method that colour-diagnostic objects still appear slightly in their typical colour, even when they are colourimetrically grey. Bayesian models provide a promising approach to capture the effect of prior knowledge on colour perception and to link these effects to more general effects of cue integration. Here, we model memory colour effects using prior knowledge about typical colours as priors for the grey adjustments in a Bayesian model. This simple model does not involve any fitting of free parameters. The Bayesian model roughly captured the magnitude of the measured memory colour effect for photographs of objects. To some extent, the model predicted observed differences in memory colour effects across objects. The model could not account for the differences in memory colour effects across different levels of realism in the object images. The Bayesian model provides a particularly simple account of memory colour effects, capturing some of the multiple sources of variation of these effects. PMID:29760874
Unsettled weather across central Australia
2017-12-08
In late July 2013, a low pressure system off Australia’s southeast coast and moist onshore winds combined to create unsettled weather across central Australia – and a striking image of a broad cloud band across the stark winter landscape. The Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA’s Terra satellite captured this true-color image on July 22 at 01:05 UTC (10:35 a.m. Australian Central Standard Time). To the west of the low pressure trough the skies are clear and dry. To the east, the broad band of bright white clouds obscures the landscape. The system brought wind, precipitation and cooler temperatures to the region. The same day as MODIS captured this image, the Naval Research Lab (NRL) published an edition of the Global Storm Tracker (GST), which gave a world-wide view of the low-pressure systems across the world. This tracker shows the entire cloud band across Australia, as well as the location of the low pressure system. A good view of the Storm Tracker is provided by Red Orbit at: www.redorbit.com/media/uploads/2013/07/072213-weather-003... Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team
Heldenbrant, David J; Koech, Phillip K; Rainbolt, James E; Bearden, Mark D; Zheng, Feng
2014-02-18
A system and process are disclosed for selective removal and recovery of H2S from a gaseous volume, e.g., from natural gas. Anhydrous organic sorbents chemically capture H2S gas to form hydrosulfide salts. Regeneration of the capture solvent involves addition of an anti-solvent that releases the captured H2S gas from the capture sorbent. The capture sorbent and anti-solvent are reactivated for reuse, e.g., by simple distillation.
A four stage approach for ontology-based health information system design.
Kuziemsky, Craig E; Lau, Francis
2010-11-01
To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.
Garrison, Gina Daubney; Baia, Patricia; Canning, Jacquelyn E; Strang, Aimee F
2015-03-25
To describe the shift to an asynchronous online approach for pedagogy instruction within a pharmacy resident teaching program offered by a dual-campus college. The pedagogy instruction component of the teaching program (Part I) was redesigned with a focus on the content, delivery, and coordination of the learning environment. Asynchronous online learning replaced distance technology or lecture capture. Using a pedagogical content knowledge framework, residents participated in self-paced online learning using faculty recordings, readings, and discussion board activities. A learning management system was used to assess achievement of learning objectives and participation prior to progressing to the teaching experiences component of the teaching program (Part II). Evaluation of resident pedagogical knowledge development and participation in Part I of the teaching program was achieved through the learning management system. Participant surveys and written reflections showed general satisfaction with the online learning environment. Future considerations include addition of a live orientation session and increased faculty presence in the online learning environment. An online approach framed by educational theory can be an effective way to provide pedagogy instruction within a teaching program.
40 CFR 63.4361 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2010 CFR
2010-07-01
40 CFR 63.4361 (Protection of Environment, Environmental Protection Agency): How do I determine the emission capture system efficiency? You must use the procedures and test methods in this section to… …from the web coating/printing operation surfaces they are applied to occurs within the capture system…
An integrated decision support system for wastewater nutrient recovery and recycling to agriculture
NASA Astrophysics Data System (ADS)
Roy, E. D.; Bomeisl, L.; Cornbrooks, P.; Mo, W.
2017-12-01
Nutrient recovery and recycling has become a key research topic within the wastewater engineering and nutrient management communities. Several technologies now exist that can effectively capture nutrients from wastewater, and innovation in this area continues to be an important research pursuit. However, practical nutrient recycling solutions require more than capable nutrient capture technologies. We also need to understand the role that wastewater nutrient recovery and recycling can play within broader nutrient management schemes at the landscape level, including important interactions at the nexus of food, energy, and water. We are developing an integrated decision support system that combines wastewater treatment data, agricultural data, spatial nutrient balance modeling, life cycle assessment, stakeholder knowledge, and multi-criteria decision making. Our goals are to: (1) help guide design decisions related to the implementation of sustainable nutrient recovery technology, (2) support innovations in watershed nutrient management that operate at the interface of the built environment and agriculture, and (3) aid efforts to protect aquatic ecosystems while supporting human welfare in a circular nutrient economy. These goals will be realized partly through the assessment of plausible alternative scenarios for the future. In this presentation, we will describe the tool and focus on nutrient balance results for the New England region. These results illustrate that both centralized and decentralized wastewater nutrient recovery schemes have potential to transform nutrient flows in many New England watersheds, diverting wastewater N and P away from aquatic ecosystems and toward local or regional agricultural soils where they can offset a substantial percentage of imported fertilizer. We will also highlight feasibility criteria and next steps to integrate stakeholder knowledge, economics, and life cycle assessment into the tool.
Entanglement and thermodynamics after a quantum quench in integrable systems.
Alba, Vincenzo; Calabrese, Pasquale
2017-07-25
Entanglement and entropy are key concepts standing at the foundations of quantum and statistical mechanics. Recently, the study of quantum quenches revealed that these concepts are intricately intertwined. Although the unitary time evolution ensuing from a pure state maintains the system at zero entropy, local properties at long times are captured by a statistical ensemble with nonzero thermodynamic entropy, which is the entanglement accumulated during the dynamics. Therefore, understanding the entanglement evolution unveils how thermodynamics emerges in isolated systems. Alas, an exact computation of the entanglement dynamics was available so far only for noninteracting systems, whereas it was deemed unfeasible for interacting ones. Here, we show that the standard quasiparticle picture of the entanglement evolution, complemented with integrability-based knowledge of the steady state and its excitations, leads to a complete understanding of the entanglement dynamics in the space-time scaling limit. We thoroughly check our result for the paradigmatic Heisenberg chain.
Ness, Seth L; Manyakov, Nikolay V; Bangerter, Abigail; Lewin, David; Jagannatha, Shyla; Boice, Matthew; Skalkin, Andrew; Dawson, Geraldine; Janvier, Yvette M; Goodwin, Matthew S; Hendren, Robert; Leventhal, Bennett; Shic, Frederick; Cioccia, Walter; Pandina, Gahan
2017-01-01
Objective: To test usability and optimize the Janssen Autism Knowledge Engine (JAKE®) system's components, biosensors, and procedures used for objective measurement of core and associated symptoms of autism spectrum disorder (ASD) in clinical trials. Methods: A prospective, observational study of 29 children and adolescents with ASD using the JAKE system was conducted at three sites in the United States. This study was designed to establish the feasibility of the JAKE system and to learn practical aspects of its implementation. In addition to information collected by web and mobile components, wearable biosensor data were collected both continuously in natural settings and periodically during a battery of experimental tasks administered in laboratory settings. This study is registered at clinicaltrials.gov, NCT02299700. Results: Feedback collected throughout the study allowed future refinements to be planned for all components of the system. The Autism Behavior Inventory (ABI), a parent-reported measure of ASD core and associated symptoms, performed well. Among biosensors studied, the eye-tracker, sleep monitor, and electrocardiogram were shown to capture high quality data, whereas wireless electroencephalography was difficult to use due to its form factor. On an exit survey, the majority of parents rated their overall reaction to JAKE as positive/very positive. No significant device-related events were reported in the study. Conclusion: The results of this study, with the described changes, demonstrate that the JAKE system is a viable, useful, and safe platform for use in clinical trials of ASD, justifying larger validation and deployment studies of the optimized system.
ERIC Educational Resources Information Center
Govender, Nadaraj
2015-01-01
This case study explored the development of two pre-service teachers' subject matter knowledge (SMK) of electromagnetism while integrating the use of concept maps (CM) and collaborative learning (CL) strategies. The study aimed at capturing how these pre-service teachers' SMK in electromagnetism was enhanced after having been taught SMK in a…
ERIC Educational Resources Information Center
Erickson, Lisa B.
2013-01-01
In today's connected world, the reach of the Internet and collaborative social media tools have opened up new opportunities for individuals, regardless of their location, to share their knowledge, expertise, and creativity with others. These tools have also opened up opportunities for organizations to connect with new sources of innovation to…
Teaching Knowledge Management by Combining Wikis and Screen Capture Videos
ERIC Educational Resources Information Center
Makkonen, Pekka; Siakas, Kerstin; Vaidya, Shakespeare
2011-01-01
Purpose: This paper aims to report on the design and creation of a knowledge management course aimed at facilitating student creation and use of social interactive learning tools for enhanced learning. Design/methodology/approach: The era of social media and web 2.0 has enabled a bottom-up collaborative approach and new ways to publish work on the…
OER (Re)Use and Language Teachers' Tacit Professional Knowledge: Three Vignettes
ERIC Educational Resources Information Center
Beaven, Tita
2015-01-01
The pedagogic practical knowledge that teachers use in their lessons is very difficult to make visible and often remains tacit. This chapter draws on data from a recent study and closely analyses a number of Open Educational Resources used by three language teachers at the UK Open University in order to try to capture how their use of the…
Developing an Advanced Environment for Collaborative Computing
NASA Technical Reports Server (NTRS)
Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; Knight, Chris
1999-01-01
Knowledge management in general tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in research and development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Collaborative computing therefore provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).
The Power of Story: Dressing Up the Naked Truth
NASA Technical Reports Server (NTRS)
Simmons, Annette
2004-01-01
ASK Magazine is not alone when it comes to using storytelling to capture lessons learned and share knowledge. Several other practitioners have successfully introduced this approach to knowledge management within organizations. This article by Annette Simmons marks the first in a series by authors whose work on storytelling has been widely recognized. We hope these features illuminate why ASK contributors use the story form to share their knowledge, and how you can do the same. Annette Simmons spoke at the February 2002 APPL Masters Forum.
Garling, A C
1994-01-01
Most of us can remember the crowning sense of elegance we occasionally felt when we solved a very difficult geometry problem. We linked the proof to the postulates. It was almost like calling on history or the elders to stand silently with us in the flurry of our moment. We "worked truth." I'd like to capture a little of that same "working truth" and apply it in a very unlikely spot: information systems and information technology. It is time to go back and look at the basic postulates of knowledge and responsibility and truthfully apply them in the health care interchange between doctor and patient and make sure that our systems add to and even create an elegance so that the basic relationship of physician and patient in healing can flourish.
Multi-Agent Strategic Modeling in a Specific Environment
NASA Astrophysics Data System (ADS)
Gams, Matjaz; Bezek, Andraz
Multi-agent modeling in ambient intelligence (AmI) is concerned with the following task [19]: How can external observations of multi-agent systems in the ambient environment be used to analyze, model, and direct agent behavior? The main purpose is to obtain knowledge about activities in the environment, thus enabling proper actions by the AmI system [1]. Analysis of such systems must therefore capture complex world-state representations and asynchronous agent activities. Instead of studying basic numerical data, researchers often use more complex data structures, such as rules and decision trees. Some methods are extremely useful for characterizing the state space, but lack the ability to clearly represent the temporal state changes caused by agent actions. To comprehend simultaneous agent actions and complex changes of the state space, a combination of graphical and symbolic representation most often performs better in terms of human understanding and performance.
Privacy-preserving screen capture: towards closing the loop for health IT usability.
Cooley, Joseph; Smith, Sean
2013-08-01
As information technology permeates healthcare (particularly provider-facing systems), maximizing system effectiveness requires the ability to document and analyze tricky or troublesome usage scenarios. However, real-world health IT systems are typically replete with privacy-sensitive data regarding patients, diagnoses, clinicians, and EMR user interface details; instrumentation for screen capture (capturing and recording the scenario depicted on the screen) needs to respect these privacy constraints. Furthermore, real-world health IT systems are typically composed of modules from many sources, mission-critical and often closed-source; any instrumentation for screen capture can rely neither on access to structured output nor access to software internals. In this paper, we present a tool to help solve this problem: a system that combines keyboard video mouse (KVM) capture with automatic text redaction (and interactively selectable unredaction) to produce precise technical content that can enrich stakeholder communications and improve end-user influence on system evolution. KVM-based capture makes our system both application-independent and OS-independent because it eliminates software-interface dependencies on capture targets. Using a corpus of EMR screenshots, we present empirical measurements of redaction effectiveness and processing latency to demonstrate system performances. We discuss how these techniques can translate into instrumentation systems that improve real-world health IT deployments. Copyright © 2013 Elsevier Inc. All rights reserved.
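The redact-by-default idea in the abstract above can be sketched in a few lines: given text regions detected in a captured frame (e.g., by OCR), black out every region unless the user has interactively whitelisted it. The frame representation, detector output, and whitelist below are illustrative stand-ins, not the paper's actual components.

```python
def redact(frame, text_boxes, whitelist=()):
    """Black out all detected text boxes except whitelisted ones.

    frame: 2D list of grayscale pixel values.
    text_boxes: list of (row0, col0, row1, col1) regions, end-exclusive.
    whitelist: indices of boxes the user has interactively "unredacted".
    """
    for i, (row0, col0, row1, col1) in enumerate(text_boxes):
        if i in whitelist:          # user chose to reveal this region
            continue
        for r in range(row0, row1):
            for c in range(col0, col1):
                frame[r][c] = 0     # 0 = black pixel
    return frame

frame = [[255] * 8 for _ in range(4)]    # tiny all-white "screen"
boxes = [(0, 0, 2, 4), (2, 4, 4, 8)]     # two detected text regions
redact(frame, boxes, whitelist={1})      # keep the second box visible
print(frame[0][:4])  # [0, 0, 0, 0] -- first region blacked out
```

The real system works at the KVM level, so it sees only raw pixels; text regions must come from image analysis rather than structured application output, which is exactly why redaction defaults to opaque.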
From Big Data to Knowledge in the Social Sciences
Hesse, Bradford W.; Moser, Richard P.; Riley, William T.
2015-01-01
One of the challenges associated with high-volume, diverse datasets is whether synthesis of open data streams can translate into actionable knowledge. Recognizing that challenge and other issues related to these types of data, the National Institutes of Health developed the Big Data to Knowledge or BD2K initiative. The concept of translating “big data to knowledge” is important to the social and behavioral sciences in several respects. First, a general shift to data-intensive science will exert an influence on all scientific disciplines, but particularly on the behavioral and social sciences given the wealth of behavior and related constructs captured by big data sources. Second, science is itself a social enterprise; by applying principles from the social sciences to the conduct of research, it should be possible to ameliorate some of the systemic problems that plague the scientific enterprise in the age of big data. We explore the feasibility of recalibrating the basic mechanisms of the scientific enterprise so that they are more transparent and cumulative; more integrative and cohesive; and more rapid, relevant, and responsive. PMID:26294799
NASA Astrophysics Data System (ADS)
Ceder, Gerbrand
2007-03-01
The prediction of structure is a key problem in computational materials science that forms the platform on which rational materials design can be performed. Finding structure by traditional optimization methods on quantum mechanical energy models is not possible due to the complexity and high dimensionality of the coordinate space. An unusual but efficient solution to this problem can be obtained by merging ideas from heuristic and ab initio methods: in the same way that scientists build empirical rules by observing experimental trends, we have developed machine learning approaches that extract knowledge from a large set of experimental information and a database of over 15,000 first-principles computations, and used these to rapidly direct accurate quantum mechanical techniques to the lowest-energy crystal structure of a material. Knowledge is captured in a Bayesian probability network that relates the probability of finding a particular crystal structure at a given composition to structure and energy information at other compositions. We show that this approach is highly efficient in finding the ground states of binary metallic alloys and can be easily generalized to more complex systems.
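At its core, the Bayesian ranking described above combines a prior over candidate structures (from observed trends) with evidence from related compositions, then sends the most probable candidates to expensive quantum mechanical verification first. The toy sketch below illustrates only that ranking step; the structure names and probabilities are invented for illustration, not taken from the paper's database.

```python
def posterior(prior, likelihood):
    """Normalize prior * likelihood over candidate crystal structures."""
    joint = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(joint.values())
    return {s: p / z for s, p in joint.items()}

# Illustrative numbers: prior from broad empirical trends, likelihood
# from structures observed at neighboring compositions.
prior = {"fcc": 0.5, "bcc": 0.3, "hcp": 0.2}
likelihood = {"fcc": 0.2, "bcc": 0.7, "hcp": 0.1}

post = posterior(prior, likelihood)
best = max(post, key=post.get)
print(best)   # the candidate to verify first with DFT -> bcc
```

The payoff is ordering: rather than exhaustively relaxing every candidate with DFT, the network concentrates the expensive calculations on the few structures the data make probable.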
Transfer and capture into distant retrograde orbits
NASA Astrophysics Data System (ADS)
Scott, Christopher J.
This dissertation utilizes theory and techniques derived from the fields of dynamical systems theory, astrodynamics, celestial mechanics, and fluid mechanics to analyze the phenomenon of satellite capture and interrelated spacecraft transfers in restricted three-body systems. The results extend current knowledge and understanding of capture dynamics in the context of astrodynamics and celestial mechanics. Manifold theory, fast Lyapunov indicator maps, and the classification of space structure facilitate an analysis of the transport of objects from the chaotic reaches of the solar system to the distant retrograde region in the sun-Jupiter system. Unlike past studies, this dissertation considers the role of the complex lobe structure encompassing stable regions in the circular restricted three-body problem. These structures are shown to be responsible for the phenomenon of sticky orbits and the transport of objects among stable regions. Since permanent capture can only be achieved through a change in energy, fast Lyapunov indicator maps and other methods which reveal the structure of the conservative system are used to discern capture regions and identify the underpinnings of the dynamics. Fast Lyapunov indicator maps provide an accurate classification of orbits of permanent capture and escape, yet monopolize computational resources. In anticipation of a fully three-dimensional analysis in the dissipative system, a new mapping parameter is introduced based on energy degradation and averaged velocity. Although the study specifically addresses the sun-Jupiter system, the qualitative results and devised techniques can be applied throughout the solar system and to capture about extrasolar planets. Extending the analysis beyond the exterior of the stable distant retrograde region fosters the construction of transfer orbits from low-Earth orbit to a stable periodic orbit at the center of the stable distant retrograde region.
Key to this analysis is the predictability of collision orbits within the highly chaotic region commonly recognized as a saddle point on the energy manifold. The pragmatic techniques derived from this analysis solve a number of complications apparent in the literature. Notably, a reliable methodology for the construction of an arbitrary number of transfer orbits circumvents the requirement of computing specialized periodic orbits or extensive numerical sampling of the phase space. The procedure provides a complete description of the design space, accessing a wide range of distant retrograde orbit sizes, insertion points, and parking orbit altitudes in an automated manner. The transfers are studied in a similar fashion to periodic orbits, unveiling the intimate relationship among design parameters and phase space structure. An arbitrary number of Earth-return periodic orbits can be generated as a by-product. These orbits may be useful for spacecraft that must make a number of passes near the second primary without a reduction in energy. Further analysis of the lobe dynamics and a modification of the transfers to the center of the stable region yields sets of single-impulse transfers to sticky distant retrograde orbits. It is shown that the evolution of the phase space structures with energy corresponds to the variation of capture time and target size. The capture phenomenon is related to the stability characteristics of the unstable periodic orbit and the geometry of the corresponding homoclinic tangle at various energies. Future spacecraft with little or no propulsive means may take advantage of these natural trajectories for operations in the region. Temporary capture along a sticky orbit may come before incremental stabilization of the spacecraft by way of a series of small impulsive maneuvers or a low continuous-thrust maneuver. The requirements of the small stabilization maneuvers are calculated and compared to a direct transfer to the center of the stable region.
This mission design may be desirable because any failure in the classic set of maneuvers to the center of the stable region could result in the loss of the spacecraft. A simple low-thrust stabilization method is analyzed in a similar manner to nebular drag. It is shown that stabilization maneuvers initiated within the sticky region can be achieved via a simple control law. Moreover, the sticky region can be used as a staging point for both spiral-in and spiral-out maneuvers. For the spiral-in maneuver, this negates the large initial maneuver otherwise required to reach the center of the stable region. It is shown that long stretches of orbits exist within the sticky regions which reliably lead to permanent capture. In the case of spiral-out, the spacecraft is transported to a highly energetic yet stable orbit about the second primary. From here a small maneuver could allow the spacecraft to access other regions of the solar system.
A Systematic Approach for Evaluation of Capture Zones at Pump and Treat Systems
This document describes a systematic approach for performing capture zone analysis associated with ground water pump and treat systems. A “capture zone” refers to the three-dimensional region that contributes the ground water extracted by one or more wells or drains. A capture ...
NASA Technical Reports Server (NTRS)
Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.
1989-01-01
Several techniques to perform static and dynamic load balancing techniques for vision systems are presented. These techniques are novel in the sense that they capture the computational requirements of a task by examining the data when it is produced. Furthermore, they can be applied to many vision systems because many algorithms in different systems are either the same, or have similar computational characteristics. These techniques are evaluated by applying them on a parallel implementation of the algorithms in a motion estimation system on a hypercube multiprocessor system. The motion estimation system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from different time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters. It is shown that the performance gains when these data decomposition and load balancing techniques are used are significant and the overhead of using these techniques is minimal.
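The key idea above, estimating each region's computational load from the data it produces (e.g., feature counts) and then distributing regions across processors, can be sketched with a simple greedy scheme. The counts and two-processor setup below are illustrative, not the paper's hypercube implementation.

```python
def balance(feature_counts, n_procs):
    """Assign image regions to processors so loads are roughly equal.

    feature_counts[r] estimates the work for region r (e.g., number of
    features extracted there). Greedy longest-processing-time rule:
    place the heaviest remaining region on the least-loaded processor.
    """
    loads = [0] * n_procs
    assignment = [[] for _ in range(n_procs)]
    for region in sorted(range(len(feature_counts)),
                         key=lambda r: -feature_counts[r]):
        p = loads.index(min(loads))   # least-loaded processor
        assignment[p].append(region)
        loads[p] += feature_counts[region]
    return assignment, loads

assignment, loads = balance([90, 10, 40, 60, 20], 2)
print(loads)   # [110, 110] -- equal totals for this example
```

Because the estimate comes from the data itself at the point it is produced, the same balancing step can be reapplied between pipeline stages (feature extraction, stereo match, time match), which is what makes the technique portable across vision systems.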
Evaluation of Uniform Cost Accounting System to Fully Capture Depot Level Repair Costs.
1985-12-01
AD-A165 522. Evaluation of Uniform Cost Accounting System to Fully Capture Depot Level Repair Costs. Thesis by David Richmond O'Brien, Naval Postgraduate School, Monterey, CA, December 1985.
Portable pathogen detection system
Colston, Billy W.; Everett, Matthew; Milanovich, Fred P.; Brown, Steve B.; Vendateswaran, Kodumudi; Simon, Jonathan N.
2005-06-14
A portable pathogen detection system that accomplishes on-site multiplex detection of targets in biological samples. The system includes: microbead specific reagents, incubation/mixing chambers, a disposable microbead capture substrate, and an optical measurement and decoding arrangement. The basis of this system is a highly flexible Liquid Array that utilizes optically encoded microbeads as the templates for biological assays. Target biological samples are optically labeled and captured on the microbeads, which are in turn captured on an ordered array or disordered array disposable capture substrate and then optically read.
Informatics — EDRN Public Portal
The EDRN provides a comprehensive informatics activity which includes a number of tools and an integrated knowledge environment for capturing, managing, integrating, and sharing results from across EDRN's cancer biomarker research network.
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
1992-01-01
The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the MEDs associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.
Sustainable Capture: Concepts for Managing Stream-Aquifer Systems.
Davids, Jeffrey C; Mehl, Steffen W
2015-01-01
Most surface water bodies (i.e., streams, lakes, etc.) are connected to the groundwater system to some degree, so that changes to surface water bodies (either diversions or importations) can change flows in aquifer systems, and pumping from an aquifer can reduce discharge to, or induce additional recharge from, streams, springs, and lakes. The timescales of these interactions are often very long (decades), making sustainable management of these systems difficult if relying only on observations of system responses. Instead, management scenarios are often analyzed based on numerical modeling. In this paper we propose a framework and metrics that can be used to relate the Theis concepts of capture to sustainable measures of stream-aquifer systems. We introduce four concepts: Sustainable Capture Fractions, Sustainable Capture Thresholds, Capture Efficiency, and Sustainable Groundwater Storage that can be used as the basis for developing metrics for sustainable management of stream-aquifer systems. We demonstrate their utility on a hypothetical stream-aquifer system where pumping captures both streamflow and discharge to phreatophytes in different amounts depending on pumping location. In particular, Capture Efficiency (CE) can be easily understood by scientists and non-scientists alike, and readily identifies vulnerabilities to sustainable stream-aquifer management when its value exceeds 100%. © 2014, National Ground Water Association.
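The Capture Efficiency metric lends itself to a one-line computation. The sketch below assumes a simple definition consistent with the abstract, total capture (streamflow plus phreatophyte discharge) expressed as a percentage of the pumping rate; the numbers are invented for illustration.

```python
def capture_efficiency(streamflow_capture, phreatophyte_capture, pumping_rate):
    """Capture Efficiency (CE): total capture as a percentage of pumping.

    Values above 100% flag a vulnerability for sustainable
    stream-aquifer management (illustrative definition based on the
    abstract, same units for all three arguments, e.g. m^3/day).
    """
    if pumping_rate <= 0:
        raise ValueError("pumping rate must be positive")
    return 100.0 * (streamflow_capture + phreatophyte_capture) / pumping_rate

# A well pumping 500 m^3/day that captures 350 m^3/day of streamflow
# and 100 m^3/day of phreatophyte discharge:
ce = capture_efficiency(350.0, 100.0, 500.0)
print(f"CE = {ce:.0f}%")  # CE = 90%
```

A CE above 100% would mean the well induces more combined capture than it extracts, exactly the red flag the paper highlights for management.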
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2014-01-01
The NASA U.S. Spacesuit Knowledge Capture (KC) program has existed since the beginning of 2008. The program was designed to provide engineers and other technical team members with historical spacesuit information to add to their understanding of the spacesuit, its evolution, its limitations, and its capabilities. Over 40 seminars have captured spacesuit history and knowledge over the six years of the program's existence. Subject matter experts have provided lectures, and some were interviewed, to help bring the spacesuit to life so that lessons learned will never be lost. The program has also concentrated on reaching out to the public and industry by releasing the recorded events into the public domain through the NASA technical library and YouTube. The U.S. Spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs. Expert engineers and scientists have likewise shared their challenges and successes to be remembered. As evidenced by the thousands of people who have viewed the recordings online, the last few years have been some of the most successful of the KC program's life, with numerous digital recordings and public releases. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. This paper also communicates ways to access the events, both those available internally on the NASA domain and those released to the public.
Samba: a real-time motion capture system using wireless camera sensor networks.
Oh, Hyeongseok; Cha, Geonho; Oh, Songhwai
2014-03-20
There is a growing interest in 3D content following the recent developments in 3D movies, 3D TVs and 3D smartphones. However, 3D content creation is still dominated by professionals, due to the high cost of 3D motion capture instruments. The availability of a low-cost motion capture system will promote 3D content generation by general users and accelerate the growth of the 3D market. In this paper, we describe the design and implementation of a real-time motion capture system based on a portable low-cost wireless camera sensor network. The proposed system performs motion capture based on the data-driven 3D human pose reconstruction method to reduce the computation time and to improve the 3D reconstruction accuracy. The system can reconstruct accurate 3D full-body poses at 16 frames per second using only eight markers on the subject's body. The performance of the motion capture system is evaluated extensively in experiments.
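The data-driven reconstruction step described above can be sketched as a lookup: find the database pose whose marker positions best match the observed markers, then return its stored full-body pose. Real systems like Samba blend several near neighbors rather than picking one, and the tiny database here is purely illustrative.

```python
def nearest_pose(observed_markers, database):
    """Return the full-body pose whose markers best match the observation.

    database: list of (marker_vector, full_body_pose) pairs, where
    marker_vector is a flat list of marker coordinates.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, pose = min(database, key=lambda entry: dist2(entry[0], observed_markers))
    return pose

# Toy database: one marker coordinate vector per stored pose.
db = [
    ([0.0, 0.0, 1.0], "standing"),
    ([0.0, 0.5, 0.5], "sitting"),
]
print(nearest_pose([0.1, 0.4, 0.6], db))  # sitting
```

The appeal of the data-driven approach is that only a few markers (eight in the paper) need to be tracked per frame; the database supplies the rest of the body configuration, which keeps the per-frame computation cheap enough for 16 fps on low-cost hardware.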
NASA Astrophysics Data System (ADS)
Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.
2014-12-01
Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. 
Through the use of standards, this blueprint can be adapted not only to other plate observing systems (e.g., the European Plate Observing System, EPOS); it also supports integrated approaches that include sensor networks providing complementary processes for dynamic monitoring. Moreover, the integration of such observatories into superordinate research infrastructures (federations of virtual observatories) will be enabled.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illangasekare, Tissa; Trevisan, Luca; Agartan, Elif
2015-03-31
Carbon Capture and Storage (CCS) represents a technology aimed to reduce atmospheric loading of CO2 from power plants and heavy industries by injecting it into deep geological formations, such as saline aquifers. A number of trapping mechanisms contribute to effective and secure storage of the injected CO2 in supercritical fluid phase (scCO2) in the formation over the long term. The primary trapping mechanisms are structural, residual, dissolution and mineralization. Knowledge gaps exist on how the heterogeneity of the formation, manifested at all scales from the pore to the site scale, affects trapping and the parameterization of contributing mechanisms in models. An experimental and modeling study was conducted to fill these knowledge gaps. Experimental investigation of fundamental processes and mechanisms in field settings is not possible, as it is not feasible to fully characterize the geologic heterogeneity at all relevant scales or to gather data on migration, trapping and dissolution of scCO2. Laboratory experiments using scCO2 under ambient conditions are also not feasible, as it is technically challenging and cost prohibitive to develop large, two- or three-dimensional test systems with controlled high pressures to keep the scCO2 as a liquid. Hence, an innovative approach was developed that used surrogate fluids in place of scCO2 and formation brine in multi-scale, synthetic aquifer test systems ranging from centimeter to meter scale. New modeling algorithms were developed to capture the processes controlled by the formation heterogeneity, and they were tested using the data from the laboratory test systems. The results and findings are expected to contribute toward better conceptual models, future improvements to DOE numerical codes, more accurate assessment of storage capacities, and optimized placement strategies. This report presents the experimental and modeling methods and research results.
Depletion and capture: revisiting “The source of water derived from wells"
Konikow, Leonard F.; Leake, Stanley A.
2014-01-01
A natural consequence of groundwater withdrawals is the removal of water from subsurface storage, but the overall rates and magnitude of groundwater depletion and capture relative to groundwater withdrawals (extraction or pumpage) have not previously been well characterized. This study assesses the partitioning of long-term cumulative withdrawal volumes into fractions derived from storage depletion and capture, where capture includes both increases in recharge and decreases in discharge. Numerical simulation of a hypothetical groundwater basin is used to further illustrate some of Theis' (1940) principles, particularly when capture is constrained by insufficient available water. Most prior studies of depletion and capture have assumed that capture is unconstrained through boundary conditions that yield linear responses. Examination of real systems indicates that capture and depletion fractions are highly variable in time and space. For a large sample of long-developed groundwater systems, the depletion fraction averages about 0.15 and the capture fraction averages about 0.85 based on cumulative volumes. Higher depletion fractions tend to occur in more arid regions, but the variation is high and the correlation coefficient between average annual precipitation and depletion fraction for individual systems is only 0.40. Because 85% of long-term pumpage is derived from capture in these real systems, capture must be recognized as a critical factor in assessing water budgets, groundwater storage depletion, and sustainability of groundwater development. Most capture translates into streamflow depletion, so it can detrimentally impact ecosystems.
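The budget identity underlying these fractions (cumulative withdrawals = storage depletion + capture) can be sketched numerically. The volumes below are hypothetical illustrations chosen to match the reported average fractions, not values from the study:

```python
# Illustrative sketch of the water-budget partitioning described by
# Konikow & Leake: cumulative withdrawals = storage depletion + capture,
# where capture = increased recharge + decreased discharge.
# All volumes are hypothetical, for illustration only.

def partition_withdrawals(cumulative_withdrawal, storage_depletion):
    """Return (depletion_fraction, capture_fraction) for a groundwater system."""
    capture = cumulative_withdrawal - storage_depletion
    return (storage_depletion / cumulative_withdrawal,
            capture / cumulative_withdrawal)

# A system that pumped 100 km^3 cumulatively while removing 15 km^3 from
# storage matches the average fractions reported for long-developed systems.
d_frac, c_frac = partition_withdrawals(100.0, 15.0)
print(d_frac, c_frac)  # 0.15 0.85
```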
Geometric prepatterning-based tuning of the period doubling onset strain during thin-film wrinkling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saha, Sourabh K.
2017-04-05
Wrinkling of thin films is an easy-to-implement and low-cost technique to fabricate stretch-tunable periodic micro and nanoscale structures. However, the tunability of such structures is often limited by the emergence of an undesirable period-doubled mode at high strains. Predictively tuning the onset strain for period doubling via existing techniques requires one to have extensive knowledge about the nonlinear pattern formation behavior. Herein, a geometric prepatterning-based technique is introduced that can be implemented even with limited system knowledge to predictively delay period doubling. The technique comprises prepatterning the film/base bilayer with a sinusoidal pattern that has the same period as the natural period of the system. This technique has been verified via physical and computational experiments on the polydimethylsiloxane (PDMS)/glass bilayer system. It is observed that the onset strain can be increased from the typical value of 20% for flat films to greater than 30% with a modest prepattern aspect ratio (2·amplitude/period) of 0.15. In addition, finite element simulations reveal that (i) the onset strain increases with increasing prepattern amplitude and (ii) the delaying effect can be captured entirely by the prepattern geometry. Therefore, one can implement this technique even with limited system knowledge, such as material properties or film thickness, by simply replicating pre-existing wrinkled patterns to generate prepatterned bilayers. Furthermore, geometric prepatterning is a practical scheme to increase the operating range of stretch-tunable wrinkle-based devices by at least 50%.
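The aspect-ratio definition quoted in the abstract (2·amplitude/period) makes the required prepattern geometry a one-line computation. A minimal sketch, with a hypothetical wrinkle period chosen purely for illustration:

```python
# The prepattern "aspect ratio" used in the abstract is defined as
# 2 * amplitude / period. Inverting it gives the amplitude needed to
# reach a target aspect ratio for a given natural wrinkle period.

def required_amplitude(period, aspect_ratio=0.15):
    """Prepattern amplitude needed to reach a target aspect ratio."""
    return aspect_ratio * period / 2.0

# e.g. for a hypothetical 10 um natural wrinkle period:
print(required_amplitude(10.0))  # 0.75 (same units as the period)
```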
Exploration Medical System Trade Study Tools Overview
NASA Technical Reports Server (NTRS)
Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.
2018-01-01
ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are a part of the tool ecosystem. Initially, the presentation's focus will address the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information about how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, the talk will cover capturing which conditions will be planned for in-mission treatment, planned medical activities (e.g., periodic physical exams), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements and of per-capability contributions toward achieving requirements.
40 CFR 63.4165 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2010 CFR
2010-07-01
... system; coating solvent flash-off and coating, curing, and drying occurs within the capture system and... when being moved between a spray booth and a curing oven. (b) If the capture system does not meet both... surface preparation activities and drying or curing time. (c) Liquid-to-uncaptured-gas protocol using a...
Code of Federal Regulations, 2013 CFR
2013-07-01
... system and add-on control device operating limits during the performance test? 63.3546 Section 63.3546... of key parameters of the valve operating system (e.g., solenoid valve operation, air pressure... minimum operating limit for that specific capture device or system of multiple capture devices. The...
Code of Federal Regulations, 2014 CFR
2014-07-01
... capture system and add-on control device operating limits during the performance test? 63.4966 Section 63... system and add-on control device operating limits during the performance test? During the performance... outlet gas temperature is the maximum operating limit for your condenser. (e) Emission capture system...
Code of Federal Regulations, 2012 CFR
2012-07-01
... capture system and add-on control device operating limits during the performance test? 63.4966 Section 63... system and add-on control device operating limits during the performance test? During the performance... outlet gas temperature is the maximum operating limit for your condenser. (e) Emission capture system...
Stimson, D H R; Pringle, A J; Maillet, D; King, A R; Nevin, S T; Venkatachalam, T K; Reutens, D C; Bhalla, R
2016-09-01
The emphasis on the reduction of gaseous radioactive effluent associated with PET radiochemistry laboratories has increased. Various radioactive gas capture strategies have been employed historically, including expensive automated compression systems. We have implemented a new cost-effective strategy employing gas capture bags with electronic feedback that are integrated with the cyclotron safety system. Our strategy is suitable for multiple automated 18F radiosynthesis modules and individual automated 11C radiosynthesis modules. We describe novel gas capture systems that minimize the risk of human error and are routinely used in our facility.
Carbon dioxide capture from atmospheric air using sodium hydroxide spray.
Stolaroff, Joshuah K; Keith, David W; Lowry, Gregory V
2008-04-15
In contrast to conventional carbon capture systems for power plants and other large point sources, the system described in this paper captures CO2 directly from ambient air. This has the advantages that emissions from diffuse sources and past emissions may be captured. The objective of this research is to determine the feasibility of a NaOH spray-based contactor for use in an air capture system by estimating the cost and energy requirements per unit CO2 captured. A prototype system is constructed and tested to measure CO2 absorption, energy use, and evaporative water loss and compared with theoretical predictions. A numerical model of drop collision and coalescence is used to estimate operating parameters for a full-scale system, and the cost of operating the system per unit CO2 captured is estimated. The analysis indicates that CO2 capture from air for climate change mitigation is technically feasible using off-the-shelf technology. Drop coalescence significantly decreases the CO2 absorption efficiency; however, fan and pump energy requirements are manageable. Water loss is significant (20 mol H2O/mol CO2 at 15 degrees C and 65% RH) but can be lowered by appropriately designing and operating the system. The cost of CO2 capture using NaOH spray (excluding solution recovery and CO2 sequestration, which may be comparable) in the full-scale system is 96 $/ton-CO2 in the base case, and ranges from 53 to 127 $/ton-CO2 under alternate operating parameters and assumptions regarding capital costs and mass transfer rate. The low end of the cost range is reached by a spray with 50 microm mean drop diameter, which is achievable with commercially available spray nozzles.
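The quoted evaporative water-loss ratio can be converted to a mass basis with a short back-of-envelope calculation using standard molar masses. This is a sketch of the arithmetic only, not the paper's contactor model:

```python
# Back-of-envelope check of the water-loss figure quoted above:
# 20 mol H2O evaporated per mol CO2 captured (at 15 degrees C, 65% RH).
M_CO2 = 44.01   # g/mol, molar mass of CO2
M_H2O = 18.015  # g/mol, molar mass of H2O

def water_loss_per_tonne_co2(mol_ratio=20.0):
    """Tonnes of water evaporated per tonne of CO2 absorbed."""
    mol_co2_per_tonne = 1e6 / M_CO2          # mol CO2 in one tonne
    grams_h2o = mol_co2_per_tonne * mol_ratio * M_H2O
    return grams_h2o / 1e6                   # back to tonnes

print(round(water_loss_per_tonne_co2(), 1))  # ~8.2 t water per t CO2
```

The roughly 8:1 water-to-CO2 mass ratio makes clear why the abstract flags water loss as significant for a full-scale spray contactor.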
Real-time measurements, rare events and photon economics
NASA Astrophysics Data System (ADS)
Jalali, B.; Solli, D. R.; Goda, K.; Tsia, K.; Ropers, C.
2010-07-01
Rogue events, otherwise known as outliers and black swans, are singular, rare events that carry dramatic impact. They appear in seemingly unconnected systems in the form of oceanic rogue waves, stock market crashes, evolution, and communication systems. Attempts to understand the underlying dynamics of such complex systems, which lead to spectacular and often cataclysmic outcomes, have been frustrated by the scarcity of events, resulting in insufficient statistical data, and by the inability to perform experiments under controlled conditions. Extreme rare events also occur in the ultrafast physical sciences, where it is possible to collect large data sets, even for rare events, in a short time period. The knowledge gained from observing rare events in ultrafast systems may provide valuable insight into extreme value phenomena that occur over a much slower timescale and that have a closer connection with human experience. One solution is a real-time ultrafast instrument that is capable of capturing singular and randomly occurring non-repetitive events. The time stretch technology developed during the past 13 years is providing a powerful tool box for reaching this goal. This paper reviews this technology and discusses its use in capturing rogue events in electronic signals, spectroscopy, and imaging. We show an example in nonlinear optics where it was possible to capture rare and random solitons whose unusual statistical distributions resemble those observed in financial markets. The ability to observe the true spectrum of each event in real time has led to important insight into the underlying process, which in turn has made it possible to control soliton generation, leading to improvement in the coherence of supercontinuum light. We also show a new class of fast imagers which are being considered for early detection of cancer because of their potential ability to detect rare diseased cells (so-called rogue cells) in a large population of healthy cells.
Torshizi, Abolfazl Doostparast; Zarandi, Mohammad Hossein Fazel; Torshizi, Ghazaleh Doostparast; Eghbali, Kamyar
2014-01-01
This paper deals with the application of fuzzy intelligent systems in diagnosing severity level and recommending appropriate therapies for patients having Benign Prostatic Hyperplasia. Such an intelligent system can have remarkable impacts on correct diagnosis of the disease and on reducing the risk of mortality. This system captures various factors from the patients using two modules. The first module determines the severity level of the Benign Prostatic Hyperplasia, and the second module, which is a decision-making unit, takes the output of the first module accompanied by some external knowledge and makes an appropriate treatment decision based on its ontology model and a type-1 fuzzy system. In order to validate the efficiency and accuracy of the developed system, a case study was conducted with 44 participants. The results were compared with the recommendations of a panel of experts on the experimental data, and the precision and accuracy of the results were investigated through statistical analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Concurrent engineering design and management knowledge capture
NASA Technical Reports Server (NTRS)
1990-01-01
The topics are presented in viewgraph form and include the following: real-time management, personnel management, project management, conceptual design and decision making; the SITRF design problem; and the electronic-design notebook.
Shen, Xiaobai
2010-01-01
This paper provides a historical survey of the evolution of rice technology in China, from the traditional farming system to genetically modified rice today. Using a sociotechnical analytical framework, it analyses rice technology as a socio-technical ensemble - a complex interaction of material and social elements - and discusses the specificity of technology development and its socio-technical outcomes. It points to two imperatives in rice variety development: wholesale transplantation of agricultural technology and social mechanisms to developing countries is likely to lead to negative consequences, whereas indigenous innovation, including deploying GM technology for seed varietal development and capturing/cultivating local knowledge, will provide better solutions.
GOES Full Disk Shows First Day of Spring in the Northern Hemisphere
2014-03-20
This full-disk image from NOAA’s GOES-13 satellite was captured at 11:45 UTC (7:45 a.m. EDT) and shows the Americas on March 20, 2014. This date marks the start of astronomical spring in the northern hemisphere. Credit: NOAA/NASA GOES Project
Bioinformatics for Exploration
NASA Technical Reports Server (NTRS)
Johnson, Kathy A.
2006-01-01
For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.
System and process for capture of acid gasses at elevated pressure from gaseous process streams
Heldebrant, David J.; Koech, Phillip K.; Linehan, John C.; Rainbolt, James E.; Bearden, Mark D.; Zheng, Feng
2016-09-06
A system, method, and material that enable the pressure-activated reversible chemical capture of acid gases such as CO2 from gas volumes such as streams, flows, or any other volume. Once the acid gas is chemically captured, the resulting product, typically a zwitterionic salt, can be subjected to a reduced pressure, whereupon it will release the captured acid gas and the capture material will be regenerated. The invention includes this process as well as the materials and systems for carrying out and enabling this process.
Stability of a slotted ALOHA system with capture effect
NASA Astrophysics Data System (ADS)
Onozato, Yoshikuni; Liu, Jin; Noguchi, Shoichi
1989-02-01
The stability of a slotted ALOHA system with capture effect is investigated under a general communication environment where terminals are divided into two groups (low-power and high-power) and the capture effect is modeled by capture probabilities. An approximate analysis is developed using catastrophe theory, in which the effects of system and user parameters on the stability are characterized by the cusp catastrophe. Particular attention is given to the low-power group, since it must bear the strain under the capture effect. The stability conditions of the two groups are given explicitly by bifurcation sets.
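The two-group capture model described above can be explored with a minimal Monte Carlo sketch. The parameters and the single-winner capture rule below are illustrative assumptions, not the paper's catastrophe-theoretic analysis:

```python
import random

# Minimal Monte Carlo sketch of slotted ALOHA with a capture effect, in the
# spirit of the two-group (low-power / high-power) model described above.
# All parameters and the capture rule are illustrative assumptions.

def simulate(n_low=50, n_high=50, p_tx=0.02, p_capture=0.7,
             slots=20_000, seed=1):
    """Return throughput (successful packets per slot).

    A slot with exactly one transmission always succeeds. In a collision
    involving at least one high-power terminal, one high-power packet is
    captured with probability p_capture; otherwise all packets are lost.
    """
    rng = random.Random(seed)
    successes = 0
    for _ in range(slots):
        low = sum(rng.random() < p_tx for _ in range(n_low))
        high = sum(rng.random() < p_tx for _ in range(n_high))
        total = low + high
        if total == 1:
            successes += 1
        elif total > 1 and high >= 1 and rng.random() < p_capture:
            successes += 1  # a high-power packet survives the collision
    return successes / slots

# Capture raises throughput above the no-capture baseline, at the expense
# of the low-power group, which never wins a contested slot in this sketch.
print(simulate(p_capture=0.7), simulate(p_capture=0.0))
```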
A pharma perspective on the systems medicine and pharmacology of inflammation.
Lahoz-Beneytez, Julio; Schnizler, Katrin; Eissing, Thomas
2015-02-01
Biological systems are complex and comprise multiple scales of organisation. Hence, holistic approaches are necessary to capture the behaviour of these entities from the molecular and cellular levels to the whole-organism level. This also applies to the understanding and treatment of different diseases. Traditional systems biology has been successful in describing different biological phenomena at the cellular level, but it still lacks a holistic description of the multi-scale interactions within the body. The importance of the physiological context is of particular interest in inflammation. Regulatory agencies have urged the scientific community to increase the translational power of biomedical research, and it has been recognised that modelling and simulation could be a path to follow. Interestingly, modelling and simulation has long been employed in pharma R&D. Systems pharmacology, and particularly physiologically based pharmacokinetic/pharmacodynamic models, serves as a suitable framework to integrate the available and emerging knowledge at different levels of the drug development process. Systems medicine and pharmacology of inflammation will potentially benefit from this framework in order to better understand inflammatory diseases and to help transfer the vast knowledge at the molecular and cellular level into a more physiological context. Ultimately, this may lead to reliable predictions of clinical outcomes such as disease progression or treatment efficacy, thereby contributing to better care of patients. Copyright © 2014 Elsevier Inc. All rights reserved.
Study on launch scheme of space-net capturing system
Gao, Qingyu; Zhang, Qingbin; Feng, Zhiwei; Tang, Qiangang
2017-01-01
With the continuous progress in active debris-removal technology, scientists are increasingly concerned about the concept of space-net capturing system. The space-net capturing system is a long-range-launch flexible capture system, which has great potential to capture non-cooperative targets such as inactive satellites and upper stages. In this work, the launch scheme is studied by experiment and simulation, including two-step ejection and multi-point-traction analyses. The numerical model of the tether/net is based on finite element method and is verified by full-scale ground experiment. The results of the ground experiment and numerical simulation show that the two-step ejection and six-point traction scheme of the space-net system is superior to the traditional one-step ejection and four-point traction launch scheme. PMID:28877187
Adaptive System Modeling for Spacecraft Simulation
NASA Technical Reports Server (NTRS)
Thomas, Justin
2011-01-01
This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates/calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).
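The "continuously update models from streaming sensor data" idea can be illustrated with a generic online least-mean-squares learner fitted one telemetry sample at a time. This is a hedged sketch of the general technique, not the specific ISS EPS algorithm; the device relation and learning rate are invented for illustration:

```python
# Generic online (streaming) model update: a linear model adjusted one
# (input, telemetry) sample at a time via least-mean-squares gradient
# steps, so the simulation model tracks the observed device behavior.
# The "true" device relation y = 2x + 1 below is a hypothetical example.

class OnlineLinearModel:
    def __init__(self, lr=0.1):
        self.w = 0.0   # slope estimate
        self.b = 0.0   # intercept estimate
        self.lr = lr   # learning rate

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        """One LMS gradient step on a single streaming sample."""
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

model = OnlineLinearModel()
# Simulated telemetry stream from a hypothetical device: y = 2x + 1
for i in range(5000):
    x = (i % 10) / 10.0
    model.update(x, 2.0 * x + 1.0)

print(round(model.w, 2), round(model.b, 2))  # converges toward w = 2, b = 1
```

Because each update touches only one sample, the same loop can run indefinitely on live telemetry, recalibrating the model as the device ages or the system evolves.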
Destination pluto: New horizons performance during the approach phase
NASA Astrophysics Data System (ADS)
Flanigan, Sarah H.; Rogers, Gabe D.; Guo, Yanping; Kirk, Madeline N.; Weaver, Harold A.; Owen, William M.; Jackman, Coralie D.; Bauman, Jeremy; Pelletier, Frederic; Nelson, Derek; Stanbridge, Dale; Dumont, Phillip J.; Williams, Bobby; Stern, S. Alan; Olkin, Cathy B.; Young, Leslie A.; Ennico, Kimberly
2016-11-01
The New Horizons spacecraft began its journey to the Pluto-Charon system on January 19, 2006 on-board an Atlas V rocket from Cape Canaveral, Florida. As the first mission in NASA's New Frontiers program, the objective of the New Horizons mission is to perform the first exploration of ice dwarfs in the Kuiper Belt, extending knowledge of the solar system to include the icy "third zone" for the first time. Arriving at the correct time and correct position relative to Pluto on July 14, 2015 depended on the successful execution of a carefully choreographed sequence of events. The Core command sequence, which was developed and optimized over multiple years and included the highest-priority science observations during the closest approach period, was contingent on precise navigation to the Pluto-Charon system and nominal performance of the guidance and control (G&C) subsystem. The flyby and gravity assist of Jupiter on February 28, 2007 was critical in placing New Horizons on the path to Pluto. Once past Jupiter, trajectory correction maneuvers (TCMs) became the sole source of trajectory control since the spacecraft did not encounter any other planetary bodies along its flight path prior to Pluto. During the Pluto approach phase, which formally began on January 15, 2015, optical navigation images were captured primarily with the Long Range Reconnaissance Imager to refine spacecraft and Pluto-Charon system trajectory knowledge, which in turn was used to design TCMs. Orbit determination solutions were also used to update the spacecraft's on-board trajectory knowledge throughout the approach phase. Nominal performance of the G&C subsystem, accurate TCM designs, and high-quality orbit determination solutions resulted in final Pluto-relative B-plane arrival conditions that facilitated a successful first reconnaissance of the Pluto-Charon system.
System for objective assessment of image differences in digital cinema
NASA Astrophysics Data System (ADS)
Fliegel, Karel; Krasula, Lukáš; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek
2014-09-01
There is high demand for quick digitization and subsequent image restoration of archived film records. Digitization is very urgent in many cases because various invaluable pieces of cultural heritage are stored on aging media. Only selected records can be reconstructed perfectly using painstaking manual or semi-automatic procedures. This paper aims to determine the quality requirements on the restoration process needed to obtain acceptably close visual perception of the digitally restored film in comparison with the original analog film copy. This knowledge is very important for preserving the original artistic intention of the movie producers. A subjective experiment with artificially distorted images was conducted to determine the visual impact of common image distortions in digital cinema. Typical color and contrast distortions were introduced, and test images were presented to viewers using a digital projector. Based on the outcome of this subjective evaluation, a system for objective assessment of image distortions has been developed and its performance tested. The system utilizes a calibrated digital single-lens reflex camera and subsequent analysis of suitable features of images captured from the projection screen. The evaluation of captured image data has been optimized to obtain predicted differences between the reference and distorted images while achieving high correlation with the results of subjective assessment. The system can be used to objectively determine the difference between analog film and digital cinema images on the projection screen.
75 FR 28024 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
... the data-capturing process. SAMHSA will place Web site registration information into a Knowledge Management database and will place email subscription information into a database maintained by a third-party...
Martins, Ana Paula Barbosa; Feitosa, Leonardo Manir; Lessa, Rosangela Paula; Almeida, Zafira Silva; Heupel, Michelle; Silva, Wagner Macedo; Tchaicka, Ligia; Nunes, Jorge Luiz Silva
2018-01-01
Increasing fishing effort has caused declines in shark populations worldwide. Understanding the biological and ecological characteristics of sharks is essential to effectively implement management measures, but to fully understand the drivers of fishing pressure, social factors must be considered through multidisciplinary and integrated approaches. The present study aimed to use fisher and trader knowledge to describe the shark catch and product supply chain in Northeastern Brazil, and to evaluate perceptions regarding the regional conservation status of shark species. Non-systematic observations and structured individual interviews were conducted with experienced fishers and traders. The demand for and economic value of shark fins have reportedly decreased over the last 10 years, while the shark meat trade has increased slightly, including a small increase in the average price per kilogram of meat. Several threatened shark species were reportedly often captured offshore and traded at local markets. This reported and observed harvest breaches current Brazilian environmental laws. Fishing communities are aware of population declines of several shark species, but rarely take action to avoid capturing sharks. The continuing capture of sharks is mainly due to a lack of knowledge of environmental laws, a lack of enforcement by responsible authorities, and the difficulties fishers encounter in finding alternative income streams. National and regional conservation measures are immediately required to reduce overfishing of shark populations in Northeastern Brazil. Social and economic improvements for poor fishing communities must also be implemented to achieve sustainable fisheries.
MSFC's Advanced Space Propulsion Formulation Task
NASA Technical Reports Server (NTRS)
Huebner, Lawrence D.; Gerrish, Harold P.; Robinson, Joel W.; Taylor, Terry L.
2012-01-01
In NASA's Fiscal Year 2012, a small project was undertaken to provide additional substance, depth, and activity knowledge to the technology areas identified in the In-Space Propulsion Systems Roadmap, Technology Area 02 (TA-02), created under the auspices of the NASA Office of the Chief Technologist (OCT). This roadmap was divided into four basic groups: (1) Chemical Propulsion, (2) Non-chemical Propulsion, (3) Advanced (TRL<3) Propulsion Technologies, and (4) Supporting Technologies. The first two were grouped according to the governing physics. The third group captured technologies and physics concepts at lower TRL. The fourth group identified pertinent technical areas that are strongly coupled with the propulsion areas and could allow significant improvements in performance. A total of 45 technologies were identified in TA-02, and 25 of these were studied in this formulation task. The goal of the task was to provide OCT with a knowledge base for decision-making on advanced space propulsion technologies and to avoid wasting money by unintentionally repeating past projects or funding technologies with minor impact. The formulation task developed the next level of detail for the technologies described and provides OCT with context on where investments should be made. The presentation will begin with the list of technologies from TA-02, how they were prioritized for this study, and details on what additional data were captured for the technologies studied. Samples of the documentation will then be provided, followed by plans for how the data will be made accessible.
Depletion and capture: revisiting "the source of water derived from wells".
Konikow, L F; Leake, S A
2014-09-01
A natural consequence of groundwater withdrawals is the removal of water from subsurface storage, but the overall rates and magnitude of groundwater depletion and capture relative to groundwater withdrawals (extraction or pumpage) have not previously been well characterized. This study assesses the partitioning of long-term cumulative withdrawal volumes into fractions derived from storage depletion and capture, where capture includes both increases in recharge and decreases in discharge. Numerical simulation of a hypothetical groundwater basin is used to further illustrate some of Theis' (1940) principles, particularly when capture is constrained by insufficient available water. Most prior studies of depletion and capture have assumed that capture is unconstrained through boundary conditions that yield linear responses. Examination of real systems indicates that capture and depletion fractions are highly variable in time and space. For a large sample of long-developed groundwater systems, the depletion fraction averages about 0.15 and the capture fraction averages about 0.85 based on cumulative volumes. Higher depletion fractions tend to occur in more arid regions, but the variation is high and the correlation coefficient between average annual precipitation and depletion fraction for individual systems is only 0.40. Because 85% of long-term pumpage is derived from capture in these real systems, capture must be recognized as a critical factor in assessing water budgets, groundwater storage depletion, and sustainability of groundwater development. Most capture translates into streamflow depletion, so it can detrimentally impact ecosystems. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
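The partitioning described in the abstract above can be expressed as simple water-budget arithmetic: cumulative withdrawals split into a storage-depletion fraction and a capture fraction that sum to one. The sketch below illustrates this bookkeeping; function and variable names are illustrative, not from the paper.

```python
def partition_withdrawals(total_pumpage, storage_depletion):
    """Partition cumulative groundwater withdrawals into the fraction
    derived from storage depletion and the fraction derived from capture
    (increased recharge plus decreased discharge). Volumes are cumulative
    and in consistent units; names here are illustrative assumptions."""
    if not 0 <= storage_depletion <= total_pumpage:
        raise ValueError("depletion must lie between 0 and total pumpage")
    depletion_fraction = storage_depletion / total_pumpage
    # Every withdrawn volume not taken from storage must be captured.
    capture_fraction = 1.0 - depletion_fraction
    return depletion_fraction, capture_fraction
```

With the study's average values, 15 units of depletion per 100 units of pumpage yields the reported fractions of 0.15 and 0.85.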
Functional networks inference from rule-based machine learning models.
Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume
2016-01-01
Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables that are often different from, or complementary to, those found by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two points of view: we analyse the enriched biological terms in the set of network nodes, and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks relative to similarity-based methods, and demonstrates their potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process.
The implementation of our network inference protocol is available at: http://ico2s.org/software/funel.html.
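The core idea behind FuNeL as described in the abstract — genes that co-occur in the same classification rule become connected nodes — can be sketched as a co-occurrence graph. The sketch below is a minimal illustration of that assumption only; the actual FuNeL protocol (available at the URL above) involves further steps, such as repeated model training, that are not shown.

```python
from itertools import combinations

def infer_functional_network(rules):
    """Build a weighted edge set from rule-based model structure: genes
    (attributes) appearing together in the same rule are assumed to be
    functionally related, so each such pair becomes an edge. `rules` is
    a list of sets of attribute names. Illustrative sketch of the
    co-occurrence assumption, not the full FuNeL protocol."""
    edges = {}
    for rule in rules:
        for a, b in combinations(sorted(rule), 2):
            # Edge weight counts how many rules use the pair together.
            edges[(a, b)] = edges.get((a, b), 0) + 1
    return edges
```

For example, two rules sharing a gene pair produce an edge of weight 2, while pairs used in only one rule get weight 1.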
Validation of enhanced Kinect sensor based motion capturing for gait assessment
Müller, Björn; Ilg, Winfried; Giese, Martin A.
2017-01-01
Optical motion capturing systems are expensive and require substantial dedicated space to be set up; on the other hand, they provide unsurpassed accuracy and reliability. In many situations, however, flexibility is required and the motion capturing system can only be placed temporarily. The Microsoft Kinect v2 sensor is comparatively cheap, and promising results have been published with respect to gait analysis. We here present a motion capturing system that is easy to set up, flexible with respect to sensor locations, and delivers accuracy in gait parameters comparable to a gold-standard motion capturing system (VICON). Further, we demonstrate that sensor setups which track the person from one side only are less accurate and should be replaced by two-sided setups. With respect to commonly analyzed gait parameters, especially step width, our system shows higher agreement with the VICON system than previous reports. PMID:28410413
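Step width, highlighted in the abstract above, is commonly computed as the distance between the two feet measured perpendicular to the walking direction. The sketch below illustrates that geometry from tracked 3-D ankle positions; the input conventions and names are assumptions, not taken from the paper's software.

```python
import numpy as np

def step_width(left_ankle, right_ankle, walking_direction):
    """Distance between left and right ankle positions measured
    perpendicular to the walking direction, a common definition of the
    step-width gait parameter. Inputs are 3-D positions and a direction
    vector; conventions here are illustrative assumptions."""
    d = np.asarray(walking_direction, dtype=float)
    d = d / np.linalg.norm(d)
    delta = np.asarray(left_ankle, dtype=float) - np.asarray(right_ankle, dtype=float)
    # Remove the along-track component, keeping only the lateral offset.
    lateral = delta - np.dot(delta, d) * d
    return float(np.linalg.norm(lateral))
```

The same projection applies whether the positions come from marker-based tracking or a depth-sensor skeleton, which is what makes the parameter comparable across the two systems.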
Transforming Functional Requirements from UML into BPEL to Efficiently Develop SOA-Based Systems
NASA Astrophysics Data System (ADS)
Vemulapalli, Anisha; Subramanian, Nary
The intended behavior of any system, such as its services, tasks, or functions, can be captured by the functional requirements of the system. As our dependence on online services has grown steadily, web applications are increasingly developed employing SOA. BPEL4WS provides a means for expressing the functional requirements of an SOA-based system by providing constructs to capture the business goals and objectives of the system. In this paper we propose an approach for transforming user-centered requirements captured using UML into a corresponding BPEL specification: business processes are captured by means of use cases, from which UML sequence diagrams and activity diagrams are extracted. Subsequently these UML models are mapped, employing CASE tools, to BPEL specifications that capture the essence of the initial business requirements for developing the SOA-based system. A student housing system is used as a case study to illustrate this approach, and the system is validated using NetBeans.
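The essence of the UML-to-BPEL mapping described above is that an ordered sequence of activities (as read from a UML activity diagram) becomes a BPEL sequence of process steps. The sketch below generates a drastically simplified fragment to make that correspondence concrete; real transformations and the CASE tools mentioned handle far more constructs, and the element usage here is a minimal illustration rather than a complete, valid BPEL4WS document.

```python
def activities_to_bpel(process_name, activities):
    """Map an ordered list of activity names (e.g. from a UML activity
    diagram) to a skeletal BPEL <process> containing a <sequence> of
    <invoke> steps. Hypothetical, heavily simplified illustration of
    the mapping idea, not a production transformation."""
    steps = "\n".join(f'    <invoke name="{a}"/>' for a in activities)
    return (
        f'<process name="{process_name}">\n'
        f'  <sequence>\n{steps}\n  </sequence>\n'
        f'</process>'
    )
```

For the student housing case study, activities such as a hypothetical "checkAvailability" followed by "assignRoom" would each become one step in the generated sequence.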
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, J.G.
1992-02-06
Nuclear explosives may be used to capture small asteroids (e.g., 20-50 meters in diameter) into bound orbits around the Earth. The captured objects could be used as construction material for manned and unmanned activity in Earth orbit. Asteroids with small approach velocities, which are the ones most likely to have close approaches to the Earth, require the least energy for capture. They are particularly easy to capture if they pass within one Earth radius of the surface of the Earth. They could be intercepted with intercontinental missiles if the latter were retrofitted with a more flexible guiding and homing capability. This asteroid capture-defense system could be implemented in a few years at low cost by using decommissioned ICBMs. The economic value of even one captured asteroid is many times the initial investment. The asteroid capture system would also be an essential part of the learning curve for dealing with larger asteroids that can hit the Earth.
40 CFR 63.4291 - What are my options for meeting the emission limits?
Code of Federal Regulations, 2010 CFR
2010-07-01
... emission capture systems and add-on controls, the organic HAP emission rate for the web coating/printing... demonstrate that all capture systems and control devices for the web coating/printing operation(s) meet the... capture systems and control devices for the web coating/printing operation(s) meet the operating limits...
40 CFR 63.4181 - What definitions apply to this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... commercial or industrial HVAC systems. Manufacturer's formulation data means data on a material (such as a... capture system efficiency means the portion (expressed as a percentage) of the pollutants from an emission source that is delivered to an add-on control device. Capture system means one or more capture devices...
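The capture-efficiency definition quoted from 40 CFR 63.4181 is a simple percentage: the portion of pollutants from an emission source that is delivered to an add-on control device. A one-line sketch of that arithmetic, with illustrative names:

```python
def capture_system_efficiency(pollutants_delivered, pollutants_emitted):
    """Capture (system) efficiency per the definition above: the portion,
    expressed as a percentage, of the pollutants from an emission source
    that is delivered to an add-on control device. Quantities must be in
    the same mass units; argument names are illustrative."""
    return 100.0 * pollutants_delivered / pollutants_emitted
```

For example, a source emitting 100 kg of HAP of which 85 kg reaches the control device has a capture efficiency of 85%.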
40 CFR 63.4181 - What definitions apply to this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... commercial or industrial HVAC systems. Manufacturer's formulation data means data on a material (such as a.... Capture efficiency or capture system efficiency means the portion (expressed as a percentage) of the pollutants from an emission source that is delivered to an add-on control device. Capture system means one or...
40 CFR 63.4181 - What definitions apply to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... commercial or industrial HVAC systems. Manufacturer's formulation data means data on a material (such as a.... Capture efficiency or capture system efficiency means the portion (expressed as a percentage) of the pollutants from an emission source that is delivered to an add-on control device. Capture system means one or...