Sample records for knowledge engineering tools

  1. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    PubMed

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.

  2. An integrated knowledge system for wind tunnel testing - Project Engineers' Intelligent Assistant

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Shi, George Z.; Hoyt, W. A.; Steinle, Frank W., Jr.

    1993-01-01

    The Project Engineers' Intelligent Assistant (PEIA) is an integrated knowledge system developed using artificial intelligence technology, including hypertext, expert systems, and dynamic user interfaces. The system integrates documents, engineering codes, databases, and knowledge from domain experts into an enriched hypermedia environment and was designed to assist project engineers in planning and conducting wind tunnel tests. PEIA is a modular system that consists of an intelligent user interface, seven modules, and an integrated tool facility. Hypermedia technology is discussed and the seven PEIA modules are described. System maintenance and updating are straightforward because of the modular structure, and the integrated tool facility provides user access to commercial software shells for documentation, reporting, or database updating. PEIA is expected to provide project engineers with technical information, increase efficiency and productivity, and provide a realistic tool for personnel training.

  3. Voice-enabled Knowledge Engine using Flood Ontology and Natural Language Processing

    NASA Astrophysics Data System (ADS)

    Sermet, M. Y.; Demir, I.; Krajewski, W. F.

    2015-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information, and interactive visualizations for communities in Iowa. The IFIS is designed for use by the general public, often people with no domain knowledge and a limited general science background. To improve communication with such an audience, we have introduced a voice-enabled knowledge engine for flood-related issues in IFIS. Instead of requiring users to navigate the many features and interfaces of the information system and other web-based sources, the system provides dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to real-time stream gauges, in-house data sources, and analysis and visualization tools to answer natural language questions. Our goal is the systematization of data and modeling results on flood-related issues in Iowa and an interface that gives definitive answers to factual queries. The knowledge engine aims to make all flood-related knowledge in Iowa easily accessible to everyone and to support voice-enabled natural language input. We aim to integrate and curate all flood-related data, implement analytical and visualization tools, and make it possible to compute answers from questions. The IFIS explicitly implements analytical methods and models as algorithms and curates all flood-related data and resources so that all of these resources are computable. The IFIS Knowledge Engine computes an answer by deriving it from its computational knowledge base: it processes the statement, accesses the data warehouse, runs database queries on the server side, and returns outputs in various formats. This presentation provides an overview of the IFIS Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources. The Knowledge Engine provides an alternative access method to the comprehensive set of tools and data resources available in IFIS. The current implementation accepts free-form input and supports voice recognition within browser and mobile applications.
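    The pipeline sketched above (parse a free-form question, match it to built-in data and analyses, run the query server-side, return a formatted answer) can be illustrated in a few lines of Python. This is only a hedged sketch of the idea; the gauge names, readings, and function below are invented and are not the IFIS API.

```python
# Minimal sketch of a question-answering flow like the one described above.
# All gauge data, thresholds, and names are hypothetical, not IFIS data.

GAUGE_LEVEL_FT = {"iowa river at iowa city": 21.3, "cedar river at cedar rapids": 12.1}
FLOOD_STAGE_FT = {"iowa river at iowa city": 22.0, "cedar river at cedar rapids": 16.0}

def answer(question: str) -> str:
    """Match a free-form question to a known gauge and compute the answer."""
    q = question.lower()
    for site, level in GAUGE_LEVEL_FT.items():
        city = site.split(" at ")[1]
        if city in q:                                   # crude entity match on the city name
            if "flood" in q or "stage" in q:
                margin = FLOOD_STAGE_FT[site] - level
                state = "below" if margin > 0 else "at or above"
                return f"{site.title()} is at {level:.1f} ft, {abs(margin):.1f} ft {state} flood stage."
            return f"Current stage at {site.title()} is {level:.1f} ft."
    return "Sorry, that question could not be mapped to a known gauge."

print(answer("Is Iowa City above flood stage right now?"))
```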

  4. The Art of Artificial Intelligence. 1. Themes and Case Studies of Knowledge Engineering

    DTIC Science & Technology

    1977-08-01

    ... in scientific and medical inference illuminate the art of knowledge engineering and its parent science, Artificial Intelligence. ... The knowledge engineer practices the art of bringing the principles and tools of AI research to bear on difficult application problems requiring...

  5. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work is needed to make the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.
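    The layered design described above, in which an application stacks only the layers it actually needs, can be pictured with a small sketch. The layer names and interfaces below are assumptions made for illustration, not the actual KNOMAD components.

```python
# Hypothetical sketch of a layered knowledge-management stack: a simple application
# uses only the storage layer, while a non-monotonic application adds a truth
# maintenance layer on top. Names and interfaces are invented for illustration.

class StorageLayer:
    def __init__(self):
        self.facts = set()

    def assert_fact(self, fact):
        self.facts.add(fact)

    def ask(self, fact):
        return fact in self.facts

class TruthMaintenanceLayer:
    """Optional layer: records justifications so beliefs can be retracted."""
    def __init__(self, base):
        self.base = base
        self.justifications = {}

    def assert_fact(self, fact, because=()):
        self.base.assert_fact(fact)
        self.justifications[fact] = set(because)

    def retract(self, fact):
        self.base.facts.discard(fact)
        # withdraw every belief whose justification depended on the retracted fact
        for f, why in list(self.justifications.items()):
            if fact in why:
                self.retract(f)

tms = TruthMaintenanceLayer(StorageLayer())
tms.assert_fact("pump_ok")
tms.assert_fact("flow_normal", because=("pump_ok",))
tms.retract("pump_ok")
print(tms.base.ask("flow_normal"))   # False: the dependent belief was withdrawn too
```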

  6. An engineering paradigm in the biomedical sciences: Knowledge as epistemic tool.

    PubMed

    Boon, Mieke

    2017-10-01

    In order to deal with the complexity of biological systems and to generate applicable results, the current biomedical sciences are adopting concepts and methods from the engineering sciences. Philosophers of science have interpreted this as the emergence of an engineering paradigm, in particular in systems biology and synthetic biology. This article aims at the articulation of the supposed engineering paradigm by contrast with the physics paradigm that supported the rise of biochemistry and molecular biology. This articulation starts from Kuhn's notion of a disciplinary matrix, which indicates what constitutes a paradigm. It is argued that the core of the physics paradigm is its metaphysical and ontological presuppositions, whereas the core of the engineering paradigm is the epistemic aim of producing useful knowledge for solving problems external to the scientific practice. Therefore, the two paradigms involve distinct notions of knowledge. Whereas the physics paradigm entails a representational notion of knowledge, the engineering paradigm involves the notion of 'knowledge as epistemic tool'. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Formalization of the engineering science discipline - knowledge engineering

    NASA Astrophysics Data System (ADS)

    Peng, Xiao

    Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents such as reports, books, and journals. However, those media have innate weaknesses. For example, four generations of flying-wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB, and many others) were mostly developed in isolation; the later engineers were not aware of the previous developments because those projects were documented in ways that prevented the next generation of engineers from benefiting from the lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from experienced engineers to the next generation. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of knowledge is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa from the field of philosophy, a quantitative knowledge evaluation criterion capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities still needs to be developed. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent development of knowledge management functions. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. Propose and develop an innovative Knowledge-Based System (KBS), the AVD KBS, forming a systematic approach to knowledge management. 4. Demonstrate the efficiency advantages of the AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust, high-quality knowledge management and process tool, the AVD KBS. Formalizing knowledge proves to significantly impact the effectiveness of aerospace knowledge retention and utilization.

  8. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable, and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Center (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. The paper also presents the lessons learned from introducing KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of applying the KM tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a standard practice during the CE process. Establishing this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher-quality designs of space systems.

  9. Satellite Contamination and Materials Outgassing Knowledge base

    NASA Technical Reports Server (NTRS)

    Minor, Jody L.; Kauffman, William J. (Technical Monitor)

    2001-01-01

    Satellite contamination continues to be a design problem that engineers must take into account when developing new satellites. To help with this issue, NASA's Space Environments and Effects (SEE) Program funded the development of the Satellite Contamination and Materials Outgassing Knowledge base. This engineering tool brings together in one location information about the outgassing properties of aerospace materials based upon ground-testing data, the effects of outgassing that have been observed during flight, and measurements of the contamination environment by on-orbit instruments. The knowledge base contains information based on ASTM Standard E-1559 and also consolidates data from missions using quartz-crystal microbalances (QCMs). The data contained in the knowledge base were shared with NASA by US government agencies and industry as well as by international space agencies. The term 'knowledgebase' was used because so much information and capability was brought together in one comprehensive engineering design tool. It is the SEE Program's intent to continually add material contamination data as they become available, creating a dynamic tool whose value to the user is ever increasing. The SEE Program firmly believes that NASA, and ultimately the entire contamination user community, will greatly benefit from this new engineering tool and highly encourages the community not only to use the tool but to add data to it as well.

  10. Development of a Spacecraft Materials Selector Expert System

    NASA Technical Reports Server (NTRS)

    Pippin, G.; Kauffman, W. (Technical Monitor)

    2002-01-01

    This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.
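    The split described above, a generic inference engine driven by a separately maintained knowledge base, is the classic expert-system pattern. The following forward-chaining sketch illustrates it in Python; the rules and facts are invented placeholders and are not the contents of the SMS or BEST knowledge base.

```python
# Minimal forward-chaining inference: the engine is generic, the knowledge base
# (rules + facts) is swapped in separately. Rules here are invented placeholders.

RULES = [
    ({"exterior_surface", "atomic_oxygen_exposure"}, "avoid_silver_coatings"),
    ({"optical_surface", "outgassing_source_nearby"}, "require_low_cvcm_material"),
]

def infer(facts):
    """Apply rules until no new conclusions are produced (a fixed point)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived - set(facts)

print(infer({"optical_surface", "outgassing_source_nearby"}))
# {'require_low_cvcm_material'}
```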

  11. Inductive knowledge acquisition experience with commercial tools for space shuttle main engine testing

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress is reported in building the full system, known as Scotty, including extensions for integrating large databases with it. Inductive knowledge acquisition has proven to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational database management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful, and state-of-the-art application of an expert system.

  12. Project-Based Teaching-Learning Computer-Aided Engineering Tools

    ERIC Educational Resources Information Center

    Simoes, J. A.; Relvas, C.; Moreira, R.

    2004-01-01

    Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play a key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…

  13. A Capstone Wiki Knowledge Base: A Case Study of an Online Tool Designed to Promote Life-Long Learning through Engineering Literature Research

    ERIC Educational Resources Information Center

    Clarke, James B.; Coyle, James R.

    2011-01-01

    This article reports the results of a case study in which an experimental wiki knowledge base was designed, developed, and tested by the Brill Science Library at Miami University for an undergraduate engineering senior capstone project. The wiki knowledge base was created to determine if the science library could enhance the engineering literature…

  14. TARGET's role in knowledge acquisition, engineering, validation, and documentation

    NASA Technical Reports Server (NTRS)

    Levi, Keith R.

    1994-01-01

    We investigate the use of the TARGET task analysis tool in the development of rule-based expert systems. We found TARGET to be very helpful in the knowledge acquisition process. It enabled us to perform knowledge acquisition with one knowledge engineer rather than two. In addition, it improved communication between the domain expert and the knowledge engineer. We also found it to be useful for both the rule development and refinement phases of the knowledge engineering process. Using the network in these phases required us to develop guidelines that enabled us to easily translate the network into production rules. A significant requirement for TARGET remaining useful throughout the knowledge engineering process was the need to carefully maintain consistency between the network and the rule representations. Maintaining consistency not only benefited the knowledge engineering process but also had significant payoffs in the areas of validation of the expert system and documentation of the knowledge in the system.

  15. A quality assessment tool for markup-based clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of mark-up-based clinical guidelines (GLs). Using this graphical tool, the expert physician and the knowledge engineer collaborate to score, on a predefined scoring scale, each of the knowledge roles of the mark-ups, comparing them to a gold standard. The tool enables different users at different locations to score the mark-ups simultaneously.
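    The scoring idea, comparing each knowledge role of a mark-up against a gold standard on a predefined scale, can be sketched as follows. The roles, the 0-2 scale, and the similarity rule are assumptions made for illustration, not the published tool's scheme.

```python
# Hypothetical sketch of scoring marked-up guideline knowledge roles against a gold standard.
GOLD = {"eligibility": "age >= 18 and pregnant", "action": "administer clomiphene", "schedule": "days 5-9"}
MARKUP = {"eligibility": "age >= 18 and pregnant", "action": "administer clomiphene citrate", "schedule": ""}

def score_role(gold: str, marked: str) -> int:
    """0 = missing or wrong, 1 = partial word overlap, 2 = exact match (illustrative scale)."""
    if not marked:
        return 0
    if marked == gold:
        return 2
    return 1 if set(gold.split()) & set(marked.split()) else 0

scores = {role: score_role(GOLD[role], MARKUP.get(role, "")) for role in GOLD}
print(scores, "total:", sum(scores.values()), "out of", 2 * len(GOLD))
```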

  16. Knowledge-Acquisition Tool For Expert System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.

    1988-01-01

    Digital flight-control systems monitored by computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use knowledge-acquisition tool for expert-system flight-status monitor supplying interpretative data. Interpretative function especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of performances of advanced aircraft systems.

  17. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  18. Hubble Space Telescope Design Engineering Knowledgebase (HSTDEK)

    NASA Technical Reports Server (NTRS)

    Johannes, James D.; Everetts, Clark

    1989-01-01

    The research covered here pays specific attention to the development of tools to assist knowledge engineers in acquiring knowledge and to assist other technical, engineering, and management personnel in automatically performing knowledge capture as part of their everyday work without adding any extra work to what they already do. Requirements for data products, the knowledge base, and methods for mapping knowledge in the documents onto the knowledge representations are discussed, as are some of the difficulties of capturing in the knowledge base the structure of the design process itself, along with a model of the system designed. The capture of knowledge describing the interactions of different components is also discussed briefly.

  19. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    ERIC Educational Resources Information Center

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  20. Perspectives on knowledge in engineering design

    NASA Technical Reports Server (NTRS)

    Rasdorf, W. J.

    1985-01-01

    Various perspectives are given on the knowledge currently used in engineering design, specifically dealing with knowledge-based expert systems (KBES). Constructing an expert system often reveals inconsistencies in domain knowledge while formalizing it. The types of domain knowledge (facts, procedures, judgments, and control) differ from the classes of that knowledge (creative, innovative, and routine). The feasible tasks for expert systems can be determined based on these types and classes of knowledge. Interpretive tasks require reasoning about a task in light of the knowledge available, whereas generative tasks create potential solutions to be tested against constraints. Only after classifying the domain by type and level can the engineer select a knowledge-engineering tool for the domain being considered. The critical features to be weighed after classification are knowledge representation techniques, control strategies, interface requirements, compatibility with traditional systems, and economic considerations.

  1. Knowledge engineering in volcanology: Practical claims and general approach

    NASA Astrophysics Data System (ADS)

    Pshenichny, Cyril A.

    2014-10-01

    Knowledge engineering, being a branch of artificial intelligence, offers a variety of methods for elicitation and structuring of knowledge in a given domain. Only a few of them (ontologies and semantic nets, event/probability trees, Bayesian belief networks and event bushes) are known to volcanologists. Meanwhile, the tasks faced by volcanology and the solutions found so far favor a much wider application of knowledge engineering, especially tools for handling dynamic knowledge. This raises some fundamental logical and mathematical problems and requires an organizational effort, but may strongly improve panel discussions, enhance decision support, optimize physical modeling and support scientific collaboration.
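    Of the tools listed above, the event/probability tree is the simplest to picture: the probability of a branch is the product of the conditional probabilities along its path. A tiny sketch with invented events and numbers:

```python
# Toy event/probability tree; events and probabilities are invented for illustration.
TREE = {
    "unrest": (0.30, {                  # P(unrest)
        "magmatic": (0.60, {            # P(magmatic | unrest)
            "eruption": (0.40, {}),     # P(eruption | magmatic unrest)
        }),
    }),
}

def branch_probability(tree, path, p=1.0):
    """Multiply conditional probabilities along a path through the tree."""
    if not path:
        return p
    prob, children = tree[path[0]]
    return branch_probability(children, path[1:], p * prob)

print(branch_probability(TREE, ["unrest", "magmatic", "eruption"]))   # 0.3 * 0.6 * 0.4 = 0.072
```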

  2. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated with examples taken from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  3. Psychological tools for knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rueter, Henry H.; Olson, Judith Reitman

    1988-01-01

    Knowledge acquisition is said to be the biggest bottleneck in the development of expert systems. The problem is getting the knowledge out of the expert's head and into a computer. In cognitive psychology, characterizing mental structures and why experts are good at what they do is an important research area. Is there some way that the tools psychologists have developed to uncover mental structure can be used to benefit knowledge engineers? We think that the way to find out is to browse through the psychologist's toolbox to see what there is in it that might be of use to knowledge engineers. Expert system developers have relied on two standard methods for extracting knowledge from the expert: (1) the knowledge engineer engages in an intense bout of interviews with the expert or experts, or (2) the knowledge engineer becomes an expert himself, relying on introspection to uncover the basis of his own expertise. Unfortunately, these techniques have the difficulty that often the expert himself isn't consciously aware of the basis of his expertise. If the expert himself isn't conscious of how he solves problems, introspection is useless. Cognitive psychology has faced similar problems for many years and has developed exploratory methods that can be used to discover cognitive structure from simple data.

  4. Knowledge-based environment for optical system design

    NASA Astrophysics Data System (ADS)

    Johnson, R. Barry

    1991-01-01

    Optical systems are extensively utilized by industry, government, and military organizations. The conceptual design, engineering design, fabrication, and testing of these systems presently require significant time, typically on the order of 3-5 years. The Knowledge-Based Environment for Optical System Design (KB-OSD) Program has as its principal objectives the development of a methodology and tool(s) that will make a notable reduction in the development time of optical system projects, reduce technical risk and overall cost. KB-OSD can be considered as a computer-based optical design associate for system engineers and design engineers. By utilizing artificial intelligence technology coupled with extensive design/evaluation computer application programs and knowledge bases, the KB-OSD will provide the user with assistance and guidance to accomplish such activities as (i) develop system-level and hardware-level requirements from mission requirements, (ii) formulate conceptual designs, (iii) construct a statement of work for an RFP, (iv) develop engineering-level designs, (v) evaluate an existing design, and (vi) explore the sensitivity of a system to changing scenarios. The KB-OSD comprises a variety of computer platforms including a Stardent Titan supercomputer, numerous design programs (lens design, coating design, thermal, materials, structural, atmospherics, etc.), databases, and heuristic knowledge bases. An important element of the KB-OSD Program is the inclusion of the knowledge of individual experts in various areas of optics and optical system engineering. This knowledge is obtained by KB-OSD knowledge engineers performing...

  5. Representing Human Expertise by the OWL Web Ontology Language to Support Knowledge Engineering in Decision Support Systems.

    PubMed

    Ramzan, Asia; Wang, Hai; Buckingham, Christopher

    2014-01-01

    Clinical decision support systems (CDSSs) often base their knowledge and advice on human expertise. Knowledge representation needs to be in a format that can be easily understood by human users as well as supporting ongoing knowledge engineering, including evolution and consistency of knowledge. This paper reports on the development of an ontology specification for managing knowledge engineering in a CDSS for assessing and managing risks associated with mental-health problems. The Galatean Risk and Safety Tool, GRiST, represents mental-health expertise in the form of a psychological model of classification. The hierarchical structure was directly represented in the machine using an XML document. Functionality of the model and knowledge management were controlled using attributes in the XML nodes, with an accompanying paper manual for specifying how end-user tools should behave when interfacing with the XML. This paper explains the advantages of using the web-ontology language, OWL, as the specification, details some of the issues and problems encountered in translating the psychological model to OWL, and shows how OWL benefits knowledge engineering. The conclusions are that OWL can have an important role in managing complex knowledge domains for systems based on human expertise without impeding the end-users' understanding of the knowledge base. The generic classification model underpinning GRiST makes it applicable to many decision domains and the accompanying OWL specification facilitates its implementation.
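    The translation step described above, from a hierarchical XML model to OWL classes, can be sketched with rdflib (assuming rdflib is installed). The namespace and node names below are illustrative only and are not the GRiST ontology.

```python
# Sketch of turning a small hierarchical XML model into OWL classes with rdflib.
# The hierarchy becomes rdfs:subClassOf axioms; names and namespace are illustrative.
import xml.etree.ElementTree as ET

from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

XML_MODEL = """
<concept name="Risk">
  <concept name="Suicide">
    <concept name="Intention"/>
  </concept>
  <concept name="SelfNeglect"/>
</concept>
"""

EX = Namespace("http://example.org/mentalhealth#")
g = Graph()
g.bind("ex", EX)

def add_classes(node, parent=None):
    cls = EX[node.get("name")]
    g.add((cls, RDF.type, OWL.Class))
    if parent is not None:
        g.add((cls, RDFS.subClassOf, parent))
    for child in node:
        add_classes(child, cls)

add_classes(ET.fromstring(XML_MODEL))
print(g.serialize(format="turtle"))
```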

  6. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. The knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use are discussed. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  7. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. This paper discusses the knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  8. On the acquisition and representation of procedural knowledge

    NASA Technical Reports Server (NTRS)

    Saito, T.; Ortiz, C.; Loftin, R. B.

    1992-01-01

    Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, and the focus then turns to knowledge acquisition for procedural tasks, with special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge, TARGET (Task Analysis and Rule Generation Tool), are described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulties, and costs of developing knowledge-based systems for the performance of procedural tasks.

  9. TARGET: Rapid Capture of Process Knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.

    1993-01-01

    TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper describes the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers, and managers are discussed, including the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach used to generate production rules for incorporation in and development of a CLIPS-based expert system is elaborated. TARGET also permits experts to visually describe procedural tasks, serving as a common medium for knowledge refinement by the expert community and the knowledge engineer and making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET. A description of efforts to support TARGET's interoperability on PCs, Macintoshes, and UNIX workstations concludes the paper.
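    The rule-generation step described above, turning an ordered task procedure into production rules for a CLIPS-based expert system, can be sketched roughly as follows. The task steps and the rule template are invented for illustration and are not TARGET's actual output format.

```python
# Hypothetical sketch: emit CLIPS-style production rules from an ordered task procedure.
STEPS = ["verify-power", "open-valve", "start-pump"]

def steps_to_clips(steps):
    rules = []
    for i, step in enumerate(steps):
        trigger = "(start)" if i == 0 else f"(completed {steps[i - 1]})"
        rules.append(
            f"(defrule do-{step}\n"
            f"   {trigger}\n"
            f"   =>\n"
            f"   (assert (completed {step})))"
        )
    return "\n\n".join(rules)

print(steps_to_clips(STEPS))
```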

  10. Impact of distributed virtual reality on engineering knowledge retention and student engagement

    NASA Astrophysics Data System (ADS)

    Sulbaran, Tulio Alberto

    Engineering education is facing many problems, one of which is poor knowledge retention among engineering students. This problem affects the Architecture, Engineering, and Construction (A/E/C) industry, because students are unprepared for many necessary job skills. Poor knowledge retention is caused by many factors, one of which is the mismatch between student learning preferences and the media used to teach engineering. The purpose of this research is to assess the impact of Distributed Virtual Reality (DVR) as an engineering teaching tool. The implementation of DVR addresses the issue of poor knowledge retention by targeting the mismatch between learning and teaching styles on the visual versus verbal spectrum. Using as a point of departure three knowledge domain areas (Learning and Instruction, Distributed Virtual Reality, and Crane Selection as Part of Crane Lift Planning), a DVR engineering teaching tool is developed, deployed, and assessed in engineering classrooms. The statistical analysis of the data indicates that: (1) most engineering students are visual learners; (2) most students would like more classes using DVR; (3) engineering students find DVR more engaging than traditional learning methods; (4) most students find the responsiveness of the DVR environments to be either good or very good; (5) all students are able to interact with DVR and most of the students found it easy or very easy to navigate (without previous formal training in how to use DVR); (6) students' knowledge regarding the subject (crane selection) is higher after the experiment; and (7) students using different instructional media do not demonstrate a statistically significant difference in knowledge retained after the experiment. This inter-disciplinary research offers opportunities for direct and immediate application in education, research, and industry, because the instructional module developed (on crane selection as part of construction crane lift planning) can be used to convey knowledge to engineers beyond the classroom. This instructional module can also be used as a workbench to assess parameters in engineering education such as time on task, assessment media, and long-term retention, among others.

  11. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor and time intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).
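    A knowledge model of this kind ultimately turns an expert's informal judgement ("this vibration band looks too energetic") into explicit numeric rules a computer can apply. Below is a minimal hedged sketch of that bridge; the feature, threshold, and sample values are invented and are not the actual HPOTP criteria.

```python
# Minimal sketch: encode an informal expert judgement as an explicit numeric rule.
# Feature name, threshold, and sample data are invented for illustration.
import statistics

def anomalous_samples(band_amplitudes_g, limit_sigma=2.0):
    """Flag samples exceeding the mean by limit_sigma standard deviations."""
    mu = statistics.fmean(band_amplitudes_g)
    sigma = statistics.stdev(band_amplitudes_g)
    return [i for i, a in enumerate(band_amplitudes_g) if a > mu + limit_sigma * sigma]

samples = [0.8, 0.9, 0.85, 0.82, 0.88, 2.6, 0.87]   # synchronous-band amplitudes (fake)
print(anomalous_samples(samples))                    # [5] -> the suspicious sample
```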

  12. Information Communication using Knowledge Engine on Flood Issues

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and a poor general science background. To improve communication with such an audience, we have introduced a new way to get information on flood-related issues in IFIS: instead of navigating among hundreds of features and interfaces of the information system and web-based sources, users obtain dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges, in-house data sources, and analysis and visualization tools to answer questions grouped into several categories. Users are able to provide input for queries within the categories of rainfall, flood conditions, forecasts, inundation maps, flood risk, and data sensors. Our goal is the systematization of knowledge on flood-related issues and a single source for definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone and to provide an educational geoinformatics tool. A future implementation of the system will accept free-form input and support voice recognition within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses the future plans for providing knowledge on flood-related issues and resources.

  13. Exploring Engineering instructors' views about writing and online tools to support communication in Engineering

    NASA Astrophysics Data System (ADS)

    Howard, Sarah K.; Khosronejad, Maryam; Calvo, Rafael A.

    2017-11-01

    To be fully prepared for the professional workplace, Engineering students need to be able to communicate effectively. However, there has been a growing concern in the field about students' preparedness for this aspect of their future work. It is argued that online writing tools, which can engage large numbers of students in the writing process, can support feedback on and development of writing in engineering on a larger scale. Through interviews and questionnaires, this study explores engineering academics' perceptions of writing to better understand how online writing tools may be integrated into their teaching. Results suggest that writing is viewed positively in the discipline, but it is not believed to be essential to success in engineering. Online writing tools were believed to support a larger number of students, but low knowledge of the tools limited academics' understanding of their usefulness in teaching and learning. Implications for innovation in undergraduate teaching are discussed.

  14. Engineering Yarrowia lipolytica for Use in Biotechnological Applications: A Review of Major Achievements and Recent Innovations.

    PubMed

    Madzak, Catherine

    2018-06-25

    Yarrowia lipolytica is an oleaginous saccharomycetous yeast with a long history of industrial use. It aroused interest several decades ago as a host for heterologous protein production. Thanks to the development of numerous molecular and genetic tools, Y. lipolytica is now a recognized system for expressing heterologous genes and secreting the corresponding proteins of interest. As genomic and transcriptomic tools have increased our basic knowledge of this yeast, we can now envision engineering its metabolic pathways for use as a whole-cell factory in various bioconversion processes. Y. lipolytica is currently being developed as a workhorse for biotechnology, notably for single-cell oil production and the upgrading of industrial wastes into valuable products. As it becomes more and more difficult to keep up with an ever-increasing literature on Y. lipolytica engineering technology, this article aims to provide basic and up-to-date knowledge on this research area. The most useful reviews on Y. lipolytica biology, use, and safety are highlighted, together with a summary of the engineering tools available in this yeast. This mini-review then focuses on recently developed tools and engineering strategies, with a particular emphasis on promoter tuning, metabolic pathway assembly, and genome editing technologies.

  15. Solving ordinary differential equations by electrical analogy: a multidisciplinary teaching tool

    NASA Astrophysics Data System (ADS)

    Sanchez Perez, J. F.; Conesa, M.; Alhama, I.

    2016-11-01

    Ordinary differential equations are the mathematical formulation for a great variety of problems in science and engineering, and frequently two different problems are equivalent from a mathematical point of view when they are formulated with the same equations. Students acquire the knowledge of how to solve these equations (at least some types of them) using protocols and strict algorithms of mathematical calculation, without thinking about the meaning of the equation. The aim of this work is for students to learn to design network models or circuits: with simple knowledge of circuits, students can establish the association between electric circuits and differential equations and their formal equivalences, which allows them to connect knowledge from two disciplines and promotes the use of this interdisciplinary approach to address complex problems. They therefore learn to use a multidisciplinary tool that allows them, even in the first year of an engineering course, to solve these kinds of equations, whatever the order, degree, or type of non-linearity. This methodology has been implemented in numerous final degree projects in engineering and science, e.g., chemical engineering, building engineering, industrial engineering, mechanical engineering, architecture, etc. Applications are presented to illustrate the subject of this manuscript.
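    The core equivalence the authors exploit, a first-order ODE and an RC charging circuit sharing the same mathematical form, can be checked numerically in a few lines (illustrative parameter values; scipy is assumed to be available):

```python
# The electrical analogy: a first-order ODE   tau * dy/dt + y = u
# has the same form as an RC charging circuit R*C * dV/dt + V = Vs.
# Parameter values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

tau, u = 2.0, 5.0              # generic ODE: time constant and forcing
R, C, Vs = 1000.0, 0.002, 5.0  # RC circuit chosen so that R*C = tau and Vs = u

t = np.linspace(0.0, 10.0, 50)
ode = solve_ivp(lambda _, y: (u - y) / tau, (0.0, 10.0), [0.0], t_eval=t)
rc = solve_ivp(lambda _, v: (Vs - v) / (R * C), (0.0, 10.0), [0.0], t_eval=t)

print(np.allclose(ode.y[0], rc.y[0]))   # True: the two problems are the same equation
```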

  16. Relationship of prior knowledge and working engineers' learning preferences: implications for designing effective instruction

    NASA Astrophysics Data System (ADS)

    Baukal, Charles E.; Ausburn, Lynna J.

    2017-05-01

    Continuing engineering education (CEE) is important to ensure engineers maintain proficiency over the life of their careers. However, relatively few studies have examined designing effective training for working engineers. Research has indicated that both learner instructional preferences and prior knowledge can impact the learning process, but it has not established if these factors are interrelated. The study reported here considered relationships of prior knowledge and three aspects of learning preferences of working engineers at a manufacturing company: learning strategy choices, verbal-visual cognitive styles, and multimedia preferences. Prior knowledge was not found to be significantly related to engineers' learning preferences, indicating independence of effects of these variables on learning. The study also examined relationships of this finding to the Multimedia Cone of Abstraction and implications for its use as an instructional design tool for CEE.

  17. Using social media to facilitate knowledge transfer in complex engineering environments: a primer for educators

    NASA Astrophysics Data System (ADS)

    Murphy, Glen; Salomone, Sonia

    2013-03-01

    While highly cohesive groups are potentially advantageous they are also often correlated with the emergence of knowledge and information silos based around those same functional or occupational clusters. Consequently, an essential challenge for engineering organisations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. This paper acts as a primer for those seeking to gain an understanding of the design, functionality and utility of a suite of software tools generically termed social media technologies in the context of optimising the management of tacit engineering knowledge. Underpinned by knowledge management theory and using detailed case examples, this paper explores how social media technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering environments.

  18. A knowledge engineering framework towards clinical support for adverse drug event prevention: the PSIP approach.

    PubMed

    Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos

    2009-01-01

    Adverse Drug Events (ADEs) are currently considered as a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose to knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.

  19. Reverse engineering of integrated circuits

    DOEpatents

    Chisholm, Gregory H.; Eckmann, Steven T.; Lain, Christopher M.; Veroff, Robert L.

    2003-01-01

    Software and a method therein to analyze circuits. The software comprises several tools, each of which performs a particular function in the reverse engineering process. The analyst, through a standard interface, directs each tool to the portion of the task to which it is best suited, rendering previously intractable problems solvable. The tools are generally used iteratively to produce a successively more abstract picture of a circuit about which incomplete a priori knowledge exists.

  20. Problem Solving and Training Guide for Shipyard Industrial Engineers

    DTIC Science & Technology

    1986-06-01

    ... called upon to increase the knowledge about industrial engineering of some shipyard group. The Curriculum is seen especially as a tool to identify new ... materials on all common machine shop tools. Data permits calculation of machining time. (Ostwald, Phillip F., American Machinist Manufacturing Cost ...)

  1. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  2. Knowledge Assisted Integrated Design of a Component and Its Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Gautham, B. P.; Kulkarni, Nagesh; Khan, Danish; Zagade, Pramod; Reddy, Sreedhar; Uppaluri, Rohith

    Integrated design of a product and its manufacturing processes can significantly reduce the total cost of the product as well as the cost of its development. However, this is only possible with a platform that allows us to link together, in a closed loop, the simulation tools used for product design, performance evaluation, and manufacturing processes. In addition, a comprehensive knowledge base that provides systematic, knowledge-guided assistance to product or process designers, who may not possess in-depth design knowledge or in-depth knowledge of the simulation tools, would significantly speed up the end-to-end design process. In this paper, we propose a process, and illustrate a case, for achieving an integrated product and manufacturing process design assisted by knowledge support for the user to make decisions at various stages. We take transmission component design as an example. The example illustrates the design of a gear, covering its geometry, material selection, and manufacturing processes, particularly carburizing-quenching and tempering, and feeding the material properties predicted during heat treatment into performance estimation in a closed loop. It also identifies and illustrates various decision stages in the integrated life cycle and discusses the use of knowledge engineering tools, such as rule-based guidance, to assist the designer in making informed decisions. Simulation tools developed on various commercial and open-source platforms, as well as in-house tools, are linked together with knowledge engineering tools to build a framework with appropriate navigation through user-friendly interfaces. This is illustrated through examples in this paper.
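    The rule-based guidance mentioned above can be pictured as a small decision table consulted between simulation steps. The rules, property names, and thresholds below are invented for illustration and are not the knowledge base described in the paper.

```python
# Hypothetical rule-based guidance: between simulation steps, simple rules suggest
# the next decision. Rules and threshold values are invented for illustration.
def suggest_next_step(surface_hardness_hrc: float, core_toughness_j: float) -> str:
    if surface_hardness_hrc < 58:
        return "increase carburizing time or carbon potential"
    if core_toughness_j < 40:
        return "raise tempering temperature to restore core toughness"
    return "accept the current carburizing-quenching-tempering schedule"

# Values as they might come back from a heat-treatment simulation (fabricated numbers)
print(suggest_next_step(surface_hardness_hrc=55.0, core_toughness_j=48.0))
```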

  3. Knowledge-based assistance in costing the space station DMS

    NASA Technical Reports Server (NTRS)

    Henson, Troy; Rone, Kyle

    1988-01-01

    The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
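    A tool of this kind pairs an algorithmic core with knowledge-based adjustments. As a generic, hedged illustration of that pairing only: the formula shape below is COCOMO-like, and the coefficients and multipliers are invented; they do not represent IBM's SCE methodology or SCEAT.

```python
# Generic illustration of an algorithmic cost core plus a knowledge-based adjustment.
# Coefficients and multipliers are invented and do not represent SCE/SCEAT.
def effort_person_months(ksloc: float, criticality: str) -> float:
    base = 2.8 * ksloc ** 1.12                                    # algorithmic core
    multipliers = {"low": 0.9, "nominal": 1.0, "onboard-critical": 1.4}
    return base * multipliers[criticality]                        # knowledge-based adjustment

print(round(effort_person_months(50.0, "onboard-critical"), 1))
```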

  4. Microblogging in Higher Education: Digital Natives, Knowledge Creation, Social Engineering, and Intelligence Analysis of Educational Tweets

    ERIC Educational Resources Information Center

    Cleveland, Simon; Jackson, Barcus C.; Dawson, Maurice

    2016-01-01

    With the rise of Web 2.0, microblogging has become a widely accepted phenomenon for sharing information. Moreover, the Twitter platform has become the tool of choice for universities looking to increase their digital footprint. However, scant research addresses the viability of microblogging as a tool to facilitate knowledge creation practices…

  5. An application of object-oriented knowledge representation to engineering expert systems

    NASA Technical Reports Server (NTRS)

    Logie, D. S.; Kamil, H.; Umaretiya, J. R.

    1990-01-01

    The paper describes an object-oriented knowledge representation and its application to engineering expert systems. The object-oriented approach promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects and organized by defining relationships between the objects. An Object Representation Language (ORL) was implemented as a tool for building and manipulating the object base. Rule-based knowledge representation is then used to simulate engineering design reasoning. Using a common object base, very large expert systems can be developed from small, individually processed rule sets. The integration of these two schemes makes it easier to develop practical engineering expert systems. The general approach to applying this technology to the domain of the finite element analysis, design, and optimization of aerospace structures is discussed.
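
    A minimal sketch of the two schemes operating over a common object base follows; the object attributes, the relation name, and the single rule are hypothetical and do not reproduce ORL.

      # Objects with attributes and relations, plus a small rule set that
      # reasons over them; names and values are illustrative only.

      class DesignObject:
          def __init__(self, name, **attributes):
              self.name = name
              self.attributes = dict(attributes)
              self.relations = {}  # relation name -> list of related objects

          def relate(self, relation, other):
              self.relations.setdefault(relation, []).append(other)

      def rule_overstressed(obj):
          """Fire when the computed stress exceeds the allowable attribute."""
          a = obj.attributes
          if "stress" in a and "allowable" in a and a["stress"] > a["allowable"]:
              return f"{obj.name}: overstressed, consider resizing"
          return None

      if __name__ == "__main__":
          beam = DesignObject("beam-1", stress=250.0, allowable=200.0)
          panel = DesignObject("panel-3")
          beam.relate("supports", panel)
          for rule in (rule_overstressed,):
              message = rule(beam)
              if message:
                  print(message)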

  6. Protein engineering and its applications in food industry.

    PubMed

    Kapoor, Swati; Rafiq, Aasima; Sharma, Savita

    2017-07-24

    Protein engineering is a young discipline that has branched out from the field of genetic engineering. It draws on available knowledge of protein structure and function, tools and instruments, software, bioinformatics databases, cloned genes, vectors, recombinant strains, and other materials that can be used to alter the protein backbone. A properly engineered protein is one that still folds correctly and performs its particular function(s) efficiently after being modified. Proteins can be modified through their genes or chemically; modification through the gene is generally easier. There is no strict limit to what counts as a protein engineering tool: any technique that changes the amino acid composition of a protein and thereby modifies its structure or function falls within the scope of protein engineering, although a number of common tools are used to reach specific targets. The field has produced more active industrial and pharmaceutical proteins, introducing new functions and changing how proteins interact with their surrounding environment. A variety of protein engineering applications have been reported in the literature, ranging from biocatalysis for the food industry to environmental, medical, and nanobiotechnology applications. Successful combinations of protein engineering methods have yielded good results in the food industry and have created scope for maintaining the quality of finished products after processing.

  7. The Study on Collaborative Manufacturing Platform Based on Agent

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-yan; Qu, Zheng-geng

    To address the increasingly knowledge-intensive nature of collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. Through the wrapper services and communication capabilities the agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval, and reuse of manufacturing knowledge, a generalized knowledge repository built on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.

  8. Engineering Education for a New Era

    NASA Astrophysics Data System (ADS)

    Ohgaki, Shinichiro

    Engineering education is composed of five components: the idea of what engineering education ought to be, the knowledge in engineering fields, those who learn engineering, those who teach engineering, and the stakeholders in engineering issues. The characteristics of all five components are changing with the times, and when we consider engineering education for the next era, we should analyze the changes in each of them. In particular, the knowledge and tools in engineering fields have been expanding, while advanced science and technology also casts a partial shadow over modern, convenient life. Moral rules and ethics for developing new products and engineering systems are now regarded as among the most important concerns in engineering. All those who take responsibility for engineering education should understand the changes in all of its components and have a clear grasp of the essence of engineering for a sustainable society.

  9. Criterion-Referenced Test Items for Small Engines.

    ERIC Educational Resources Information Center

    Herd, Amon

    This notebook contains criterion-referenced test items for testing students' knowledge of small engines. The test items are based upon competencies found in the Missouri Small Engine Competency Profile. The test item bank is organized in 18 sections that cover the following duties: shop procedures; tools and equipment; fasteners; servicing fuel…

  10. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  11. Engineering the future with America's high school students

    NASA Technical Reports Server (NTRS)

    Farrance, M. A.; Jenner, J. W.

    1993-01-01

    The number of students enrolled in engineering is declining while the need for engineers is increasing. One contributing factor is that most high school students have little or no knowledge about what engineering is, or what engineers do. To teach young students about engineering, engineers need good tools. This paper presents a course of study developed and used by the authors in a junior college course for high school students. Students learned about engineering through independent student projects, in-class problem solving, and use of career information resources. Selected activities from the course can be adapted to teach students about engineering in other settings. Among the most successful techniques were the student research paper assignments, working out a solution to an engineering problem as a class exercise, and the use of technical materials to illustrate engineering concepts and demonstrate 'tools of the trade'.

  12. VIP: A knowledge-based design aid for the engineering of space systems

    NASA Technical Reports Server (NTRS)

    Lewis, Steven M.; Bellman, Kirstie L.

    1990-01-01

    The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
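
    The behaviour described, solving whatever equations become solvable as parameter values arrive, can be sketched with a symbolic algebra library; the example below uses sympy and the ideal rocket equation purely for illustration and is not VIP's model library.

      # Solve for the remaining unknown once enough values are supplied.
      # The rocket equation and the numbers are illustrative only.
      import sympy as sp

      r, isp, g0, dv = sp.symbols("r isp g0 dv", positive=True)  # r = m0/mf
      rocket_eq = sp.Eq(dv, isp * g0 * sp.log(r))

      known = {isp: 320, g0: 9.81, dv: 3200}  # values the user has entered so far
      solutions = sp.solve(rocket_eq.subs(known), r)
      print("required mass ratio m0/mf:", [float(sp.N(s)) for s in solutions])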

  13. Knowledge Preservation Techniques That Can Facilitate Intergroup Communications

    NASA Technical Reports Server (NTRS)

    Moreman, Douglas; Dyer, John; Coffee, John; Noga, Donald F. (Technical Monitor)

    2000-01-01

    We have developed tools, both social methods and software, for (1) acquiring technical knowledge from engineers and scientists, (2) preserving that knowledge, and (3) making the totality of our stored knowledge rapidly searchable. Our motivation has mainly been to preserve the rare knowledge of senior engineers who are near retirement; historians have also pointed out the historical value of such knowledge and of our tools. We now propose applying these tools to enhancing communication among groups working jointly on a project, with the greatest value for projects whose groups communicate rarely and incompletely. We propose that discussions among members of a group be recorded in audio and that the audio, its transcription, and optional other pieces be combined into electronic, webpage-like "books". These books can then be searched rapidly by interested people in other groups. At points of particular interest, a searcher can zoom in on the text and even on the original recordings to pick up nuances (e.g., to distinguish an utterance said in seriousness from one said in sarcasm). In this manner, not only can potentially valuable technical details be preserved for the future, but communication can be enhanced during the life of a joint undertaking.

  14. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  15. Designing computer learning environments for engineering and computer science: The scaffolded knowledge integration framework

    NASA Astrophysics Data System (ADS)

    Linn, Marcia C.

    1995-06-01

    Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.

  16. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing: Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing.
    Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate a 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain.
    Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  17. A top-down approach in control engineering third-level teaching: The case of hydrogen-generation

    NASA Astrophysics Data System (ADS)

    Setiawan, Eko; Habibi, M. Afnan; Fall, Cheikh; Hodaka, Ichijo

    2017-09-01

    This paper presents a top-down approach to third-level teaching of control engineering. The paper shows a control engineering solution to a practical implementation problem in order to motivate students. The proposed strategy focuses on a single control engineering technique so as to guide students correctly. The proposed teaching steps are: 1) defining the problem, 2) listing the acquired knowledge and required skills, 3) selecting one control engineering technique, 4) arranging the order of teaching (problem introduction, implementation of the control technique, explanation of the system block diagram, model derivation, controller design), and 5) enriching knowledge with other control techniques. The approach highlights hardware implementation and the use of software simulation as a self-learning tool for students.

  18. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  19. A study of diverse clinical decision support rule authoring environments and requirements for integration

    PubMed Central

    2012-01-01

    Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise wide, ambulatory only, and system specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting. PMID:23145874
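
    As an illustration of what a machine-executable reminder rule can look like in a neutral declarative form, the sketch below encodes one hypothetical rule and evaluates it against a toy patient record; the field names, thresholds, and rule content are assumptions, not the schema of any of the RAEs studied.

      # One hypothetical reminder rule in a simple declarative form,
      # plus an evaluator; not the format of any Partners Healthcare RAE.
      from datetime import date

      rule = {
          "id": "reminder-hba1c",
          "applies_if": {"diagnosis": "diabetes mellitus"},
          "overdue_after_days": 180,
          "action": "remind: order HbA1c",
      }

      def evaluate(rule, patient):
          if rule["applies_if"]["diagnosis"] not in patient["diagnoses"]:
              return None
          days_since = (date.today() - patient["last_hba1c"]).days
          return rule["action"] if days_since > rule["overdue_after_days"] else None

      patient = {"diagnoses": ["diabetes mellitus"], "last_hba1c": date(2012, 1, 15)}
      print(evaluate(rule, patient))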

  20. The Critical Incident Technique: An Effective Tool for Gathering Experience from Practicing Engineers

    ERIC Educational Resources Information Center

    Hanson, James H.; Brophy, Patrick D.

    2012-01-01

    Not all knowledge and skills that educators want to pass to students exists yet in textbooks. Some still resides only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…

  1. Use of Concept Maps as an Assessment Tool in Mechanical Engineering Education

    ERIC Educational Resources Information Center

    Tembe, B. L.; Kamble, S. K.

    2013-01-01

    The purpose of this study is to investigate how third-year mechanical engineering students are able to use their knowledge of concept maps in their study of the topic "Introduction to the Internal Combustion Engines (IICE)". Forty-one students participated in this study. Firstly, the students were taught about concept maps and then asked to…

  2. Supplemental knowledge acquisition through external product interface for CLIPS

    NASA Technical Reports Server (NTRS)

    Saito, Tim; Ebaud, Stephen; Loftin, Bowen R.

    1990-01-01

    Traditionally, the acquisition of knowledge for expert systems has consisted of the interview process with the domain or subject matter expert (SME), observation of the domain environment, and information gathering and research, which together constitute a direct form of knowledge acquisition (KA). The knowledge engineer is responsible for accumulating pertinent information and/or knowledge from the SME(s) for input into the appropriate expert system development tool. The direct KA process may or may not include data or documentation taken from the SME's surroundings. The difference between direct KA and supplemental (indirect) KA lies in how data are used. In acquiring supplemental knowledge, the knowledge engineer accesses other types of evidence (manuals, documents, data files, spreadsheets, etc.) that support the reasoning or premises of the SME. When an expert makes a decision in a particular task, one tool used to justify a recommendation might be a spreadsheet total or column figure; locating the specific decision points within that data in the SME's framework constitutes supplemental KA. Data used for a specific purpose in one system or environment can then be used as supplemental knowledge for another, specifically a CLIPS project.

  3. A concept ideation framework for medical device design.

    PubMed

    Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar

    2015-06-01

    Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.
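
    A minimal sketch of function-based searching over a tiny hand-built graph is shown below using rdflib and SPARQL; the namespace, devices, and functions are invented and do not come from CIFMeDD's ontologies.

      # Query a toy graph for devices that share a function; the triples
      # are invented for illustration.
      from rdflib import Graph, Namespace

      EX = Namespace("http://example.org/meddev#")
      g = Graph()
      g.add((EX.forceps, EX.hasFunction, EX.grasp_tissue))
      g.add((EX.stapler, EX.hasFunction, EX.join_tissue))
      g.add((EX.suture_device, EX.hasFunction, EX.join_tissue))

      query = """
      SELECT ?device WHERE {
          ?device <http://example.org/meddev#hasFunction>
                  <http://example.org/meddev#join_tissue> .
      }
      """
      for row in g.query(query):
          print(row.device)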

  4. Attitudes, Perceptions, and Behavioral Intentions of Engineering Workers toward Web 2.0 Tools in the Workplace

    ERIC Educational Resources Information Center

    Krause, Jaclyn A.

    2010-01-01

    As Web 2.0 tools and technologies increase in popularity in consumer markets, enterprises are seeking ways to take advantage of the rich social knowledge exchanges that these tools offer. The problem this study addresses is that it remains unknown whether employees perceive that these tools offer value to the organization and therefore will be…

  5. Traffic signal operations workshop : an engineer's guide to traffic signal timing and design.

    DOT National Transportation Integrated Search

    2008-01-01

    Scope: (1) Workshop is intended to show engineers and technicians how various guidelines and tools can be used to develop effective signal timing and detection design, (2) Participant is assumed to have a working knowledge of traffic signal equipment...

  6. Towards a theoretical clarification of biomimetics using conceptual tools from engineering design.

    PubMed

    Drack, M; Limpinsel, M; de Bruyn, G; Nebelsick, J H; Betz, O

    2017-12-13

    Many successful examples of biomimetic products are available, and most research efforts in this emerging field are directed towards the development of specific applications. The theoretical and conceptual underpinnings of the knowledge transfer between biologists, engineers and architects are, however, poorly investigated. The present article addresses this gap. We use a 'technomorphic' approach, i.e. the application of conceptual tools derived from engineering design, to better understand the processes operating during a typical biomimetic research project. This helps to elucidate the formal connections between functions, working principles and constructions (in a broad sense), because the 'form-function relationship' is a recurring issue in biology and engineering. The presented schema also serves as a conceptual framework that can be implemented for future biomimetic projects. The concepts of 'function' and 'working principle' are identified as the core elements in the biomimetic knowledge transfer towards applications. This schema not only facilitates the development of a common language in the emerging science of biomimetics, but also promotes the interdisciplinary dialogue among its subdisciplines.

  7. Advanced software development workstation: Knowledge base methodology: Methodology for first Engineering Script Language (ESL) knowledge base

    NASA Technical Reports Server (NTRS)

    Peeris, Kumar; Izygon, Michel

    1993-01-01

    This report explains some of the concepts of the ESL prototype and summarizes some of the lessons learned in using the prototype for implementing the Flight Mechanics Tool Kit (FMToolKit) series of Ada programs.

  8. Software support environment design knowledge capture

    NASA Technical Reports Server (NTRS)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  9. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  10. Promoting Students' Problem Solving Skills and Knowledge of STEM Concepts in a Data-Rich Learning Environment: Using Online Data as a Tool for Teaching about Renewable Energy Technologies

    ERIC Educational Resources Information Center

    Thurmond, Brandi

    2011-01-01

    This study sought to compare a data-rich learning (DRL) environment that utilized online data as a tool for teaching about renewable energy technologies (RET) to a lecture-based learning environment to determine the impact of the learning environment on students' knowledge of Science, Technology, Engineering, and Math (STEM) concepts related…

  11. Automated support for system's engineering and operations - The development of new paradigms

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hall, Gardiner A.; Jaworski, Allan; Zoch, David

    1992-01-01

    Technological developments in spacecraft ground operations are reviewed. The technological, operations-oriented, managerial, and economic factors driving the evolution of the Mission Operations Control Center (MOCC) and its predecessor, the Operational Control Center, are examined. The functional components of the various MOCC subsystems are outlined. A brief overview is given of the concepts behind the Knowledge-Based Software Engineering Environment, the Generic Spacecraft Analysis Assistant, and the Knowledge From Pictures tool.

  12. Machine learning research 1989-90

    NASA Technical Reports Server (NTRS)

    Porter, Bruce W.; Souther, Arthur

    1990-01-01

    Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.

  13. Meta-Assessment in a Project-Based Systems Engineering Course

    ERIC Educational Resources Information Center

    Wengrowicz, Niva; Dori, Yehudit Judy; Dori, Dov

    2017-01-01

    Project-based learning (PBL) facilitates significant learning, but it poses a major assessment challenge for assessing individual content knowledge. We developed and implemented an assessment approach and tool for a mandatory undergraduate systems engineering PBL-based course. We call this type of assessment "student-oriented"…

  14. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include the introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.
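
    Data-directed reasoning of the kind used to infer the cause of bugs can be sketched as simple forward chaining; the facts and rules below are invented for illustration and are not the testbed's knowledge base.

      # Forward chaining: keep applying rules until no new facts appear.
      # Facts and rules are illustrative only.
      RULES = [
          ({"test_failed", "recent_edit"}, "suspect_new_code"),
          ({"suspect_new_code", "edit_in_module_A"}, "inspect_module_A"),
      ]

      def forward_chain(facts: set[str]) -> set[str]:
          derived = set(facts)
          changed = True
          while changed:
              changed = False
              for premises, conclusion in RULES:
                  if premises <= derived and conclusion not in derived:
                      derived.add(conclusion)
                      changed = True
          return derived

      print(forward_chain({"test_failed", "recent_edit", "edit_in_module_A"}))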

  15. Investigating the Impact of Using a CAD Simulation Tool on Students' Learning of Design Thinking

    NASA Astrophysics Data System (ADS)

    Taleyarkhan, Manaz; Dasgupta, Chandan; Garcia, John Mendoza; Magana, Alejandra J.

    2018-02-01

    Engineering design thinking is hard to teach and still harder for novices to learn, primarily due to the undetermined nature of engineering problems, which often results in multiple solutions. In this paper, we investigate the effect of teaching engineering design thinking to freshman students using computer-aided design (CAD) simulation software. We present a framework for characterizing different levels of engineering design thinking displayed by students who interacted with the CAD simulation software in the context of a collaborative assignment. This framework describes the presence of four levels of engineering design thinking—beginning designer, adept beginning designer, informed designer, and adept informed designer. We present the characteristics associated with each of these four levels as they pertain to four engineering design strategies that students pursued in this study—understanding the design challenge, building knowledge, weighing options and making tradeoffs, and reflecting on the process. Students demonstrated significant improvements in two strategies—understanding the design challenge and building knowledge. We discuss the affordances of the CAD simulation tool, along with the learning environment, that potentially helped students move towards adept informed designers while pursuing these design strategies.

  16. Computer-assisted knowledge acquisition for hypermedia systems

    NASA Technical Reports Server (NTRS)

    Steuck, Kurt

    1990-01-01

    The usage of procedural and declarative knowledge to set up the structure or 'web' of a hypermedia environment is described. An automated knowledge acquisition tool was developed that helps a knowledge engineer elicit and represent an expert's knowledge involved in performing procedural tasks. The tool represents both procedural and prerequisite, declarative knowledge that supports each activity performed by the expert. This knowledge is output and subsequently read by a hypertext scripting language to generate the links between blank but labeled cards. Each step of the expert's activity and each piece of supporting declarative knowledge is set up as an empty node. An instructional developer can then enter detailed instructional material concerning each step and declarative knowledge into these empty nodes. Other research is also described that facilitates the translation of knowledge from one form into a form more readily usable by computerized systems.
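
    The step of turning elicited procedural steps and their prerequisite knowledge into empty, linked cards can be sketched as follows; the task content and data structures are illustrative assumptions, not the tool's actual output format.

      # Build empty, labeled nodes and the links between each step and its
      # supporting declarative knowledge; the task content is invented.
      steps = [
          ("inspect seal", ["seal types", "acceptance criteria"]),
          ("replace seal", ["torque values"]),
      ]

      nodes, links = {}, []
      for step, prerequisites in steps:
          nodes[step] = ""  # empty card for the instructional developer to fill in
          for item in prerequisites:
              nodes.setdefault(item, "")
              links.append((step, item))  # step card links to its supporting card

      print(len(nodes), "empty cards;", "links:", links)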

  17. FGMReview: design of a knowledge management tool on female genital mutilation.

    PubMed

    Martínez Pérez, Guillermo; Turetsky, Risa

    2015-11-01

    Web-based literature search engines may not be user-friendly for some readers searching for information on female genital mutilation. This is a traditional practice that has no health benefits, and about 140 million girls and women worldwide have undergone it. In 2012, the website FGMReview was created with the aim to offer a user-friendly, accessible, scalable, and innovative knowledge management tool specialized in female genital mutilation. The design of this website was guided by a conceptual model based on the use of benchmarking techniques and requirements engineering, an area of knowledge from the computer informatics field, influenced by the Transcultural Nursing model. The purpose of this article is to describe this conceptual model. Nurses and other health care providers can use this conceptual model to guide their methodological approach to design and launch other eHealth projects. © The Author(s) 2014.

  18. On-line Naval Engineering Skills Supplemental Training Program

    DTIC Science & Technology

    2010-01-01

    Defense Technical University (DTU), the technical content for courses would have to be provided by the Naval technical authorities ... of technological knowledge related to design engineering such as the DTU, or expanded within the mission scope of an existing organization such as ... management program as a training tool for naval design engineers such as the DTU or a technical extension of the DAU program for acquisition training

  19. An Evaluation of Text Mining Tools as Applied to Selected Scientific and Engineering Literature.

    ERIC Educational Resources Information Center

    Trybula, Walter J.; Wyllys, Ronald E.

    2000-01-01

    Addresses an approach to the discovery of scientific knowledge through an examination of data mining and text mining techniques. Presents the results of experiments that investigated knowledge acquisition from a selected set of technical documents by domain experts. (Contains 15 references.) (Author/LRW)

  20. Functional specifications for AI software tools for electric power applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, W.S.

    1985-08-01

    The principal barrier to the introduction of artificial intelligence (AI) technology to the electric power industry has not been a lack of interest or appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify those tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, data base inquiry, simulation and machine-machine interface.

  1. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    ERIC Educational Resources Information Center

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  2. Capturing flight system test engineering expertise: Lessons learned

    NASA Technical Reports Server (NTRS)

    Woerner, Irene Wong

    1991-01-01

    Within a few years, JPL will be challenged by the most active mission set in history. Concurrently, flight systems are increasingly complex. Presently, the knowledge to conduct integration and test of spacecraft and large instruments is held by a few key people, each with many years of experience. JPL is in danger of losing a significant amount of this critical expertise, through retirement, during a period when demand for this expertise is rapidly increasing. The most critical issue at hand is to collect and retain this expertise and develop tools that would ensure the ability to successfully perform the integration and test of future spacecraft and large instruments. The proposed solution was to capture and codify a subset of existing knowledge, and to utilize this captured expertise in knowledge-based systems. First year results and activities planned for the second year of this on-going effort are described. Topics discussed include lessons learned in knowledge acquisition and elicitation techniques, life-cycle paradigms, and rapid prototyping of a knowledge-based advisor (Spacecraft Test Assistant) and a hypermedia browser (Test Engineering Browser). The prototype Spacecraft Test Assistant supports a subset of integration and test activities for flight systems. Browser is a hypermedia tool that allows users easy perusal of spacecraft test topics. A knowledge acquisition tool called ConceptFinder, developed to search through large volumes of data for related concepts, is also described; it is being modified to semi-automate the process of creating hypertext links.
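
    A ConceptFinder-style search for related passages can be sketched with TF-IDF similarity, as below; the snippets and the query are invented, and this is an assumed stand-in rather than JPL's actual implementation.

      # Rank stored passages by similarity to a query phrase.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      passages = [
          "thermal vacuum test sequence for the spacecraft bus",
          "vibration test fixture setup and sensor placement",
          "thermal balance test anomalies and corrective actions",
      ]
      vectorizer = TfidfVectorizer()
      matrix = vectorizer.fit_transform(passages)

      query_vec = vectorizer.transform(["thermal vacuum anomalies"])
      scores = cosine_similarity(query_vec, matrix)[0]
      best = scores.argmax()
      print(passages[best], f"(score {scores[best]:.2f})")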

  3. Web Search Studies: Multidisciplinary Perspectives on Web Search Engines

    NASA Astrophysics Data System (ADS)

    Zimmer, Michael

    Perhaps the most significant tool of our internet age is the web search engine, providing a powerful interface for accessing the vast amount of information available on the world wide web and beyond. While still in its infancy compared to the knowledge tools that precede it - such as the dictionary or encyclopedia - the impact of web search engines on society and culture has already received considerable attention from a variety of academic disciplines and perspectives. This article aims to organize a meta-discipline of “web search studies,” centered around a nucleus of major research on web search engines from five key perspectives: technical foundations and evaluations; transaction log analyses; user studies; political, ethical, and cultural critiques; and legal and policy analyses.

  4. Enabling Innovation and Collaboration Across Geography and Culture: A Case Study of NASA's Systems Engineering Community of Practice

    NASA Technical Reports Server (NTRS)

    Topousis, Daria E.; Murphy, Keri; Robinson, Greg

    2008-01-01

    In 2004, NASA faced major knowledge sharing challenges due to geographically isolated field centers that inhibited personnel from sharing experiences and ideas. Mission failures and new directions for the agency demanded better collaborative tools. In addition, with the push to send astronauts back to the moon and to Mars, NASA recognized that systems engineering would have to improve across the agency. Of the ten field centers, seven had not built a spacecraft in over 30 years, and had lost systems engineering expertise. The Systems Engineering Community of Practice came together to capture the knowledge of its members using the suite of collaborative tools provided by the NASA Engineering Network (NEN). The NEN provided a secure collaboration space for over 60 practitioners across the agency to assemble and review a NASA systems engineering handbook. Once the handbook was complete, they used the open community area to disseminate it. This case study explores both the technology and the social networking that made the community possible, describes technological approaches that facilitated rapid setup and low maintenance, provides best practices that other organizations could adopt, and discusses the vision for how this community will continue to collaborate across the field centers to benefit the agency as it continues exploring the solar system.

  5. A knowledge-based tool for multilevel decomposition of a complex design problem

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1989-01-01

    Although much work has been done in applying artificial intelligence (AI) tools and techniques to problems in different engineering disciplines, only recently has the application of these tools begun to spread to the decomposition of complex design problems. A new tool based on AI techniques has been developed to implement a decomposition scheme suitable for multilevel optimization and display of data in an N x N matrix format.
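
    The N x N matrix view of module dependencies, and one simple ordering of modules when the coupling happens to be acyclic, can be sketched as follows; the modules and dependencies are invented, and the tool's own decomposition scheme is more general.

      # Order modules so that each runs after the modules it depends on,
      # then print the N x N dependency matrix; example data is invented.
      from graphlib import TopologicalSorter

      depends_on = {
          "loads": set(),
          "structure": {"loads"},
          "weights": {"structure"},
          "performance": {"weights", "loads"},
      }

      order = list(TopologicalSorter(depends_on).static_order())
      print("suggested execution order:", order)

      matrix = [[1 if col in depends_on[row] else 0 for col in order] for row in order]
      for name, row in zip(order, matrix):
          print(f"{name:12s}", row)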

  6. Software-engineering challenges of building and deploying reusable problem solvers.

    PubMed

    O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A

    2009-11-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
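
    The task-method decomposition idea can be sketched in a few lines: a task is solved either by a primitive method or by a composite method that decomposes it into subtasks; the surveillance-flavoured example below is invented and is not the authors' system.

      # Tasks are solved by methods; a composite method calls solve() on
      # its subtasks.  Example tasks and data are illustrative only.
      def classify_cases(data):
          # primitive method (a real system would wrap an actual classifier)
          return [value for value in data if value > 3]

      def detect_outbreak(data):
          # composite method: decompose into a subtask and combine results
          cases = solve("classify-cases", data)
          return len(cases) >= 2

      METHODS = {
          "classify-cases": classify_cases,
          "detect-outbreak": detect_outbreak,
      }

      def solve(task, data):
          return METHODS[task](data)  # method selection is trivial here

      print(solve("detect-outbreak", [1, 5, 7, 2]))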

  7. Software-engineering challenges of building and deploying reusable problem solvers

    PubMed Central

    O’CONNOR, MARTIN J.; NYULAS, CSONGOR; TU, SAMSON; BUCKERIDGE, DAVID L.; OKHMATOVSKAIA, ANNA; MUSEN, MARK A.

    2012-01-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task–method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach. PMID:23565031

  8. An Ontology and a Software Framework for Competency Modeling and Management

    ERIC Educational Resources Information Center

    Paquette, Gilbert

    2007-01-01

    The importance given to competency management is well justified. Acquiring new competencies is the central goal of any education or knowledge management process. Thus, it must be embedded in any software framework as an instructional engineering tool, to inform the runtime environment of the knowledge that is processed by actors, and their…

  9. Lynx: a database and knowledge extraction engine for integrative medicine.

    PubMed

    Sulakhe, Dinanath; Balasubramanian, Sandhya; Xie, Bingqing; Feng, Bo; Taylor, Andrew; Wang, Sheng; Berrocal, Eduardo; Dave, Utpal; Xu, Jinbo; Börnigen, Daniela; Gilliam, T Conrad; Maltsev, Natalia

    2014-01-01

    We have developed Lynx (http://lynx.ci.uchicago.edu)--a web-based database and a knowledge extraction engine, supporting annotation and analysis of experimental data and generation of weighted hypotheses on molecular mechanisms contributing to human phenotypes and disorders of interest. Its underlying knowledge base (LynxKB) integrates various classes of information from >35 public databases and private collections, as well as manually curated data from our group and collaborators. Lynx provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization to assist the user in extracting meaningful knowledge from LynxKB and experimental data, whereas its service-oriented architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces.

  10. QA-driven Guidelines Generation for Bacteriotherapy

    PubMed Central

    Pasche, Emilie; Teodoro, Douglas; Gobeill, Julien; Ruch, Patrick; Lovis, Christian

    2009-01-01

    PURPOSE: We propose a question-answering (QA) driven generation approach for automatic acquisition of structured rules that can be used in a knowledge authoring tool for antibiotic prescription guidelines management. METHODS: The rule generation is seen as a question-answering problem, where the parameters of the questions are known items of the rule (e.g. an infectious disease caused by a given bacterium) and answers (e.g. some antibiotics) are obtained by a question-answering engine. RESULTS: When looking for a drug given a pathogen and a disease, a top-precision of 0.55 is obtained by the combination of the Boolean engine (PubMed) and the relevance-driven engine (easyIR), which means that for more than half of our evaluation benchmark at least one of the recommended antibiotics was automatically acquired by the rule generation method. CONCLUSION: These results suggest that such an automatic text mining approach could provide a useful tool for guidelines management, by improving knowledge update and discovery. PMID:20351908
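
    The slot-filling scheme can be sketched as below: the known slots of a rule parameterize a question and the answer fills the missing slot; the ask() function is a canned stand-in and not the interface of PubMed or easyIR.

      # Fill the missing slot of a prescription rule from a QA answer.
      # ask() is a placeholder for a real question-answering engine.
      def ask(question: str) -> list[str]:
          canned = {
              "Which antibiotics treat pneumonia caused by Streptococcus pneumoniae?":
                  ["amoxicillin", "ceftriaxone"],
          }
          return canned.get(question, [])

      rule = {"disease": "pneumonia", "pathogen": "Streptococcus pneumoniae", "drug": None}
      question = f"Which antibiotics treat {rule['disease']} caused by {rule['pathogen']}?"
      rule["drug"] = ask(question)
      print(rule)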

  11. Improving the Usefulness of Concept Maps as a Research Tool for Science Education

    ERIC Educational Resources Information Center

    Van Zele, Els; Lenaerts, Josephina; Wieme, Willem

    2004-01-01

    The search for authentic science research tools to evaluate student understanding in a hybrid learning environment with a large multimedia component has resulted in the use of concept maps as a representation of student's knowledge organization. One hundred and seventy third-semester introductory university-level engineering students represented…

  12. How great a thirst? Assembling a river restoration toolkit

    Treesearch

    Steve Harris

    1999-01-01

    The Rio Grande River's biologically troubled status is clearly linked to present and historic water management. To restore the river to pre-settlement conditions will take a "tool kit" that holds authorities, knowledge, and skills needed to correct historical neglect and abuse. Tools include awareness, planning, partnerships, engineering solutions, and a...

  13. 75 FR 71005 - American Education Week, 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... maintain our Nation's role as the world's engine of discovery and innovation, my Administration is.... Our Nation's schools can give students the tools, skills, and knowledge to participate fully in our...

  14. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  15. Experiences in Digital Circuit Design Courses: A Self-Study Platform for Learning Support

    ERIC Educational Resources Information Center

    Bañeres, David; Clarisó, Robert; Jorba, Josep; Serra, Montse

    2014-01-01

    The synthesis of digital circuits is a basic skill in all the bachelor programmes around the ICT area of knowledge, such as Computer Science, Telecommunication Engineering or Electrical Engineering. An important hindrance in the learning process of this skill is that the existing educational tools for the design of circuits do not allow the…

  16. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  17. Applications and issues of GIS as tool for civil engineering modeling

    USGS Publications Warehouse

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exists, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations of traditional GIS data models found in the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  18. Progress in Metabolic Engineering of Saccharomyces cerevisiae

    PubMed Central

    Nevoigt, Elke

    2008-01-01

    Summary: The traditional use of the yeast Saccharomyces cerevisiae in alcoholic fermentation has, over time, resulted in substantial accumulated knowledge concerning genetics, physiology, and biochemistry as well as genetic engineering and fermentation technologies. S. cerevisiae has become a platform organism for developing metabolic engineering strategies, methods, and tools. The current review discusses the relevance of several engineering strategies, such as rational and inverse metabolic engineering, evolutionary engineering, and global transcription machinery engineering, in yeast strain improvement. It also summarizes existing tools for fine-tuning and regulating enzyme activities and thus metabolic pathways. Recent examples of yeast metabolic engineering for food, beverage, and industrial biotechnology (bioethanol and bulk and fine chemicals) follow. S. cerevisiae currently enjoys increasing popularity as a production organism in industrial (“white”) biotechnology due to its inherent tolerance of low pH values and high ethanol and inhibitor concentrations and its ability to grow anaerobically. Attention is paid to utilizing lignocellulosic biomass as a potential substrate. PMID:18772282

  19. Multi-Disciplinary Design Optimization Using WAVE

    NASA Technical Reports Server (NTRS)

    Irwin, Keith

    2000-01-01

    The current preliminary design tools lack the product performance, quality and cost prediction fidelity required to design Six Sigma products. They are also frequently incompatible with the tools used in detailed design, leading to a great deal of rework and lost or discarded data in the transition from preliminary to detailed design. Thus, enhanced preliminary design tools are needed in order to produce adequate financial returns to the business. To achieve this goal, GEAE has focused on building the preliminary design system around the same geometric 3D solid model that will be used in detailed design. With this approach, the preliminary designer will no longer convert a flowpath sketch into an engine cross section but rather automatically create 3D solid geometry for structural integrity, life, weight, cost, complexity, producibility, and maintainability assessments. Likewise, both the preliminary design and the detailed design can benefit from the use of the same preliminary part sizing routines. The design analysis tools will also be integrated with the 3D solid model to eliminate manual transfer of data between programs. GEAE has aggressively pursued the computerized control of engineering knowledge for many years. Through its study and validation of 3D CAD programs and processes, GEAE concluded that total system control was not feasible at that time. Prior CAD tools focused exclusively on detail part geometry, and Knowledge Based Engineering systems concentrated on rules input and data output. A system was needed to bridge the gap between the two to capture the total system. With the introduction of WAVE Engineering from UGS, the possibility of an engineering system control framework began to take shape. GEAE decided to investigate the new WAVE functionality to accomplish this task. NASA joined GEAE in funding this validation project through Task Order No. 1. With the validation project complete, the second phase under Task Order No. 2 was established to develop an associative control structure (framework) in the UG WAVE environment enabling multi-disciplinary design of turbine propulsion systems. The capabilities of WAVE were evaluated to assess its use as a rapid optimization and productivity tool. This project also identified future WAVE product enhancements that will make the tool still more beneficial for product development.

  20. Mind Maps: Hot New Tools Proposed for Cyberspace Librarians.

    ERIC Educational Resources Information Center

    Humphreys, Nancy K.

    1999-01-01

    Describes how online searchers can use a software tool based on back-of-the-book indexes to assist in dealing with search engine databases compiled by spiders that crawl across the entire Internet or through large Web sites. Discusses human versus machine knowledge, conversion of indexes to mind maps or mini-thesauri, middleware, eXtensible Markup…

  1. Lynx: a database and knowledge extraction engine for integrative medicine

    PubMed Central

    Sulakhe, Dinanath; Balasubramanian, Sandhya; Xie, Bingqing; Feng, Bo; Taylor, Andrew; Wang, Sheng; Berrocal, Eduardo; Dave, Utpal; Xu, Jinbo; Börnigen, Daniela; Gilliam, T. Conrad; Maltsev, Natalia

    2014-01-01

    We have developed Lynx (http://lynx.ci.uchicago.edu)—a web-based database and a knowledge extraction engine, supporting annotation and analysis of experimental data and generation of weighted hypotheses on molecular mechanisms contributing to human phenotypes and disorders of interest. Its underlying knowledge base (LynxKB) integrates various classes of information from >35 public databases and private collections, as well as manually curated data from our group and collaborators. Lynx provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization to assist the user in extracting meaningful knowledge from LynxKB and experimental data, whereas its service-oriented architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces. PMID:24270788

  2. Effective standards and regulatory tools for respiratory gas monitors and pulse oximeters: the role of the engineer and clinician.

    PubMed

    Weininger, Sandy

    2007-12-01

    Developing safe and effective medical devices involves understanding the hazardous situations that can arise in clinical practice and implementing appropriate risk control measures. The hazardous situations may have their roots in the design or in the use of the device. Risk control measures may be engineering or clinically based. A multidisciplinary team of engineers and clinicians is needed to fully identify and assess the risks and implement and evaluate the effectiveness of the control measures. In this paper, I use three issues, calibration/accuracy, response time, and protective measures/alarms, to highlight the contributions of these groups. This important information is captured in standards and regulatory tools to control risk for respiratory gas monitors and pulse oximeters. This paper begins with a discussion of the framework of safety, explaining how voluntary standards and regulatory tools work. The discussion is followed by an examination of how engineering and clinical knowledge are used to support the assurance of safety.

  3. XML-Based SHINE Knowledge Base Interchange Language

    NASA Technical Reports Server (NTRS)

    James, Mark; Mackey, Ryan; Tikidjian, Raffi

    2008-01-01

    The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.
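
    For illustration only, a minimal Python sketch of reading rules from an XML knowledge-base interchange file; the element names and sample content below are invented for this sketch and are not the actual SHINE schema:

      # Toy sketch of loading rules from an XML knowledge-base file; the element
      # names below are invented for illustration, not the actual SHINE schema.
      import xml.etree.ElementTree as ET

      SAMPLE = """
      <knowledgeBase name="thermal-subsystem">
        <rule id="R1">
          <if>sensor('tank_temp') &gt; 310</if>
          <then>flag('tank_overtemp')</then>
        </rule>
      </knowledgeBase>
      """

      root = ET.fromstring(SAMPLE)
      for rule in root.findall("rule"):
          print(rule.get("id"), "|", rule.findtext("if"), "=>", rule.findtext("then"))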

  4. Strategies for Information Retrieval and Virtual Teaming to Mitigate Risk on NASA's Missions

    NASA Technical Reports Server (NTRS)

    Topousis, Daria; Williams, Gregory; Murphy, Keri

    2007-01-01

    Following the loss of NASA's Space Shuttle Columbia in 2003, it was determined that problems in the agency's organization created an environment that led to the accident. One component of the proposed solution resulted in the formation of the NASA Engineering Network (NEN), a suite of information retrieval and knowledge sharing tools. This paper describes the implementation of this set of search, portal, content management, and semantic technologies, including a unique meta search capability for data from distributed engineering resources. NEN's communities of practice are formed along engineering disciplines where users leverage their knowledge and best practices to collaborate and take informal learning back to their personal jobs and embed it into the procedures of the agency. These results offer insight into using traditional engineering disciplines for virtual teaming and problem solving.

  5. About, for, in or through entrepreneurship in engineering education

    NASA Astrophysics Data System (ADS)

    Mäkimurto-Koivumaa, Soili; Belt, Pekka

    2016-09-01

    Engineering competences form a potential basis for entrepreneurship. There are pressures to find new approaches to entrepreneurship education (EE) in engineering education, as the traditional analytical logic of engineering does not match the modern view of entrepreneurship. Since the previous models do not give tangible enough tools on how to organise EE in practice, this article aims to develop a new framework for EE at the university level. We approach this aim by analysing existing scientific literature complemented by long-term practical observations, enabling a fruitful interplay between theory and practice. The developed framework recommends aspects in EE to be emphasised during each year of the study process. Action-based learning methods are highlighted in the beginning of studies to support students' personal growth. Explicit business knowledge is to be gradually increased only when professional, field-specific knowledge has been adequately accumulated.

  6. Deciding alternative left turn signal phases using expert systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, E.C.P.

    1988-01-01

    The Texas Transportation Institute (TTI) conducted a study to investigate the feasibility of applying artificial intelligence (AI) technology and expert systems (ES) design concepts to a traffic engineering problem. Prototype systems were developed to analyze user input, evaluate various lines of reasoning, and suggest a suitable left-turn phase treatment. These systems were developed using AI programming tools on IBM PC/XT/AT-compatible microcomputers. Two slightly different systems were designed using AI languages; another was built with a knowledge engineering tool. These implementations used the PD PROLOG and TURBO PROLOG AI languages, as well as the INSIGHT Production Rule Language.
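
    For illustration, a minimal rule-based sketch in Python (rather than PROLOG or the INSIGHT tool used by the TTI prototypes); the thresholds and treatments below are hypothetical placeholders, not the study's actual rules:

      # Minimal rule-based sketch for suggesting a left-turn phase treatment.
      # Thresholds are hypothetical placeholders, not the TTI systems' actual rules.

      def suggest_left_turn_phase(left_turn_vol, opposing_vol, opposing_lanes, accidents_per_year):
          """Return a suggested left-turn treatment from simple IF-THEN rules."""
          cross_product = left_turn_vol * opposing_vol / opposing_lanes

          if left_turn_vol < 50:
              return "permitted only"            # demand too low for a separate phase
          if accidents_per_year >= 5 or cross_product > 150_000:
              return "protected only"            # safety- or volume-driven protection
          if cross_product > 50_000:
              return "protected/permitted"       # moderate conflict: compound phasing
          return "permitted only"

      if __name__ == "__main__":
          print(suggest_left_turn_phase(left_turn_vol=220, opposing_vol=900,
                                        opposing_lanes=2, accidents_per_year=3))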

  7. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    NASA Technical Reports Server (NTRS)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  8. Metabolic network flux analysis for engineering plant systems.

    PubMed

    Shachar-Hill, Yair

    2013-04-01

    Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools to plant systems have advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA, but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
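
    As a rough illustration of the kind of computation an NFA tool performs, the following toy flux-balance sketch (assuming SciPy, with a made-up three-reaction network) maximizes a "biomass" flux subject to steady-state mass balance:

      # Toy flux-balance sketch (not from the reviewed article): maximize a "biomass"
      # flux subject to steady-state mass balance, using linear programming.
      import numpy as np
      from scipy.optimize import linprog

      # Stoichiometric matrix S (rows: metabolites; cols: v_uptake, v_biomass, v_byproduct)
      S = np.array([[1.0, -1.0, -1.0]])          # single internal metabolite A
      b = np.zeros(S.shape[0])                   # steady state: S v = 0
      c = np.array([0.0, -1.0, 0.0])             # maximize v_biomass -> minimize -v_biomass
      bounds = [(0, 10.0), (0, None), (0, None)] # uptake capped at 10 (hypothetical units)

      res = linprog(c, A_eq=S, b_eq=b, bounds=bounds, method="highs")
      print("optimal fluxes:", res.x)            # expect all flux routed to biomass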

  9. A Design Tool for Matching UAV Propeller and Power Plant Performance

    NASA Astrophysics Data System (ADS)

    Mangio, Arion L.

    A large body of knowledge is available for matching propellers to engines for large propeller driven aircraft. Small UAVs and model airplanes operate at much lower Reynolds numbers and use fixed-pitch propellers, so the information for large aircraft is not directly applicable. A design tool is needed that takes into account Reynolds number effects, allows for gear reduction, and enables the selection of a propeller optimized for the airframe. The tool developed in this thesis does this using propeller performance data generated from vortex theory or wind tunnel experiments and combines that data with an engine power curve. The thrust, steady state power, RPM, and tip Mach number vs. velocity curves are generated. The Reynolds number vs. non-dimensional radial station at an operating point is also found. The tool is then used to design a geared power plant for the SAE Aero Design competition. To measure the power plant performance, a purpose-built engine test stand was constructed. The characteristics of the engine test stand are also presented. The engine test stand was then used to characterize the geared power plant. The power plant uses a 26x16 propeller, 100/13 gear ratio, and an LRP 0.30 cubic inch engine turning at 28,000 RPM and producing 2.2 HP. Lastly, the measured power plant performance is presented. An important result is that 17 lbf of static thrust is produced.
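
    A minimal sketch of the matching idea described above, assuming NumPy; the propeller coefficient tables and engine power curve are hypothetical stand-ins, not the thesis data:

      # Sketch of matching a fixed-pitch propeller to an engine power curve.
      # Propeller coefficient tables and the engine curve below are hypothetical.
      import numpy as np

      RHO, D, GEAR = 1.225, 0.66, 7.7      # air density [kg/m^3], prop diameter [m], gear ratio

      # Hypothetical propeller data: advance ratio J, thrust coeff CT, power coeff CP
      J_tab  = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
      CT_tab = np.array([0.12, 0.11, 0.09, 0.06, 0.02])
      CP_tab = np.array([0.060, 0.058, 0.052, 0.040, 0.020])

      def engine_power(rpm_engine):
          """Hypothetical engine shaft power [W] vs engine RPM."""
          return max(0.0, 1640.0 * (rpm_engine / 28000.0) * (2 - rpm_engine / 28000.0))

      def match_operating_point(v):
          """Find the prop rev rate n [rev/s] where absorbed power equals delivered power."""
          best = None
          for n in np.linspace(10, 80, 1000):              # candidate prop speeds
              J = v / (n * D)
              cp = np.interp(J, J_tab, CP_tab)
              p_prop = cp * RHO * n**3 * D**5               # power absorbed by the prop
              p_eng = engine_power(n * GEAR * 60.0)         # power available through the gearbox
              if best is None or abs(p_prop - p_eng) < best[0]:
                  best = (abs(p_prop - p_eng), n, J)
          _, n, J = best
          thrust = np.interp(J, J_tab, CT_tab) * RHO * n**2 * D**4
          return n * 60.0, thrust                           # prop RPM, thrust [N]

      for v in (5.0, 15.0, 25.0):                           # airspeeds [m/s]
          rpm, thrust = match_operating_point(v)
          print(f"V={v:4.1f} m/s  prop RPM={rpm:6.0f}  thrust={thrust:5.1f} N")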

  10. Storying energy consumption: Collective video storytelling in energy efficiency social marketing.

    PubMed

    Gordon, Ross; Waitt, Gordon; Cooper, Paul; Butler, Katherine

    2018-05-01

    Despite calls for more socio-technical research on energy, there is little practical advice on how narratives collected through qualitative research may be melded with technical knowledge from the physical sciences such as engineering and then applied in energy efficiency social action strategies. This is despite established knowledge in the environmental management literature about domestic energy use regarding the utility of social practice theory and narrative framings that socialise everyday consumption. Storytelling is positioned in this paper both as a focus for socio-technical energy research, and as one potential practical tool that can arguably enhance energy efficiency interventions. We draw upon the literature on everyday social practices, and storytelling, to present our framework called 'collective video storytelling' that combines scientific and lay knowledge about domestic energy use to offer a practical tool for energy efficiency management. Collective video storytelling is discussed in the context of Energy+Illawarra, a 3-year cross-disciplinary collaboration between social marketers, human geographers, and engineers to target energy behavioural change within older low-income households in regional NSW, Australia. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Object oriented studies into artificial space debris

    NASA Technical Reports Server (NTRS)

    Adamson, J. M.; Marshall, G.

    1988-01-01

    A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link Object Oriented programming, intelligent knowledge based system (IKBS) techniques and advanced computer technology with numeric analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object and rule based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an Object Oriented discrete event simulation tool built as a domain specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.

  12. Photolithography diagnostic expert systems: a systematic approach to problem solving in a wafer fabrication facility

    NASA Astrophysics Data System (ADS)

    Weatherwax Scott, Caroline; Tsareff, Christopher R.

    1990-06-01

    One of the main goals of process engineering in the semiconductor industry is to improve wafer fabrication productivity and throughput. Engineers must work continuously toward this goal in addition to performing sustaining and development tasks. To accomplish these objectives, managers must make efficient use of engineering resources. One of the tools being used to improve efficiency is the diagnostic expert system. Expert systems are knowledge based computer programs designed to lead the user through the analysis and solution of a problem. Several photolithography diagnostic expert systems have been implemented at the Hughes Technology Center to provide a systematic approach to process problem solving. This systematic approach was achieved by documenting cause and effect analyses for a wide variety of processing problems. This knowledge was organized in the form of IF-THEN rules, a common structure for knowledge representation in expert system technology. These rules form the knowledge base of the expert system which is stored in the computer. The systems also include the problem solving methodology used by the expert when addressing a problem in his area of expertise. Operators now use the expert systems to solve many process problems without engineering assistance. The systems also facilitate the collection of appropriate data to assist engineering in solving unanticipated problems. Currently, several expert systems have been implemented to cover all aspects of the photolithography process. The systems, which have been in use for over a year, include wafer surface preparation (HMDS), photoresist coat and softbake, align and expose on a wafer stepper, and develop inspection. These systems are part of a plan to implement an expert system diagnostic environment throughout the wafer fabrication facility. In this paper, the systems' construction is described, including knowledge acquisition, rule construction, knowledge refinement, testing, and evaluation. The roles played by the process engineering expert and the knowledge engineer are discussed. The features of the systems are shown, particularly the interactive quality of the consultations and the ease of system use.
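
    For illustration, a minimal Python sketch of the IF-THEN rule structure described above; the symptoms and corrective actions are hypothetical, not the content of the Hughes knowledge bases:

      # Minimal IF-THEN diagnostic sketch; symptoms and causes are hypothetical,
      # not the rules of the Hughes photolithography expert systems.
      RULES = [
          ({"resist_thickness": "low", "spin_speed": "high"},
           "Reduce spin speed or increase resist viscosity."),
          ({"pattern": "scummed", "develop_time": "short"},
           "Increase develop time or check developer concentration."),
          ({"cd": "oversized", "exposure_dose": "low"},
           "Increase exposure dose on the stepper."),
      ]

      def diagnose(observations):
          """Return advice from every rule whose conditions all match the observations."""
          advice = []
          for conditions, action in RULES:
              if all(observations.get(k) == v for k, v in conditions.items()):
                  advice.append(action)
          return advice or ["No rule fired; collect data and escalate to engineering."]

      print(diagnose({"pattern": "scummed", "develop_time": "short", "cd": "nominal"}))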

  13. Expanding the KATE toolbox

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1993-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs that were designed and added to the collection of tools comprising the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues having to do with these two tools are discussed. This discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.

  14. MLM Builder: An Integrated Suite for Development and Maintenance of Arden Syntax Medical Logic Modules

    PubMed Central

    Sailors, R. Matthew

    1997-01-01

    The Arden Syntax specification for sharable computerized medical knowledge bases has not been widely utilized in the medical informatics community because of a lack of tools for developing Arden Syntax knowledge bases (Medical Logic Modules). The MLM Builder is a Microsoft Windows-hosted CASE (Computer Aided Software Engineering) tool designed to aid in the development and maintenance of Arden Syntax Medical Logic Modules (MLMs). The MLM Builder consists of the MLM Writer (an MLM generation tool), OSCAR (an anagram of Object-oriented ARden Syntax Compiler), a test database, and the MLManager (an MLM management information system). Working together, these components form a self-contained, unified development environment for the creation, testing, and maintenance of Arden Syntax Medical Logic Modules.

  15. FOAMSearch.net: A custom search engine for emergency medicine and critical care.

    PubMed

    Raine, Todd; Thoma, Brent; Chan, Teresa M; Lin, Michelle

    2015-08-01

    The number of online resources read by and pertinent to clinicians has increased dramatically. However, most healthcare professionals still use mainstream search engines as their primary port of entry to the resources on the Internet. These search engines use algorithms that do not make it easy to find clinician-oriented resources. FOAMSearch, a custom search engine (CSE), was developed to find relevant, high-quality online resources for emergency medicine and critical care (EMCC) clinicians. Using Google™ algorithms, it searches a vetted list of >300 blogs, podcasts, wikis, knowledge translation tools, clinical decision support tools and medical journals. Utilisation has increased progressively to >3000 users/month since its launch in 2011. Further study of the role of CSEs to find medical resources is needed, and it might be possible to develop similar CSEs for other areas of medicine. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  16. Processes in construction of failure management expert systems from device design information

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Lance, Nick

    1987-01-01

    This paper analyzes the tasks and problem solving methods used by an engineer in constructing a failure management expert system from design information about the device to be diagnosed. An expert test engineer developed a trouble-shooting expert system based on device design information and experience with similar devices, rather than on specific expert knowledge gained from operating the device or troubleshooting its failures. The construction of the expert system was intensively observed and analyzed. This paper characterizes the knowledge, tasks, methods, and design decisions involved in constructing this type of expert system, and makes recommendations concerning tools for aiding and automating construction of such systems.

  17. The theory of interface slicing

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a new tool which was developed to facilitate reuse-based software engineering, by addressing the following problems, needs, and issues: (1) size of systems incorporating reused modules; (2) knowledge requirements for program modification; (3) program understanding for reverse engineering; (4) module granularity and domain management; and (5) time and space complexity of conventional slicing. The definition of a form of static program analysis called interface slicing is addressed.
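
    As a toy illustration of the idea (not Beck's formal definition), an interface slice can be thought of as the set of definitions reachable from just the interface entries a client needs; a minimal Python sketch with a hypothetical call graph:

      # Toy illustration of the interface-slicing idea: keep only the definitions
      # reachable from the interface entries a client actually needs.
      # The module and its call graph are hypothetical.
      CALL_GRAPH = {
          "open_file":  {"check_path", "alloc_buffer"},
          "read_file":  {"check_handle", "alloc_buffer"},
          "write_file": {"check_handle", "flush"},
          "check_path": set(), "check_handle": set(),
          "alloc_buffer": set(), "flush": set(),
      }

      def interface_slice(needed):
          """Return all definitions reachable from the requested interface functions."""
          keep, stack = set(), list(needed)
          while stack:
              f = stack.pop()
              if f not in keep:
                  keep.add(f)
                  stack.extend(CALL_GRAPH.get(f, ()))
          return keep

      # A client that only reads files gets a module slice without write_file/flush.
      print(sorted(interface_slice({"open_file", "read_file"})))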

  18. Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities

    NASA Astrophysics Data System (ADS)

    Perjanik, Nicholas Steven

    As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one time interviews of a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.

  19. Developing Systems Engineering Skills Through NASA Summer Intern Project

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barritt, Brian; Golden, Bert; Knoblock, Eric; Matthews, Seth; Warner, Joe

    2010-01-01

    During the Formulation phases of the NASA Project Life Cycle, communication systems engineers are responsible for designing space communication links and analyzing their performance to ensure that the proposed communication architecture is capable of satisfying high-level mission requirements. Senior engineers with extensive experience in communications systems perform these activities. However, the increasing complexity of space systems coupled with the current shortage of communications systems engineers has led to an urgent need for expedited training of new systems engineers. A pilot program, in which college-bound high school and undergraduate students studying various engineering disciplines are immersed in NASA's systems engineering practices, was conceived out of this need. This rapid summerlong training approach is feasible because of the availability of advanced software and technology tools and the students' inherent ability to operate such tools. During this pilot internship program, a team of college-level and recently-hired engineers configured and utilized various software applications in the design and analysis of communication links for a plausible lunar sortie mission. The approach taken was to first design the direct-to-Earth communication links for the lunar mission elements, then to design the links between lunar surface and lunar orbital elements. Based on the data obtained from these software applications, an integrated communication system design was realized and the students gained valuable systems engineering knowledge. This paper describes this approach to rapidly training college-bound high school and undergraduate engineering students from various disciplines in NASA's systems engineering practices and tools. A summary of the potential use of NASA's emerging systems engineering internship program in broader applications is also described.

  20. Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses

    PubMed Central

    Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M.; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V.; Ma’ayan, Avi

    2018-01-01

    Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated ‘canned’ analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also indexes 4,901 published bioinformatics software tools and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools. PMID:29485625

  1. Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses.

    PubMed

    Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V; Ma'ayan, Avi

    2018-02-27

    Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated 'canned' analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also indexes 4,901 published bioinformatics software tools and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools.

  2. Design and manufacturing challenges of optogenetic neural interfaces: a review

    NASA Astrophysics Data System (ADS)

    Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Costa, R. M.; Correia, J. H.

    2017-08-01

    Optogenetics is a relatively new technology to achieve cell-type specific neuromodulation with millisecond-scale temporal precision. Optogenetic tools are being developed to address neuroscience challenges, and to improve the knowledge about brain networks, with the ultimate aim of catalyzing new treatments for brain disorders and diseases. To reach this ambitious goal the implementation of mature and reliable engineered tools is required. The success of optogenetics relies on optical tools that can deliver light into the neural tissue. Objective/Approach: Here, the design and manufacturing approaches available to the scientific community are reviewed, and current challenges to accomplish appropriate scalable, multimodal and wireless optical devices are discussed. Significance: Overall, this review aims at presenting a helpful guidance to the engineering and design of optical microsystems for optogenetic applications.

  3. Blooms' separation of the final exam of Engineering Mathematics II: Item reliability using Rasch measurement model

    NASA Astrophysics Data System (ADS)

    Fuaad, Norain Farhana Ahmad; Nopiah, Zulkifli Mohd; Tawil, Norgainy Mohd; Othman, Haliza; Asshaari, Izamarlina; Osman, Mohd Hanif; Ismail, Nur Arzilah

    2014-06-01

    In engineering studies and research, mathematics is one of the main elements used to express physical, chemical, and engineering laws. It is therefore essential for engineering students to have strong knowledge of the fundamentals of mathematics in order to apply that knowledge to real-life issues. However, previous Mathematics Pre-Test results show that engineering students lack fundamental knowledge in certain topics in mathematics. Because of this, apart from improving the methods of teaching and learning, studies on the construction of questions (items) should also be emphasized. The purpose of this study is to assist lecturers in the process of item development, to monitor the separation of items based on Blooms' Taxonomy, and to measure the reliability of the items themselves using the Rasch Measurement Model as a tool. Using the Rasch Measurement Model, the final exam questions of Engineering Mathematics II (Linear Algebra) for semester 2, session 2012/2013, were analysed, and the results provide details on the extent to which the content of each item provides useful information about students' ability. This study reveals that the items used in the Engineering Mathematics II (Linear Algebra) final exam are well constructed, but the separation of the items raises concern and needs further attention, as there is a big gap between items at several levels of Blooms' cognitive skill.
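
    For reference, the dichotomous Rasch model that underlies this kind of item analysis gives the probability that person n answers item i correctly as a function of person ability \theta_n and item difficulty b_i:

      P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}

    Item reliability and separation statistics reported by Rasch software summarize how well the estimated item difficulties b_i are spread relative to their standard errors.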

  4. Knowledge-Based Software Development Tools

    DTIC Science & Technology

    1993-09-01

    GREEN, C., AND WESTFOLD, S. Knowledge-based programming self-applied. In Machine Intelligence 10, J. E. Hayes, D. Michie, and Y. Pao, Eds., Wiley... Technical Report KES.U.84.2, Kestrel Institute, April 1984. [18] KORF, R. E. Toward a model of representation changes. Artificial Intelligence 14, 1... Artificial Intelligence 27, 1 (February 1985), 43-96. Reprinted in Readings in Artificial Intelligence and Software Engineering, C. Rich and R. Waters

  5. Strain measurements in a rotary engine housing

    NASA Technical Reports Server (NTRS)

    Lee, C. M.; Bond, T. H.; Addy, H. E.; Chun, K. S.; Lu, C. Y.

    1989-01-01

    The development of structural design tools for Rotary Combustion Engines (RCE) using Finite Element Modeling (FEM) requires knowledge about the response of engine materials to various service conditions. This paper describes experimental work that studied housing deformation as a result of thermal, pressure and mechanical loads. The measurement of thermal loads, clamping pressure, and deformation was accomplished by use of high-temperature strain gauges, thermocouples, and a high speed data acquisition system. FEM models for heat transfer stress analysis of the rotor housing will be verified and refined based on these experimental results.

  6. Engineering design knowledge recycling in near-real-time

    NASA Technical Reports Server (NTRS)

    Leifer, Larry; Baya, Vinod; Toye, George; Baudin, Catherine; Underwood, Jody Gevins

    1994-01-01

    It is hypothesized that the capture and reuse of machine readable design records is cost beneficial. This informal engineering notebook design knowledge can be used to model the artifact and the design process. Design rationale is, in part, preserved and available for examination. Redesign cycle time is significantly reduced (Baya et al, 1992). These factors contribute to making it less costly to capture and reuse knowledge than to recreate comparable knowledge (current practice). To test the hypothesis, we have focused on validation of the concept and tools in two 'real design' projects this past year: (1) a short (8 month) turnaround project for NASA life science bioreactor researchers was done by a team of three mechanical engineering graduate students at Stanford University (in a class, ME210abc 'Mechatronic Systems Design and Methodology' taught by one of the authors, Leifer); and (2) a long range (8 to 20 year) international consortium project for NASA's Space Science program (STEP: satellite test of the equivalence principle). Design knowledge capture was supported this year by assigning the use of a Team-Design PowerBook. Design records were cataloged in near-real time. These records were used to qualitatively model the artifact design as it evolved. Dedal, an 'intelligent librarian' developed at NASA-ARC, was used to navigate and retrieve captured knowledge for reuse.

  7. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzo, Davinia B.; Blackburn, Mark R.

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.
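
    To make the idea concrete, a minimal sketch of a discrete Bayesian network for a qualification decision, assuming the pgmpy library; the variables and probabilities are hypothetical and are not the authors' elicited model:

      # Sketch of a small Bayesian network for a qualification decision,
      # with hypothetical variables and expert-style probabilities (assumes pgmpy).
      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      model = BayesianNetwork([("FixtureAdequate", "SixDofSuitable"),
                               ("ResponseLinear", "SixDofSuitable")])

      cpd_fix = TabularCPD("FixtureAdequate", 2, [[0.3], [0.7]])     # P(no), P(yes)
      cpd_lin = TabularCPD("ResponseLinear", 2, [[0.2], [0.8]])
      cpd_suit = TabularCPD(
          "SixDofSuitable", 2,
          # columns: (fix=no, lin=no), (no, yes), (yes, no), (yes, yes)
          [[0.95, 0.7, 0.6, 0.1],    # P(not suitable | parents)
           [0.05, 0.3, 0.4, 0.9]],   # P(suitable | parents)
          evidence=["FixtureAdequate", "ResponseLinear"], evidence_card=[2, 2])

      model.add_cpds(cpd_fix, cpd_lin, cpd_suit)
      assert model.check_model()

      posterior = VariableElimination(model).query(
          ["SixDofSuitable"], evidence={"ResponseLinear": 1})
      print(posterior)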

  8. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE PAGES

    Rizzo, Davinia B.; Blackburn, Mark R.

    2018-03-30

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.

  9. Knowledge Engineering for Preservation and Future use of Institutional Knowledge

    NASA Technical Reports Server (NTRS)

    Moreman, Douglas; Dyer, John

    1996-01-01

    This Project has two main thrusts-preservation of special knowledge and its useful representation via computers. NASA is losing the expertise of its engineers and scientists who put together the great missions of the past. We no longer are landing men on the moon. Some of the equipment still used today (such as the RL-10 rocket) was designed decades ago by people who are now retiring. Furthermore, there has been a lack, in some areas of technology, of new projects that overlap with the old and that would have provided opportunities for monitoring by senior engineers of the young ones. We are studying this problem and trying out a couple of methods of soliciting and recording rare knowledge from experts. One method is that of Concept Maps which produces a graphical interface to knowledge even as it helps solicit that knowledge. We arranged for experienced help in this method from John Coffey of the Institute of Human and Machine Technology at the University of West Florida. A second method which we plan to try out in May, is a video-taped review of selected failed missions (e.g., the craft tumbled and blew up). Five senior engineers (most already retired from NASA) will, as a team, analyze available data, illustrating their thought processes as they try to solve the problem of why a space craft failed to complete its mission. The session will be captured in high quality audio and with at least two video cameras. The video can later be used to plan future concept mapping interviews and, in edited form, be a product in itself. Our computer representations of the amassed knowledge may eventually, via the methods of expert systems, be joined with other software being prepared as a suite of tools to aid future engineers designing rocket engines. In addition to representation by multimedia concept maps, we plan to consider linking vast bodies of text (and other media) by hypertexting methods.

  10. Challenges and Advances for Genetic Engineering of Non-model Bacteria and Uses in Consolidated Bioprocessing

    PubMed Central

    Yan, Qiang; Fong, Stephen S.

    2017-01-01

    Metabolic diversity in microorganisms can provide the basis for creating novel biochemical products. However, most metabolic engineering projects utilize a handful of established model organisms and thus, a challenge for harnessing the potential of novel microbial functions is the ability to either heterologously express novel genes or directly utilize non-model organisms. Genetic manipulation of non-model microorganisms is still challenging due to organism-specific nuances that hinder universal molecular genetic tools and translatable knowledge of intracellular biochemical pathways and regulatory mechanisms. However, in the past several years, unprecedented progress has been made in synthetic biology, molecular genetics tools development, applications of omics data techniques, and computational tools that can aid in developing non-model hosts in a systematic manner. In this review, we focus on concerns and approaches related to working with non-model microorganisms including developing molecular genetics tools such as shuttle vectors, selectable markers, and expression systems. In addition, we will discuss: (1) current techniques in controlling gene expression (transcriptional/translational level), (2) advances in site-specific genome engineering tools [homologous recombination (HR) and clustered regularly interspaced short palindromic repeats (CRISPR)], and (3) advances in genome-scale metabolic models (GSMMs) in guiding design of non-model species. Application of these principles to metabolic engineering strategies for consolidated bioprocessing (CBP) will be discussed along with some brief comments on foreseeable future prospects. PMID:29123506

  11. A Desirable Engineer Providing Manifold Prospects Enhanced by Synergy Effect of Science and Liberal Arts

    NASA Astrophysics Data System (ADS)

    Harada, Shoji

    Rapid globalization makes it more difficult to predict future human activity in production and the economy, mainly because of the growing number of factors that affect relationships among people with different ways of life, cultures, and traditions. To survive in such a complicated world, each engineer needs knowledge of the liberal arts as well as highly specialized knowledge. The synergy generated by combining science-minded and liberal-arts-minded ways of thinking is a promising tool for making sense of this difficult world. This paper first describes the degradation of liberal arts education over the past two decades. Several movements aimed at stopping that degradation are then introduced, in conjunction with the author's overseas experiences. Finally, the paper proposes that competitive, well-rounded engineers be brought up through collaborative work between universities and companies.

  12. The research on construction and application of machining process knowledge base

    NASA Astrophysics Data System (ADS)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

    In order to realize the application of knowledge in machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical structure for knowledge classification is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. This paper gives the definition and classification of machining process knowledge, the knowledge models, and the application flow of knowledge-based process design, and, as an application, carries out the main steps of the machine tool selection decision using the knowledge base.
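
    A minimal Python sketch of the production-rule plus object-oriented structuring described above; the feature class, rules, and machine choices are hypothetical examples, not the paper's knowledge base:

      # Sketch of production-rule / object-oriented knowledge structuring;
      # the feature class, rules, and machine choices are hypothetical.
      from dataclasses import dataclass

      @dataclass
      class HoleFeature:                      # object-oriented knowledge: a part feature
          diameter_mm: float
          depth_mm: float
          tolerance_grade: str                # e.g. "IT7", "IT9"

      # Production rules: condition(feature) -> (process, machine tool)
      RULES = [
          (lambda f: f.tolerance_grade in ("IT6", "IT7") and f.diameter_mm < 20,
           ("drill + ream", "CNC machining centre")),
          (lambda f: f.depth_mm / f.diameter_mm > 5,
           ("gun drilling", "deep-hole drilling machine")),
          (lambda f: True,                     # default rule
           ("twist drilling", "pillar drill")),
      ]

      def plan(feature):
          """Return the first (process, machine) recommendation whose rule fires."""
          for condition, decision in RULES:
              if condition(feature):
                  return decision

      print(plan(HoleFeature(diameter_mm=12.0, depth_mm=30.0, tolerance_grade="IT7")))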

  13. IntegromeDB: an integrated system and biological search engine.

    PubMed

    Baitaluk, Michael; Kozhenkov, Sergey; Dubinina, Yulia; Ponomarenko, Julia

    2012-01-19

    With the growth of biological data in volume and heterogeneity, web search engines become key tools for researchers. However, general-purpose search engines are not specialized for the search of biological data. Here, we present an approach at developing a biological web search engine based on the Semantic Web technologies and demonstrate its implementation for retrieving gene- and protein-centered knowledge. The engine is available at http://www.integromedb.org. The IntegromeDB search engine allows scanning data on gene regulation, gene expression, protein-protein interactions, pathways, metagenomics, mutations, diseases, and other gene- and protein-related data that are automatically retrieved from publicly available databases and web pages using biological ontologies. To perfect the resource design and usability, we welcome and encourage community feedback.

  14. Simulation of an Asynchronous Machine by using a Pseudo Bond Graph

    NASA Astrophysics Data System (ADS)

    Romero, Gregorio; Felez, Jesus; Maroto, Joaquin; Martinez, M. Luisa

    2008-11-01

    For engineers, computer simulation is a basic tool because it enables them to understand how systems work without actually needing to see them. They can learn how systems behave in different circumstances and optimize their design at considerably less cost in time and money than if they had to carry out tests on a physical system. However, if computer simulation is to be reliable, it is essential for the simulation model to be validated. There is a wide range of commercial products on the market for electrical-domain simulation (SPICE, LabVIEW, PSCAD, Dymola, Simulink, Simplorer, ...). These are powerful tools, but they require the engineer to have a thorough knowledge of the electrical field. This paper presents an alternative methodology for simulating an asynchronous machine using the multidomain Bond Graph technique, which can be applied in any program that supports simulation of models based on this technique; no extraordinary knowledge of the technique or of the electrical field is required to understand the process.

  15. Benchmarking expert system tools

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  16. Knowledge portal for Six Sigma DMAIC process

    NASA Astrophysics Data System (ADS)

    ThanhDat, N.; Claudiu, K. V.; Zobia, R.; Lobont, Lucian

    2016-08-01

    Knowledge plays a crucial role in the success of DMAIC (Define, Measure, Analyze, Improve, and Control) execution, so it must be shared and renewed. One problem that arises is how to create a place where knowledge is collected and shared effectively. We believe that a Knowledge Portal (KP) is an important solution to this problem. In this article, work concerning requirements and functionalities for KPs is first reviewed. A procedure, with the necessary tools, for developing and implementing a KP for DMAIC (KPD) is then proposed. In particular, the KPD is built on free and open-source content and learning management systems and on ontology engineering. To structure and store knowledge, tools such as Protégé, OWL, and OWL-RDF parsers are used. A Knowledge Reasoner module is developed in PHP with ARC2, MySQL, and a SPARQL endpoint for querying and inferring knowledge available from the ontologies. To validate the procedure, a KPD is built with the proposed functionalities and tools. The authors find that the KPD allows an organization to construct such a portal itself with simple implementation steps and low initial costs. It creates a space for knowledge exchange and effectively supports collecting DMAIC reports and sharing the knowledge created. The authors' evaluation shows that DMAIC knowledge is retrieved accurately, with a high success rate and good query response times.
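
    For illustration, a toy sketch of the ontology-query layer such a portal needs, assuming the rdflib Python library; the namespace, triples, and query are invented for the sketch and are not the KPD's actual schema:

      # Toy sketch of querying a DMAIC ontology with SPARQL via rdflib;
      # the namespace, triples, and query are hypothetical, not the KPD schema.
      import rdflib

      g = rdflib.Graph()
      DMAIC = rdflib.Namespace("http://example.org/dmaic#")

      # A few hand-made triples standing in for knowledge captured from DMAIC reports
      g.add((DMAIC.Pareto, rdflib.RDF.type, DMAIC.Tool))
      g.add((DMAIC.Pareto, DMAIC.usedInPhase, DMAIC.Analyze))
      g.add((DMAIC.ControlChart, rdflib.RDF.type, DMAIC.Tool))
      g.add((DMAIC.ControlChart, DMAIC.usedInPhase, DMAIC.Control))

      query = """
      PREFIX dmaic: <http://example.org/dmaic#>
      SELECT ?tool WHERE { ?tool a dmaic:Tool ; dmaic:usedInPhase dmaic:Analyze . }
      """
      for row in g.query(query):
          print(row.tool)        # -> http://example.org/dmaic#Pareto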

  17. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  18. Measuring the utility of the Science, Technology, Engineering, Mathematics (STEM) Academy Measurement Tool in assessing the development of K-8 STEM academies as professional learning communities

    NASA Astrophysics Data System (ADS)

    Irish, Teresa J.

    The aim of this study was to provide insights addressing national concerns in Science, Technology, Engineering, and Mathematics (STEM) education by examining how a set of six perimeter urban K-12 schools were transformed into STEM-focused professional learning communities (PLC). The concept of a STEM Academy as a STEM-focused PLC emphasizes the development of a STEM culture where professional discourse and teaching are focused on STEM learning. The STEM Academies examined used the STEM Academy Measurement Tool and Rubric (Tool) as a catalyst for discussion and change. This Tool was developed with input from stakeholders and used for school-wide initiatives, teacher professional development and K-12 student engagement to improve STEM teaching and learning. Two primary goals of this study were to assess the levels of awareness and use of the tool by all stakeholders involved in the project and to determine how the Tool assisted in the development and advancement of these schools as STEM PLCs. Data from the STEM Academy Participant Survey were analyzed to determine stakeholders' perceptions of the Tool in terms of (i) how aware stakeholders were of the Tool, (ii) whether they participated in the use of the Tool, (iii) how the characteristics of PLCs were perceived in their schools, and finally (iv) how the awareness of the Tool influenced teachers' perceptions of the presence of PLC characteristics. Findings indicate that school faculty were aware of the Tool on a number of different levels, and evidence exists that the use of the Tool assisted in the development of STEM Academies; however, impact varied from school to school. Implications of this study suggest that the survey should be used for a longer period of time to gain more in-depth knowledge on teachers' perceptions of the Tool as a catalyst across time. Additional findings indicate that the process for using the Tool should be ongoing and involve the stakeholders to have the greatest impact on school culture. This research contributes to the knowledge base related to building STEM PLCs aimed at improving K-12 teacher content and pedagogical knowledge as well as student learning and achievement in STEM education.

  19. Knowledge Preservation for Design of Rocket Systems

    NASA Technical Reports Server (NTRS)

    Moreman, Douglas

    2002-01-01

    An engineer at NASA Lewis RC presented a challenge to us at Southern University. Our response to that challenge, stated circa 1993, has evolved into the Knowledge Preservation Project which is here reported. The stated problem was to capture some of the knowledge of retiring NASA engineers and make it useful to younger engineers via computers. We evolved that initial challenge to this - design a system of tools such that, with this system, people might efficiently capture and make available via commonplace computers, deep knowledge of retiring NASA engineers. In the process of proving some of the concepts of this system, we would (and did) capture knowledge from some specific engineers and, so, meet the original challenge along the way to meeting the new. Some of the specific knowledge acquired, particularly that on the RL-10 engine, was directly relevant to design of rocket engines. We considered and rejected some of the techniques popular in the days we began - specifically "expert systems" and "oral histories". We judged that these old methods had too high a cost per sentence preserved. That cost could be measured in hours of labor of a "knowledge professional". We did spend, particularly in the grant preceding this one, some time creating a couple of "concept maps", one of the latest ideas of the day, but judged this also to be costly in time of a specially trained knowledge-professional. We reasoned that the cost in specialized labor could be lowered if less time were spent being selective about sentences from the engineers and in crafting replacements for those sentences. The trade-off would seem to be that our set of sentences would be less dense in information, but we found a computer-based way around this seeming defect. Our plan, details of which we have been carrying out, was to find methods of extracting information from experts which would be capable of gaining cooperation, and interest, of senior engineers and using their time in a way they would find worthy (and, so, they would give more of their time and recruit time of other engineers as well). We studied these four ways of creating text: 1) the old way, via interviews and discussions - one of our team working with one expert, 2) a group-discussion led by one of the experts themselves and on a topic which inspires interaction of the experts, 3) a spoken dissertation by one expert practiced in giving talks, 4) expropriating, and modifying for our system, some existing reports (such as "oral histories" from the Smithsonian Institution).

  20. Using CLIPS in the domain of knowledge-based massively parallel programming

    NASA Technical Reports Server (NTRS)

    Dvorak, Jiri J.

    1994-01-01

    The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.

  1. Peregrine Sustainer Motor Development

    NASA Technical Reports Server (NTRS)

    Brodell, Chuck; Franklin, Philip

    2015-01-01

    The Peregrine sounding rocket is an in-house NASA design that provides approximately 15 percent better performance than the motor it replaces. The design utilizes common materials and well-characterized architecture to reduce flight issues encountered with the current motors. It engages NASA designers, analysts, test engineers and technicians, ballisticians, and systems engineers. The in-house work and collaboration within the government provide flexibility to efficiently accommodate design and program changes as the design matures and enhance the ability to meet schedule milestones. It provides a valuable tool to compare industry costs and develop contracts, and it develops foundational knowledge for the next generation of NASA engineers.

  2. Collaborative search in electronic health records.

    PubMed

    Zheng, Kai; Mei, Qiaozhu; Hanauer, David A

    2011-05-01

    A full-text search engine can be a useful tool for augmenting the reuse value of unstructured narrative data stored in electronic health records (EHR). A prominent barrier to the effective utilization of such tools originates from users' lack of search expertise and/or medical-domain knowledge. To mitigate the issue, the authors experimented with a 'collaborative search' feature through a homegrown EHR search engine that allows users to preserve their search knowledge and share it with others. This feature was inspired by the success of many social information-foraging techniques used on the web that leverage users' collective wisdom to improve the quality and efficiency of information retrieval. The authors conducted an empirical evaluation study over a 4-year period. The user sample consisted of 451 academic researchers, medical practitioners, and hospital administrators. The data were analyzed using a social-network analysis to delineate the structure of the user collaboration networks that mediated the diffusion of knowledge of search. The users embraced the concept with considerable enthusiasm. About half of the EHR searches processed by the system (0.44 million) were based on stored search knowledge; 0.16 million utilized shared knowledge made available by other users. The social-network analysis results also suggest that the user-collaboration networks engendered by the collaborative search feature played an instrumental role in enabling the transfer of search knowledge across people and domains. Applying collaborative search, a social information-foraging technique popularly used on the web, may provide the potential to improve the quality and efficiency of information retrieval in healthcare.
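
    As a minimal illustration of the collaborative-search idea (hypothetical data structure, not the authors' EHR search engine), the sketch below lets users save named queries, share them, and reuse shared queries while counting reuse, which is the kind of trace the social-network analysis above would draw on.

      # Minimal sketch of the collaborative-search idea (hypothetical structure, not
      # the authors' EHR search engine): users save named queries, share them, and
      # reuse of shared queries is counted to trace knowledge diffusion.
      from collections import defaultdict

      class SearchKnowledgeStore:
          def __init__(self):
              self.saved = {}                      # (owner, name) -> query string
              self.shared = set()                  # (owner, name) marked shareable
              self.reuse_count = defaultdict(int)  # (owner, name) -> times reused

          def save(self, owner, name, query, share=False):
              self.saved[(owner, name)] = query
              if share:
                  self.shared.add((owner, name))

          def reuse(self, user, owner, name):
              key = (owner, name)
              if user != owner and key not in self.shared:
                  raise PermissionError("query not shared")
              self.reuse_count[key] += 1
              return self.saved[key]

      store = SearchKnowledgeStore()
      store.save("dr_a", "possible_dvt", '"deep vein thrombosis" OR DVT', share=True)
      print(store.reuse("dr_b", "dr_a", "possible_dvt"))
      print(store.reuse_count[("dr_a", "possible_dvt")])   # 1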

  3. Collaborative search in electronic health records

    PubMed Central

    Mei, Qiaozhu; Hanauer, David A

    2011-01-01

    Objective A full-text search engine can be a useful tool for augmenting the reuse value of unstructured narrative data stored in electronic health records (EHR). A prominent barrier to the effective utilization of such tools originates from users' lack of search expertise and/or medical-domain knowledge. To mitigate the issue, the authors experimented with a ‘collaborative search’ feature through a homegrown EHR search engine that allows users to preserve their search knowledge and share it with others. This feature was inspired by the success of many social information-foraging techniques used on the web that leverage users' collective wisdom to improve the quality and efficiency of information retrieval. Design The authors conducted an empirical evaluation study over a 4-year period. The user sample consisted of 451 academic researchers, medical practitioners, and hospital administrators. The data were analyzed using a social-network analysis to delineate the structure of the user collaboration networks that mediated the diffusion of knowledge of search. Results The users embraced the concept with considerable enthusiasm. About half of the EHR searches processed by the system (0.44 million) were based on stored search knowledge; 0.16 million utilized shared knowledge made available by other users. The social-network analysis results also suggest that the user-collaboration networks engendered by the collaborative search feature played an instrumental role in enabling the transfer of search knowledge across people and domains. Conclusion Applying collaborative search, a social information-foraging technique popularly used on the web, may provide the potential to improve the quality and efficiency of information retrieval in healthcare. PMID:21486887

  4. Towards Evolutional Authoring Support Systems

    ERIC Educational Resources Information Center

    Aroyo, Lora; Mizoguchi, Riichiro

    2004-01-01

    The ultimate aim of this research is to specify and implement a general authoring framework for content and knowledge engineering for Intelligent Educational Systems (IES). In this context we attempt to develop an authoring tool supporting this framework that is powerful in its functionality, generic in its support of instructional strategies and…

  5. Propelling arboriculture into the future

    Treesearch

    E. Gregory McPherson

    2011-01-01

    Research is the engine that propels arboriculture and urban forestry into the future. New knowledge, technologies, and tools provide arborists with improved tree care practices that result in healthier urban forests. The ISA Science and Research Committee (SRC) is composed of 13 professionals and researchers who are dedicated to elevating the importance of research...

  6. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    NASA Astrophysics Data System (ADS)

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  7. IntegromeDB: an integrated system and biological search engine

    PubMed Central

    2012-01-01

    Background With the growth of biological data in volume and heterogeneity, web search engines become key tools for researchers. However, general-purpose search engines are not specialized for the search of biological data. Description Here, we present an approach to developing a biological web search engine based on Semantic Web technologies and demonstrate its implementation for retrieving gene- and protein-centered knowledge. The engine is available at http://www.integromedb.org. Conclusions The IntegromeDB search engine allows scanning data on gene regulation, gene expression, protein-protein interactions, pathways, metagenomics, mutations, diseases, and other gene- and protein-related data that are automatically retrieved from publicly available databases and web pages using biological ontologies. To perfect the resource design and usability, we welcome and encourage community feedback. PMID:22260095

  8. ISLE: Intelligent Selection of Loop Electronics. A CLIPS/C++/INGRES integrated application

    NASA Technical Reports Server (NTRS)

    Fischer, Lynn; Cary, Judson; Currie, Andrew

    1990-01-01

    The Intelligent Selection of Loop Electronics (ISLE) system is an integrated knowledge-based system that is used to configure, evaluate, and rank possible network carrier equipment known as Digital Loop Carrier (DLC), which will be used to meet the demands of forecasted telephone services. Determining the best carrier systems and carrier architectures, while minimizing the cost, meeting corporate policies, and addressing area service demands, has become a formidable task. Network planners and engineers use the ISLE system to assist them in this task of selecting and configuring the appropriate loop electronics equipment for future telephone services. The ISLE application is an integrated system consisting of a knowledge base, implemented in CLIPS (a planner application), C++, and an object database created from existing INGRES database information. The embeddability, performance, and portability of CLIPS provided us with a tool with which to capture, clarify, and refine corporate knowledge and distribute this knowledge within a larger functional system to network planners and engineers throughout U S WEST.
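
    To make the configure-evaluate-rank loop concrete, the sketch below (illustrative only; the field names, constraints, and costs are hypothetical and not ISLE's actual knowledge base) filters candidate DLC configurations against a line-demand forecast and a policy flag, then ranks the feasible ones by cost.

      # Illustrative sketch (not ISLE's actual logic): filter candidate Digital Loop
      # Carrier (DLC) configurations against demand and policy constraints, then
      # rank the survivors by total cost.
      from dataclasses import dataclass

      @dataclass
      class DLCConfig:
          name: str
          channels: int          # subscriber lines supported
          cost: float            # installed cost, arbitrary units
          meets_policy: bool     # e.g. approved vendor / architecture

      def rank_configs(candidates, forecast_lines):
          feasible = [c for c in candidates
                      if c.channels >= forecast_lines and c.meets_policy]
          return sorted(feasible, key=lambda c: c.cost)

      candidates = [
          DLCConfig("carrier-A", channels=96,  cost=120.0, meets_policy=True),
          DLCConfig("carrier-B", channels=672, cost=310.0, meets_policy=True),
          DLCConfig("carrier-C", channels=672, cost=250.0, meets_policy=False),
      ]
      for c in rank_configs(candidates, forecast_lines=200):
          print(c.name, c.cost)   # carrier-B is the only feasible choice here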

  9. Culture, social networks, and information sharing: An exploratory study of Japanese aerospace engineers' information-seeking processes and habits in light of cultural factors

    NASA Astrophysics Data System (ADS)

    Sato, Yuko

    The purpose of this study was to investigate the effects of culture and language on Japanese aerospace engineers' information-seeking processes by both quantitative and qualitative approaches. The Japanese sample consisted of 162 members of the Japan Society for Aeronautical and Space Sciences (JSASS). U.S. aerospace engineers served as a reference point, consisting of 213 members of the American Institute of Aeronautics and Astronautics (AIAA). The survey method was utilized in gathering data using self-administered mail questionnaires in order to explore the following eight areas: (1) the content and use of information resources; (2) production and use of information products; (3) methods of accessing information service providers; (4) foreign language skills; (5) studying/researching/collaborating abroad as a tool in expanding information resources; (6) scientific and technical societies as networking tools; (7) alumni associations (school/class reunions) as networking tools; and (8) social, corporate, civic and health/fitness clubs as networking tools. Nine Japanese cultural factors expressed as statements about Japanese society are as follows: (1) information is neither autonomous, objective, nor independent of the subject of cognition; (2) information and knowledge are not readily accessible to the public; (3) emphasis on groups is reinforced in a hierarchical society; (4) social networks thrive as information-sharing vehicles; (5) high context is a predominant form of communication in which most of the information is already in the person, while very little is in the coded, transmitted part of the message; (6) obligations based on mutual trust dictate social behaviors instead of contractual agreements; (7) a surface message is what is presented while a bottom-line message is true feeling privately held; (8) various religious beliefs uphold a work ethic based on harmony; (9) ideas from outside are readily assimilated into its own society. The result of the investigation showed that culture and language affect Japanese aerospace engineers' information-seeking processes. The awareness and the knowledge of such effects will lead to improvement in global information services in aerospace engineering by incorporating various information resource providing organizations.

  10. Designing and Evaluating a Climate Change Course for Upper-Division Engineers and Scientists

    NASA Astrophysics Data System (ADS)

    Samson, P. J.

    2002-12-01

    AOSS 300, GLOBAL ENVIRONMENTAL IMPACT OF TECHNOLOGICAL CHANGE, was created to provide a mechanism for scientific exploration of the unexpected global environmental side effects of technological innovation, with emphasis on issues of the atmosphere and oceans. The course is specifically designed to contribute to the desired Accreditation Board for Engineering and Technology (ABET) outcomes that engineering and science graduates possess "the broad education necessary to understand the impact of solutions in a global and societal context." To facilitate this new course, a new suite of coupled Flash/PHP/MySQL tools has been created that allows personalization of the students' learning space and interaction with faculty. Using these tools, students are challenged to actively participate in the construction of knowledge through development of on-line portfolios that influence course content. This paper reports on lessons learned in the first semester that will guide further course development.

  11. Transmission Planning Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects six countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge, without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. The Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses, and QV analyses across many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in effective decision-making or in the off-line study environment.
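
    As a small illustration of the grouping idea (assumed data layout, not the tool's actual PSS/E interface or file formats), the sketch below aggregates per-scenario contingency results and flags buses that violate a voltage limit in many cases, i.e. candidate weak points.

      # Illustrative sketch (assumed data layout, not the tool's PSS/E interface):
      # aggregate per-scenario contingency results and flag buses that violate
      # voltage limits across many cases.
      from collections import Counter

      # Each record: (scenario, contingency, bus, post-contingency voltage in p.u.)
      results = [
          ("dry-season", "line-101-out", "BUS-7",  0.91),
          ("dry-season", "line-205-out", "BUS-7",  0.89),
          ("wet-season", "line-101-out", "BUS-7",  0.93),
          ("wet-season", "line-101-out", "BUS-12", 0.97),
      ]

      V_MIN = 0.95        # assumed planning limit in per unit
      violations = Counter(bus for _, _, bus, v in results if v < V_MIN)

      for bus, count in violations.most_common():
          print(f"{bus}: below {V_MIN} p.u. in {count} contingency cases")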

  12. The Software Management Environment (SME)

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  13. The use of natural language processing on pediatric diagnostic radiology reports in the electronic health record to identify deep venous thrombosis in children.

    PubMed

    Gálvez, Jorge A; Pappas, Janine M; Ahumada, Luis; Martin, John N; Simpao, Allan F; Rehman, Mohamed A; Witmer, Char

    2017-10-01

    Venous thromboembolism (VTE) is a potentially life-threatening condition that includes both deep vein thrombosis (DVT) and pulmonary embolism. We sought to improve detection and reporting of children with a new diagnosis of VTE by applying natural language processing (NLP) tools to radiologists' reports. We validated an NLP tool, Reveal NLP (Health Fidelity Inc, San Mateo, CA) and inference rules engine's performance in identifying reports with deep venous thrombosis using a curated set of ultrasound reports. We then configured the NLP tool to scan all available radiology reports on a daily basis for studies that met criteria for VTE between July 1, 2015, and March 31, 2016. The NLP tool and inference rules engine correctly identified 140 out of 144 reports with positive DVT findings and 98 out of 106 negative reports in the validation set. The tool's sensitivity was 97.2% (95% CI 93-99.2%), specificity was 92.5% (95% CI 85.7-96.7%). Subsequently, the NLP tool and inference rules engine processed 6373 radiology reports from 3371 hospital encounters. The NLP tool and inference rules engine identified 178 positive reports and 3193 negative reports with a sensitivity of 82.9% (95% CI 74.8-89.2) and specificity of 97.5% (95% CI 96.9-98). The system functions well as a safety net to screen patients for HA-VTE on a daily basis and offers value as an automated, redundant system. To our knowledge, this is the first pediatric study to apply NLP technology in a prospective manner for HA-VTE identification.
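
    The validation-set counts quoted above (140 of 144 positive reports and 98 of 106 negative reports correctly identified) reproduce the reported sensitivity and specificity directly; the short sketch below recomputes them, using a Wilson score interval as one common choice for the 95% CI (the paper's exact interval method is not stated here, so that part is an assumption).

      # Recompute the validation-set operating characteristics from the counts above.
      # The Wilson score interval is one common 95% CI choice; the paper's exact
      # method is not stated here, so treat the interval as an assumption.
      from math import sqrt

      def wilson_ci(successes, n, z=1.96):
          p = successes / n
          denom = 1 + z**2 / n
          centre = (p + z**2 / (2 * n)) / denom
          half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
          return centre - half, centre + half

      sens = 140 / 144   # true positives / all positive reports -> 97.2%
      spec = 98 / 106    # true negatives / all negative reports -> 92.5%
      print(f"sensitivity {sens:.1%}, 95% CI {wilson_ci(140, 144)}")
      print(f"specificity {spec:.1%}, 95% CI {wilson_ci(98, 106)}")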

  14. An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process

    NASA Astrophysics Data System (ADS)

    Nguyen, ThanhDat; Kifor, Claudiu Vasile

    2015-09-01

    DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, it is difficult to access DMAIC knowledge. Conventional approaches meet a problem arising from the structuring and reuse of DMAIC knowledge. The main reason is that DMAIC knowledge is not represented and organized systematically. In this article, we overcome the problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each DMAIC phase. We build five different knowledge bases for storing all knowledge of the DMAIC phases, with the support of necessary tools and appropriate techniques from the information technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution in order to share and reuse existing knowledge.

  15. Integrated Risk and Knowledge Management Program -- IRKM-P

    NASA Technical Reports Server (NTRS)

    Lengyel, David M.

    2009-01-01

    The NASA Exploration Systems Mission Directorate (ESMD) IRKM-P tightly couples risk management and knowledge management processes and tools to produce an effective "modern" work environment. IRKM-P objectives include: (1) to learn lessons from past and current programs (Apollo, Space Shuttle, and the International Space Station); (2) to generate and share new engineering design, operations, and management best practices through preexisting Continuous Risk Management (CRM) procedures and knowledge-management practices; and (3) to infuse those lessons and best practices into current activities. The conceptual framework of the IRKM-P is based on the assumption that risks highlight potential knowledge gaps that might be mitigated through one or more knowledge management practices or artifacts. These same risks also serve as cues for the collection of knowledge, particularly knowledge of technical or programmatic challenges that might recur.

  16. Computational algorithm to evaluate product disassembly cost index

    NASA Astrophysics Data System (ADS)

    Zeid, Ibrahim; Gupta, Surendra M.

    2002-02-01

    Environmentally conscious manufacturing is an important paradigm in today's engineering practice. Disassembly is a crucial factor in implementing this paradigm. Disassembly allows the reuse and recycling of parts and products that have reached the end of their life cycle. There are many questions that must be answered before a disassembly decision can be reached. The most important question is economic: the cost of disassembly versus the cost of scrapping a product is always considered. This paper develops a computational tool that allows decision-makers to calculate the disassembly cost of a product. The tool makes it simple to perform 'what if' scenarios fairly quickly. The tool is Web based and has two main parts. The front-end part is a Web page and runs on the client side in a Web browser, while the back-end part is a disassembly engine (servlet) that has disassembly knowledge and costing algorithms and runs on the server side. The tool is based on the client/server model that is pervasively utilized throughout the World Wide Web. An example is used to demonstrate the implementation and capabilities of the tool.
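
    Since the costing algorithm itself is not given above, the sketch below shows only the general flavor of such a 'what if' calculation (all part data, rates, and the index formula are hypothetical): per-part disassembly labor cost is compared against recovered value and against simply scrapping the product.

      # Illustrative 'what if' calculation (hypothetical data and formula; the
      # paper's actual costing algorithm is not reproduced here): compare total
      # disassembly cost against recovered value for a product's parts.
      parts = [
          # (part, disassembly time in minutes, recovered value)
          ("housing", 4.0, 1.50),
          ("motor",   9.0, 8.00),
          ("pcb",     6.0, 5.25),
      ]
      LABOR_RATE = 0.60    # cost per minute of disassembly labor
      SCRAP_VALUE = 3.00   # value of scrapping the whole product instead

      disassembly_cost = sum(t * LABOR_RATE for _, t, _ in parts)
      recovered_value = sum(v for _, _, v in parts)
      index = recovered_value / disassembly_cost   # > 1 favors disassembly

      print(f"cost {disassembly_cost:.2f}, recovered {recovered_value:.2f}, index {index:.2f}")
      print("disassemble" if recovered_value - disassembly_cost > SCRAP_VALUE else "scrap")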

  17. Symposium on Automation, Robotics and Advanced Computing for the National Space Program (2nd) Held in Arlington, Virginia on 9-11 March 1987

    DTIC Science & Technology

    1988-02-28

    [The abstract for this record is garbled in the source scan. Recoverable fragments mention the enormous investment in software and the need for better methodologies, tools, and theories; studies using scanning electron microscopy (SEM) and optical microscopy; and a phased knowledge engineering methodology involving NASA Ames Research Center (ARC) and NASA Johnson Space Center (JSC), consisting of prototype knowledge base development.]

  18. Learning Resources Organization Using Ontological Framework

    NASA Astrophysics Data System (ADS)

    Gavrilova, Tatiana; Gorovoy, Vladimir; Petrashen, Elena

    The paper describes the ontological approach to knowledge structuring for e-learning portal design, as it turns out to be efficient and relevant to current domain conditions. It is primarily based on the visual ontology-based description of the content of the learning materials, and this helps to provide productive and personalized access to these materials. The experience of developing an ontology for the Knowledge Engineering course at St. Petersburg State University is discussed, and the “OntolingeWiki” tool for creating ontology-based e-learning portals is described.

  19. EDNA: Expert fault digraph analysis using CLIPS

    NASA Technical Reports Server (NTRS)

    Dixit, Vishweshwar V.

    1990-01-01

    Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate, and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find. Available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in vast research effort and analytical techniques. The tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is a viable approach to blend the advantages of the representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge. An expert system, to capture the engineering knowledge, is essential. We propose an approach here, namely, expert network analysis. We combine the digraph representation and tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge. Mixed analysis, with some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With the combined approach, the tool is expected to be valuable in the design process as well as in the capture of final design knowledge.
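
    As a minimal illustration of probabilistic analysis over a fault digraph (not EDNA itself; the structure, probabilities, and OR-gate semantics below are assumptions made only for the example), the sketch propagates basic-event failure probabilities through an acyclic portion of a digraph; handling directed cycles is precisely what the digraph-to-tree transformation discussed above addresses.

      # Minimal sketch (not EDNA): propagate failure probabilities through an
      # acyclic fault digraph where a node fails if any of its direct causes
      # fails (OR semantics, independent events).
      from functools import lru_cache

      # edges: cause -> effects; basic events carry their own probabilities.
      causes = {
          "pump_fails":   ["loss_of_flow"],
          "valve_stuck":  ["loss_of_flow"],
          "loss_of_flow": ["engine_shutdown"],
      }
      basic_prob = {"pump_fails": 0.01, "valve_stuck": 0.02}

      # invert to: effect -> list of direct causes
      direct_causes = {}
      for cause, effects in causes.items():
          for e in effects:
              direct_causes.setdefault(e, []).append(cause)

      @lru_cache(maxsize=None)
      def p_fail(node):
          if node in basic_prob:
              return basic_prob[node]
          p_ok = 1.0
          for c in direct_causes.get(node, []):
              p_ok *= 1.0 - p_fail(c)    # OR of independent causes
          return 1.0 - p_ok

      print(f"P(engine_shutdown) = {p_fail('engine_shutdown'):.4f}")   # ~0.0298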

  20. The Environmental Control and Life Support System (ECLSS) advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, Ray

    1990-01-01

    The objective of the environmental control and life support system (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is to capture ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and life-cycle tools. Automated knowledge acquisition, engineering, verification, and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.

  1. Convergence and translation: attitudes to inter-professional learning and teaching of creative problem-solving among medical and engineering students and staff.

    PubMed

    Spoelstra, Howard; Stoyanov, Slavi; Burgoyne, Louise; Bennett, Deirdre; Sweeney, Catherine; Drachsler, Hendrik; Vanderperren, Katrien; Van Huffel, Sabine; McSweeney, John; Shorten, George; O'Flynn, Siun; Cantillon-Murphy, Padraig; O'Tuathaigh, Colm

    2014-01-22

    Healthcare worldwide needs translation of basic ideas from engineering into the clinic. Consequently, there is increasing demand for graduates equipped with the knowledge and skills to apply interdisciplinary medicine/engineering approaches to the development of novel solutions for healthcare. The literature provides little guidance regarding barriers to, and facilitators of, effective interdisciplinary learning for engineering and medical students in a team-based project context. A quantitative survey was distributed to engineering and medical students and staff in two universities, one in Ireland and one in Belgium, to chart knowledge and practice in interdisciplinary learning and teaching, and of the teaching of innovation. We report important differences for staff and students between the disciplines regarding attitudes towards, and perceptions of, the relevance of interdisciplinary learning opportunities, and the role of creativity and innovation. There was agreement across groups concerning preferred learning, instructional styles, and module content. Medical students showed greater resistance to the use of structured creativity tools and interdisciplinary teams. The results of this international survey will help to define the optimal learning conditions under which undergraduate engineering and medicine students can learn to consider the diverse factors which determine the success or failure of a healthcare engineering solution.

  2. Convergence and translation: attitudes to inter-professional learning and teaching of creative problem-solving among medical and engineering students and staff

    PubMed Central

    2014-01-01

    Background Healthcare worldwide needs translation of basic ideas from engineering into the clinic. Consequently, there is increasing demand for graduates equipped with the knowledge and skills to apply interdisciplinary medicine/engineering approaches to the development of novel solutions for healthcare. The literature provides little guidance regarding barriers to, and facilitators of, effective interdisciplinary learning for engineering and medical students in a team-based project context. Methods A quantitative survey was distributed to engineering and medical students and staff in two universities, one in Ireland and one in Belgium, to chart knowledge and practice in interdisciplinary learning and teaching, and of the teaching of innovation. Results We report important differences for staff and students between the disciplines regarding attitudes towards, and perceptions of, the relevance of interdisciplinary learning opportunities, and the role of creativity and innovation. There was agreement across groups concerning preferred learning, instructional styles, and module content. Medical students showed greater resistance to the use of structured creativity tools and interdisciplinary teams. Conclusions The results of this international survey will help to define the optimal learning conditions under which undergraduate engineering and medicine students can learn to consider the diverse factors which determine the success or failure of a healthcare engineering solution. PMID:24450310

  3. Learners' Performance in Mathematics: A Case Study of Public High Schools, South Africa

    ERIC Educational Resources Information Center

    Mapaire, Lawrence

    2016-01-01

    Mathematics is fundamental to national prosperity in providing tools for understanding science, technology, engineering and economics. It is essential in public decision-making and for participation in the knowledge economy. Mathematics equips pupils with uniquely powerful ways to describe, analyse and change the world. It can stimulate moments of…

  4. A Modular Approach for Teaching Partial Discharge Phenomenon through Experiment

    ERIC Educational Resources Information Center

    Chatterjee, B.; Dey, D.; Chakravorti, S.

    2011-01-01

    Partial discharge (PD) monitoring is an effective predictive maintenance tool for electrical power equipment. As a result, an understanding of the theory related to PD and the associated measurement techniques is now necessary knowledge for power engineers in their professional life. This paper presents a modular course on PD phenomenon in which…

  5. Entertainment-Education and the Ethics of Social Intervention.

    ERIC Educational Resources Information Center

    Cambridge, Vibert; And Others

    More specifically than the general concept of "development," the use of entertainment media as a tool for social intervention implies the purposive utilization of the mass media to engineer specific changes in knowledge, attitudes, or practice. Thus, this type of use of the entertainment media is inseparable from the notion of "what…

  6. Seeking an Online Social Media Radar

    ERIC Educational Resources Information Center

    ter Veen, James

    2014-01-01

    Purpose: The purpose of this paper is to explore how the application of Systems Engineering tools and techniques can be applied to rapidly process and analyze the vast amounts of data present in social media in order to yield practical knowledge for Command and Control (C2) systems. Design/methodology/approach: Based upon comparative analysis of…

  7. Fault diagnosis in orbital refueling operations

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.

    1988-01-01

    Usually, operation manuals are provided to help astronauts during space operations. These manuals include normal and malfunction procedures. Transferring operation manual knowledge into a computerized form is not a trivial task. This knowledge is generally written by designers or operation engineers and is often quite different from the user logic. The latter is usually a compiled version of the former. Experiments are in progress to assess the user logic. HORSES (Human - Orbital Refueling System - Expert System) is an attempt to include both of these logics in the same tool. It is designed to assist astronauts during monitoring and diagnosis tasks. Basically, HORSES includes a situation recognition level coupled to an analytical diagnoser, and a meta-level working on both of the previous levels. HORSES is a good tool for building task models and is also more broadly useful for knowledge design. The presentation consists of an abstract and overhead visuals only.

  8. Big data and new knowledge in medicine: the thinking, training, and tools needed for a learning health system.

    PubMed

    Krumholz, Harlan M

    2014-07-01

    Big data in medicine--massive quantities of health care data accumulating from patients and populations and the advanced analytics that can give those data meaning--hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This article explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately utilized, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. Project HOPE—The People-to-People Health Foundation, Inc.

  9. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Robers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  10. Big Data And New Knowledge In Medicine: The Thinking, Training, And Tools Needed For A Learning Health System

    PubMed Central

    Krumholz, Harlan M.

    2017-01-01

    Big data in medicine--massive quantities of health care data accumulating from patients and populations and the advanced analytics that can give it meaning--hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This paper explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately used, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. PMID:25006142

  11. Enhancements to the KATE model-based reasoning system

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1994-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. This report describes two software efforts which enhance the functionality and usability of KATE. The first addition, a flow solver, gives KATE a tool for modeling the flow of liquid in a pipe system. The second adds support to the Emacs editor for editing KATE knowledge base files. The body of this report discusses design and implementation issues having to do with these two tools. It will be useful to anyone maintaining or extending either the flow solver or the editor enhancements.
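
    To give a feel for the kind of model such a flow solver works with (illustrative only; KATE's actual solver is not reproduced here, and the linear admittance law below is an assumption, since real pipe flow is nonlinear), the sketch computes steady flow through pipes in series under flow = admittance x (p_in - p_out).

      # Minimal sketch (not KATE's flow solver): steady flow through pipes in
      # series under a linearized pressure-flow law, flow = admittance * dp.
      # Real pipe flow is nonlinear; the linear form is an assumption used only
      # to illustrate the kind of model the text describes.

      def series_admittance(admittances):
          """Equivalent admittance of pipes in series (like conductances in series)."""
          return 1.0 / sum(1.0 / a for a in admittances)

      pipes = [2.0, 4.0, 8.0]            # hypothetical admittance per segment
      p_supply, p_return = 300.0, 100.0  # boundary pressures, arbitrary units

      a_eq = series_admittance(pipes)
      flow = a_eq * (p_supply - p_return)

      # Pressure drop across each segment follows from the common flow.
      p = p_supply
      for i, a in enumerate(pipes, 1):
          p -= flow / a
          print(f"after segment {i}: pressure {p:.1f}")
      print(f"total flow {flow:.2f}")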

  12. Kate's Model Verification Tools

    NASA Technical Reports Server (NTRS)

    Morgan, Steve

    1991-01-01

    Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.

  13. How can systems engineering inform the methods of programme evaluation in health professions education?

    PubMed

    Rojas, David; Grierson, Lawrence; Mylopoulos, Maria; Trbovich, Patricia; Bagli, Darius; Brydges, Ryan

    2018-04-01

    We evaluate programmes in health professions education (HPE) to determine their effectiveness and value. Programme evaluation has evolved from use of reductionist frameworks to those addressing the complex interactions between programme factors. Researchers in HPE have recently suggested a 'holistic programme evaluation' aiming to better describe and understand the implications of 'emergent processes and outcomes'. We propose a programme evaluation framework informed by principles and tools from systems engineering. Systems engineers conceptualise complexity and emergent elements in unique ways that may complement and extend contemporary programme evaluations in HPE. We demonstrate how the abstract decomposition space (ADS), an engineering knowledge elicitation tool, provides the foundation for a systems engineering informed programme evaluation designed to capture both planned and emergent programme elements. We translate the ADS tool to use education-oriented language, and describe how evaluators can use it to create a programme-specific ADS through iterative refinement. We provide a conceptualisation of emergent elements and an equation that evaluators can use to identify the emergent elements in their programme. Using our framework, evaluators can analyse programmes not as isolated units with planned processes and planned outcomes, but as unfolding, complex interactive systems that will exhibit emergent processes and emergent outcomes. Subsequent analysis of these emergent elements will inform the evaluator as they seek to optimise and improve the programme. Our proposed systems engineering informed programme evaluation framework provides principles and tools for analysing the implications of planned and emergent elements, as well as their potential interactions. We acknowledge that our framework is preliminary and will require application and constant refinement. We suggest that our framework will also advance our understanding of the construct of 'emergence' in HPE research. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  14. Preface to MOST-ONISW 2009

    NASA Astrophysics Data System (ADS)

    Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil

    Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.

  15. Map of Life - A Dashboard for Monitoring Planetary Species Distributions

    NASA Astrophysics Data System (ADS)

    Jetz, W.

    2016-12-01

    Geographic information about biodiversity is vital for understanding the many services nature provides and their potential changes, yet remains unreliable and often insufficient. By integrating a wide range of knowledge about species distributions and their dynamics over time, Map of Life supports global biodiversity education, monitoring, research, and decision-making. Built on a scalable web platform geared for large biodiversity and environmental data, Map of Life endeavors to provide species range information globally and species lists for any area. With data and technology provided by NASA and Google Earth Engine, tools under development use remote sensing-based environmental layers to enable on-the-fly predictions of species distributions, range changes, and early warning signals for threatened species. The ultimate vision is a globally connected, collaborative knowledge- and tool-base for regional and local biodiversity decision-making, education, monitoring, and projection. For currently available tools, more information, and to follow progress, go to MOL.org.

  16. Design of Smart Educational Robot as a Tool For Teaching Media Based on Contextual Teaching and Learning to Improve the Skill of Electrical Engineering Student

    NASA Astrophysics Data System (ADS)

    Zuhrie, M. S.; Basuki, I.; Asto, B. I. G. P.; Anifah, L.

    2018-04-01

    The development of robotics in Indonesia has been very encouraging; the barometer is the success of the Indonesian Robot Contest. The focus of this research is the manufacture of a teaching module, planning of the mechanical design, a control system based on microprocessor technology, and the maneuverability of the robot. The Contextual Teaching and Learning (CTL) strategy is a concept of learning in which the teacher brings the real world into the classroom and encourages students to make connections between the knowledge they possess and its application in everyday life. The development model used in this research is the 4-D model, which consists of four stages: Define, Design, Develop, and Disseminate. The research applied a design-and-development approach with the aim of producing a learning tool in the form of smart educational robot modules and a kit, based on Contextual Teaching and Learning, at the Department of Electrical Engineering in order to improve the skills of electrical engineering students. Socialization questionnaires showed that the competencies of students majoring in electrical engineering are currently perceived as limited to conventional machines. The average validator assessment was 3.34, which falls in the good category. The developed modules offer hope of producing an intelligent robot tool for teaching in the future.

  17. A knowledge-based system design/information tool for aircraft flight control systems

    NASA Technical Reports Server (NTRS)

    Mackall, Dale A.; Allen, James G.

    1989-01-01

    Research aircraft have become increasingly dependent on advanced control systems to accomplish program goals. These aircraft are integrating multiple disciplines to improve performance and satisfy research objectives. This integration is being accomplished through electronic control systems. Because of the number of systems involved and the variety of engineering disciplines, systems design methods and information management have become essential to program success. The primary objective of the system design/information tool for aircraft flight control systems is to help transfer flight control system design knowledge to the flight test community. By providing all of the design information and covering multiple disciplines in a structured, graphical manner, flight control systems can more easily be understood by the test engineers. This will provide the engineers with the information needed to thoroughly ground test the system and thereby reduce the likelihood of serious design errors surfacing in flight. The secondary objective is to apply structured design techniques to all of the design domains. By using the techniques from the top-level system design down through the detailed hardware and software designs, it is hoped that fewer design anomalies will result. The flight test experiences of three highly complex, integrated aircraft programs are reviewed: the X-29 forward-swept wing, the advanced fighter technology integration (AFTI) F-16, and the highly maneuverable aircraft technology (HiMAT) program. Significant operating anomalies, and the design errors which cause them, are examined to help identify what functions a system design/information tool should provide to assist designers in avoiding errors.

  18. NASA Engine Icing Research Overview: Aeronautics Evaluation and Test Capabilities (AETC) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2015-01-01

    The occurrence of ice accretion within commercial high bypass aircraft turbine engines has been reported by airlines under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion by the engine. The ice crystals can result in degraded engine performance, loss of thrust control, compressor surge or stall, and flameout of the combustor. The Aviation Safety Program at NASA has taken on the technical challenge of a turbofan engine icing caused by ice crystals which can exist in high altitude convective clouds. The NASA engine icing project consists of an integrated approach with four concurrent and ongoing research elements, each of which feeds critical information to the next element. The project objective is to gain understanding of high altitude ice crystals by developing knowledge bases and test facilities for testing full engines and engine components. The first element is to utilize a highly instrumented aircraft to characterize the high altitude convective cloud environment. The second element is the enhancement of the Propulsion Systems Laboratory altitude test facility for gas turbine engines to include the addition of an ice crystal cloud. The third element is basic research of the fundamental physics associated with ice crystal ice accretion. The fourth and final element is the development of computational tools with the goal of simulating the effects of ice crystal ingestion on compressor and gas turbine engine performance. The NASA goal is to provide knowledge to the engine and aircraft manufacturing communities to help mitigate, or eliminate turbofan engine interruptions, engine damage, and failures due to ice crystal ingestion.

  19. NASA's Systems Engineering Approaches for Addressing Public Health Surveillance Requirements

    NASA Technical Reports Server (NTRS)

    Vann, Timi

    2003-01-01

    NASA's systems engineering has its heritage in space mission analysis and design, including the end-to-end approach to managing every facet of the extreme engineering required for successful space missions. NASA sensor technology, understanding of remote sensing, and knowledge of Earth system science can be powerful new tools for improved disease surveillance and environmental public health tracking. NASA's systems engineering framework facilitates the match between partner needs and decision support requirements in the areas of 1) Science/Data; 2) Technology; 3) Integration. Partnerships between NASA and other Federal agencies are diagrammed in this viewgraph presentation. NASA's role in these partnerships is to provide systemic and sustainable solutions that contribute to the measurable enhancement of a partner agency's disease surveillance efforts.

  20. The acquisition and transfer of knowledge of electrokinetic-hydrodynamics (EKHD) fundamentals: an introductory graduate-level course

    NASA Astrophysics Data System (ADS)

    Pascal, Jennifer; Tíjaro-Rojas, Rocío; Oyanader, Mario A.; Arce, Pedro E.

    2017-09-01

    Relevant engineering applications, such as the bioseparation of proteins and DNA, soil cleaning, the motion of colloidal particles in different media, electrical field-based cancer treatments, and the cleaning of surfaces and coating flows, belong to the family of 'Applied Field Sensitive Process Technologies' requiring an external field to move solutes in a fluid within a fibrous (or porous) domain. This field introduces an additional variable that makes the analysis very challenging and presents the student with a number of new problems to solve. A graduate-level course, based on active-learning approaches and High Performance Learning Environments, in which transfer of knowledge plays a key role, was designed by the Chemical Engineering Department at Tennessee Technological University, Cookeville, TN. In this course the fundamental principles of EKHD were taught to science, engineering, and technology students. A significant number of these students were able to grasp the tools required to advance their research projects, which led to numerous technical presentations at professional society meetings and publications in peer-reviewed journals.

  1. Engineering a Cause and Cure to Climate Change; Working a culture change with our Future Engineers.

    NASA Astrophysics Data System (ADS)

    Hudier, E. J. J.

    2014-12-01

    Where scientists unravel the laws of nature, giving the human race the means to remodel its environment, engineers put together the very technologies that give humans this power. Early on, during our first steps through the industrialization era, development was the key word; nature could digest our waste products no matter what. We have managed to tamper with our atmosphere's gas composition, and the climate is slowly remodelling our way of life. Engineers are now expected to be a key part of the solution. Engineering programs have evolved to include new dimensions such as ethics, communication, and environment. We want future engineers to put these dimensions first while working on new machine designs, concepts, and procedures. As undergraduate students with a deep science background, we also want them to be a source of information for their co-workers and more. How well are we getting through? How good will our future engineers be as teachers? This work takes a look at teaching/learning successes, comparing engineering students with students attending an undergraduate program in biology. Methods emphasizing the acquisition of knowledge through lectures and reading assignments are tested along with activities aimed at unraveling the scientific fundamentals behind environmental issues and putting forward original solutions to specific problems. Concept knowledge scores, communication quality, and students' evaluations of the activities are discussed.

  2. The role of discourse in group knowledge construction: A case study of engineering students

    NASA Astrophysics Data System (ADS)

    Kittleson, Julie M.; Southerland, Sherry A.

    2004-03-01

    This qualitative study examined the role of discourse (verbal elements of language) and Discourse (nonverbal elements related to the use of language, such as ways of thinking, valuing, and using tools and technologies) in the process of group knowledge construction of mechanical engineering students. Data included interviews, participant observations, and transcripts from lab sessions of a group of students working on their senior design project. These data were analyzed using discourse analysis focusing on instances of concept negotiation, interaction in which multiple people contribute to the evolving conceptual conversation. In this context, despite instructors' attempts to enhance the collaboration of group members, concept negotiation was rare. In an effort to understand this rarity, we identified themes related to an engineering Discourse, which included participants' assumptions about the purpose of group work, the views about effective groups, and their epistemologies and ontologies. We explore how the themes associated with the engineering Discourse played a role in how and when the group engaged in concept negotiation. We found that underlying ideologies and assumptions related to the engineering Discourse played both facilitating and inhibitory roles related to the group's conceptually based interactions.

  3. STS Case Study Development Support

    NASA Technical Reports Server (NTRS)

    Rosa de Jesus, Dan A.; Johnson, Grace K.

    2013-01-01

    The Shuttle Case Study Collection (SCSC) has been developed using lessons learned documented by NASA engineers, analysts, and contractors. The SCSC provides educators with a new tool to teach real-world engineering processes with the goal of providing unique educational materials that enhance critical thinking, decision-making and problem-solving skills. During this third phase of the project, responsibilities included: the revision of the Hyper Text Markup Language (HTML) source code to ensure all pages follow World Wide Web Consortium (W3C) standards, and the addition and edition of website content, including text, documents, and images. Basic HTML knowledge was required, as was basic knowledge of photo editing software, and training to learn how to use NASA's Content Management System for website design. The outcome of this project was its release to the public.

  4. Methodology for identifying and representing knowledge in the scope of CMM inspection resource selection

    NASA Astrophysics Data System (ADS)

    Martínez, S.; Barreiro, J.; Cuesta, E.; Álvarez, B. J.; González, D.

    2012-04-01

    This paper focuses on the elicitation and structuring of knowledge related to the selection of inspection resources. The final goal is to obtain an informal model of knowledge oriented to inspection planning on coordinate measuring machines. In the first tasks, where knowledge is captured, it is necessary to use tools that make the analysis and structuring of knowledge easier, so that selection rules can be easily stated to configure the inspection resources. In order to store the knowledge, a so-called Onto-Process ontology has been developed. This ontology may be applicable to diverse processes in manufacturing engineering. This paper describes the decomposition of the ontology in terms of general units of knowledge and more specific units for the selection of sensor assemblies in inspection planning with touch sensors.

  5. Fuzzy Logic Engine

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna

    2005-01-01

    The Fuzzy Logic Engine is a software package that enables users to embed fuzzy-logic modules into their application programs. Fuzzy logic is useful as a means of formulating human expert knowledge and translating it into software to solve problems. Fuzzy logic provides flexibility for modeling relationships between input and output information and is distinguished by its robustness with respect to noise and variations in system parameters. In addition, linguistic fuzzy sets and conditional statements allow systems to make decisions based on imprecise and incomplete information. The user of the Fuzzy Logic Engine need not be an expert in fuzzy logic: it suffices to have a basic understanding of how linguistic rules can be applied to the user's problem. The Fuzzy Logic Engine is divided into two modules: (1) a graphical-interface software tool for creating linguistic fuzzy sets and conditional statements and (2) a fuzzy-logic software library for embedding fuzzy processing capability into current application programs. The graphical- interface tool was developed using the Tcl/Tk programming language. The fuzzy-logic software library was written in the C programming language.
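
    As a minimal illustration of the linguistic-rule idea described above (not the Fuzzy Logic Engine's actual C library or graphical tool; the membership functions, rules, and numbers are invented for the example), the sketch below fuzzifies a single input with triangular sets, applies two IF-THEN rules, and defuzzifies with a weighted average.

      # Minimal fuzzy-inference sketch (not the Fuzzy Logic Engine's API):
      # triangular membership functions, two linguistic rules, and a
      # weighted-average defuzzification over singleton outputs.
      def tri(x, a, b, c):
          """Triangular membership function peaking at b over [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def fan_speed(temperature):
          # Fuzzify the input with two linguistic sets.
          cool = tri(temperature, 10, 18, 26)
          hot = tri(temperature, 22, 32, 42)
          # Rules: IF cool THEN speed is low (20); IF hot THEN speed is high (90).
          rules = [(cool, 20.0), (hot, 90.0)]
          num = sum(w * out for w, out in rules)
          den = sum(w for w, _ in rules)
          return num / den if den else 0.0   # weighted-average defuzzification

      for t in (16, 24, 35):
          print(t, "degC ->", round(fan_speed(t), 1))   # 20.0, 51.1, 90.0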

  6. Microplastic Exposure Assessment in Aquatic Environments: Learning from Similarities and Differences to Engineered Nanoparticles.

    PubMed

    Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo

    2017-03-07

    Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments and research into their behavior and fate has been sharply increasing in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferrable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENP and MPs based on their similarities as particulate contaminants, whereas critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MPs environmental risk assessment.

  7. Determination of Gibbs energies of formation in aqueous solution using chemical engineering tools.

    PubMed

    Toure, Oumar; Dussap, Claude-Gilles

    2016-08-01

    Standard Gibbs energies of formation are of primary importance in the field of biothermodynamics. In the absence of directly measured values, thermodynamic calculations are required to determine the missing data. For several biochemical species, this study shows that knowledge of the standard Gibbs energy of formation of the pure compound (in the gaseous, solid or liquid state) makes it possible to determine the corresponding standard Gibbs energy of formation in aqueous solution. To do so, using chemical engineering tools (thermodynamic tables and a model for predicting activity coefficients, solvation Gibbs energies and pKa data), it becomes possible to determine the partial chemical potential of neutral and charged components in real metabolic conditions, even in concentrated mixtures. Copyright © 2016 Elsevier Ltd. All rights reserved.
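
    Since the abstract does not give the authors' exact model, the relation being exploited can only be sketched generically; the expression below assumes a simple mole-fraction activity convention and is not the authors' formulation.

    ```latex
    % Generic sketch, not the authors' exact model: the formation Gibbs energy in
    % aqueous solution follows from the pure-compound value plus a solvation term,
    % and non-ideality enters through an activity coefficient.
    \Delta_f G^{\circ}_{i,\mathrm{aq}} \;\approx\;
        \Delta_f G^{\circ}_{i,\mathrm{pure}} \;+\; \Delta_{\mathrm{solv}} G^{\circ}_{i},
    \qquad
    \mu_i \;=\; \mu_i^{\circ} \;+\; RT \ln\!\left(\gamma_i x_i\right)
    ```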

  8. Explore-create-share study: An evaluation of teachers as curriculum innovators in engineering education

    NASA Astrophysics Data System (ADS)

    Berry, Ayora

    The purpose of this study was to investigate the effects of a curriculum design-based (CDB) professional development model on K-12 teachers' capacity to integrate engineering education in the classroom. This teacher professional development approach differs from other training programs where teachers learn how to use a standard curriculum and adopt it in their classrooms. In a CDB professional development model teachers actively design lessons, student resources, and assessments for their classroom instruction. In other science, technology, engineering and mathematics (STEM) disciplines, CDB professional development has been reported to (a) position teachers as architects of change, (b) provide a professional learning vehicle for educators to reflect on instructional practices and develop content knowledge, (c) inspire a sense of ownership in curriculum decision-making among teachers, and (d) use an instructional approach that is coherent with teachers' interests and professional goals. The CDB professional development program in this study used the Explore-Create-Share (ECS) framework as an instructional model to support teacher-led curriculum design and implementation. To evaluate the impact of the CDB professional development and associated ECS instructional model, three research studies were conducted. In each study, the participants completed a six-month CDB professional development program, the PTC STEM Certificate Program, that included sixty-two instructional contact hours. Participants learned about industry and education engineering concepts, tested engineering curricula, collaborated with K-12 educators and industry professionals, and developed project-based engineering curricula using the ECS framework. The first study evaluated the impact of the CDB professional development program on teachers' engineering knowledge, self-efficacy in designing engineering curriculum, and instructional practice in developing project-based engineering units. The study included twenty-six teachers and data was collected pre-, mid-, and post-program using teacher surveys and a curriculum analysis instrument. The second study evaluated teachers' perceptions of the ECS model as a curriculum authoring tool and the quality of the curriculum units they developed. The study included sixty-two participants and data was collected post-program using teacher surveys and a curriculum analysis instrument. The third study evaluated teachers' experiences implementing ECS units in the classroom with a focus on identifying the benefits, challenges and solutions associated with project-based engineering in the classroom. The study included thirty-one participants and data was collected using an open-ended survey instrument after teachers completed implementation of the ECS curriculum unit. Results of these three studies indicate that teachers can be prepared to integrate engineering in the classroom using a CDB professional development model. Teachers reported an increase in engineering content knowledge, improved their self-efficacy in curriculum planning, and developed high quality instructional units that were aligned to engineering design practices and STEM educational standards. The ECS instructional model was acknowledged as a valuable tool for developing and implementing engineering education in the classroom. 
Teachers reported that ECS curriculum design aligned with their teaching goals, provided a framework to integrate engineering with other subject-area concepts, and incorporated innovative teaching strategies. After implementing ECS units in the classroom, teachers reported that the ECS model engaged students in engineering design challenges that were situated in a real world context and required the application of interdisciplinary content knowledge and skills. Teachers also reported a number of challenges related to scheduling, content alignment, and access to resources. In the face of these obstacles, teachers presented a number of solutions that included optimization of one's teaching practice, being resource savvy, and adopting a growth mindset.

  9. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule-based modeling systems, and a method for converting a procedural model to a rule-based model are described. Simulation models are used to represent real-time engineering systems. A real-time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling-system languages are based on FORTRAN or some other procedural language and must therefore be enhanced with a reaction capability. Rule-based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule-based system can be generated by a knowledge acquisition tool or a source-level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule-based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
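
    As a rough sketch of the idea, assuming a toy model decomposed into simple calculations, the Python fragment below builds a small "knowledge network" in which each rule fires whenever its inputs are present and have changed; the class, rule, and variable names are hypothetical and are not the NASA tooling.

    ```python
    # Hypothetical reactive "knowledge network" of calculations (not the NASA compiler output).

    class Network:
        def __init__(self):
            self.values = {}
            self.rules = []  # each entry: (input names, output name, function)

        def add_rule(self, inputs, output, fn):
            self.rules.append((inputs, output, fn))

        def set(self, name, value):
            self.values[name] = value
            self._propagate()

        def _propagate(self):
            changed = True
            while changed:  # keep firing rules until the network is quiescent
                changed = False
                for inputs, output, fn in self.rules:
                    if all(i in self.values for i in inputs):
                        new = fn(*(self.values[i] for i in inputs))
                        if self.values.get(output) != new:
                            self.values[output] = new
                            changed = True

    # A procedural calculation (force = mass * acceleration) recast as rules,
    # plus a dependent limit check that reacts automatically.
    net = Network()
    net.add_rule(["mass", "accel"], "force", lambda m, a: m * a)
    net.add_rule(["force"], "overload", lambda f: f > 1000.0)
    net.set("mass", 250.0)
    net.set("accel", 5.0)
    print(net.values["force"], net.values["overload"])  # 1250.0 True
    ```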

  10. Case-based Reasoning for Automotive Engine Performance Tune-up

    NASA Astrophysics Data System (ADS)

    Vong, C. M.; Huang, H.; Wong, P. K.

    2010-05-01

    The automotive engine performance tune-up is greatly affected by the calibration of its electronic control unit (ECU). The ECU calibration is traditionally done by trial-and-error method. This traditional method consumes a large amount of time and money because of a large number of dynamometer tests. To resolve this problem, case based reasoning (CBR) is employed, so that an existing and effective ECU setup can be adapted to fit another similar class of engines. The adaptation procedure is done through a more sophisticated step called case-based adaptation (CBA) [1, 2]. CBA is an effective knowledge management tool, which can interactively learn the expert adaptation knowledge. The paper briefly reviews the methodologies of CBR and CBA. Then the application to ECU calibration is described via a case study. With CBR and CBA, the efficiency of calibrating an ECU can be enhanced. A prototype system has also been developed to verify the usefulness of CBR in ECU calibration.
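
    The retrieve-then-adapt cycle at the heart of CBR can be sketched in a few lines; the engine features, ECU setups, and adaptation rule below are invented for illustration and are not the authors' prototype.

    ```python
    # Minimal case-based reasoning sketch (hypothetical features and adaptation rule).

    cases = [  # (engine features, known-good ECU setup)
        ({"displacement": 1.6, "cylinders": 4}, {"ignition_advance": 10.0, "afr": 14.7}),
        ({"displacement": 2.0, "cylinders": 4}, {"ignition_advance": 12.0, "afr": 14.5}),
    ]

    def distance(a, b):
        """Squared distance over shared numeric features."""
        return sum((a[k] - b[k]) ** 2 for k in a)

    def retrieve(query):
        return min(cases, key=lambda case: distance(query, case[0]))

    def adapt(query, case):
        feats, setup = case
        # Toy adaptation rule: scale ignition advance with the displacement ratio.
        ratio = query["displacement"] / feats["displacement"]
        return {"ignition_advance": setup["ignition_advance"] * ratio, "afr": setup["afr"]}

    query = {"displacement": 1.8, "cylinders": 4}
    print(adapt(query, retrieve(query)))
    ```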

  11. Pharmaceutical and industrial protein engineering: where we are?

    PubMed

    Amara, Amro Abd-Al-Fattah

    2013-01-01

    The success of protein engineering (PE), a remarkable new branch of genetic engineering, has been built on a huge amount of information and on the efforts of many scientists, laboratories, working hours, funding sources, companies and other factors. PE concerns the modification of protein structure and function(s), or the building of proteins from scratch, and the engineered proteins usually meet new criteria. Proteins can be engineered at the level of genes or of proteins. PE has found its way into several important sectors, including the industrial, pharmaceutical and medicinal ones. Aspects of PE and its applications are discussed in this review, with a focus on the concept, tools and industrial applications of proteins, engineered proteins and PE. To give an up-to-date view of the applications of PE in basic protein and molecular biology, several examples are discussed. PE can play a significant role in different industrial and pharmaceutical sectors if used wisely and selectively.

  12. Tools and Approaches for the Construction of Knowledge Models from the Neuroscientific Literature

    PubMed Central

    Burns, Gully A. P. C.; Khan, Arshad M.; Ghandeharizadeh, Shahram; O’Neill, Mark A.; Chen, Yi-Shin

    2015-01-01

    Within this paper, we describe a neuroinformatics project (called “NeuroScholar,” http://www.neuroscholar.org/) that enables researchers to examine, manage, manipulate, and use the information contained within the published neuroscientific literature. The project is built within a multi-level, multi-component framework constructed with the use of software engineering methods that themselves provide code-building functionality for neuroinformaticians. We describe the different software layers of the system. First, we present a hypothetical usage scenario illustrating how NeuroScholar permits users to address large-scale questions in a way that would otherwise be impossible. We do this by applying NeuroScholar to a “real-world” neuroscience question: How is stress-related information processed in the brain? We then explain how the overall design of NeuroScholar enables the system to work and illustrate different components of the user interface. We then describe the knowledge management strategy we use to store interpretations. Finally, we describe the software engineering framework we have devised (called the “View-Primitive-Data Model framework,” [VPDMf]) to provide an open-source, accelerated software development environment for the project. We believe that NeuroScholar will be useful to experimental neuroscientists by helping them interact with the primary neuroscientific literature in a meaningful way, and to neuroinformaticians by providing them with useful, affordable software engineering tools. PMID:15055395

  13. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15,000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the 777's needs for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.

  14. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement, but IS development projects still suffer from the poor quality of models produced during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (the enterprise model) is insufficient (Gudas et al. 2004). Involving business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using a specific modeling language.

  15. Spaceflight Safety on the North Coast of America

    NASA Technical Reports Server (NTRS)

    Ciancone, Michael L.; Havenhill, Maria T.; Terlep, Judith A.

    1996-01-01

    Spaceflight Safety (SFS) engineers at NASA Lewis Research Center (LeRC) are responsible for evaluating the microgravity fluids and combustion experiments, payloads and facilities developed at NASA LeRC which are manifested for spaceflight on the Space Shuttle, the Russian space station Mir, and/or the International Space Station (ISS). An ongoing activity at NASA LeRC is the comprehensive training of its SFS engineers through the creation and use of safety tools and processes. Teams of SFS engineers worked on the development of an Internet website (containing a spaceflight safety knowledge database and electronic templates of safety products) and the establishment of a technical peer review process (known as the Safety Assurance for Lewis Spaceflight Activities (SALSA) review).

  16. Acoustical standards in engineering acoustics

    NASA Astrophysics Data System (ADS)

    Burkhard, Mahlon D.

    2004-05-01

    The Engineering Acoustics Technical Committee is concerned with the evolution and improvement of acoustical techniques and apparatus, and with the promotion of new applications of acoustics. As cited in the Membership Directory and Handbook (2002), the interest areas include transducers and arrays; underwater acoustic systems; acoustical instrumentation and monitoring; applied sonics, promotion of useful effects, information gathering and transmission; audio engineering; acoustic holography and acoustic imaging; acoustic signal processing (equipment and techniques); and ultrasound and infrasound. Evident connections between engineering and standards are the needs for calibration, consistent terminology, uniform presentation of data, reference levels, and design targets for product development. Thus, for the acoustical engineer, standards are a tool for practice, for communication, and for comparing one's efforts with those of others. Development of many standards depends on knowledge of the way products are put together for the marketplace, and acoustical engineers provide important input to the development of standards. Acoustical engineers and members of the Engineering Acoustics arm of the Society both benefit from and contribute to the Acoustical Standards of the Acoustical Society.

  17. Development of the Spacecraft Materials Selector Expert System

    NASA Technical Reports Server (NTRS)

    Pippin, H. G.

    2000-01-01

    A specific knowledge base to evaluate the on-orbit performance of selected materials on spacecraft is being developed under contract to the NASA SEE program. An artificial intelligence software package, the Boeing Expert System Tool (BEST), contains an inference engine used to operate knowledge bases constructed to selectively recall and distribute information about materials performance in space applications. This same system is used to make estimates of the environmental exposures expected for a given space flight. The performance capabilities of the Spacecraft Materials Selector (SMS) knowledge base are described in this paper. A case history for a planned flight experiment on ISS is shown as an example of the use of the SMS, and capabilities and limitations of the knowledge base are discussed.

  18. A knowledge authoring tool for clinical decision support.

    PubMed

    Dunsmuir, Dustin; Daniels, Jeremy; Brouse, Christopher; Ford, Simon; Ansermino, J Mark

    2008-06-01

    Anesthesiologists in the operating room are unable to constantly monitor all data generated by physiological monitors. They are further distracted by clinical and educational tasks. An expert system would ideally provide assistance to the anesthesiologist in this data-rich environment. Clinical monitoring expert systems have not been widely adopted, as traditional methods of knowledge encoding require both expert medical and programming skills, making knowledge acquisition difficult. A software application was developed for use as a knowledge authoring tool for physiological monitoring. This application enables clinicians to create knowledge rules without the need for a knowledge engineer or programmer. These rules are designed to provide clinical diagnosis, explanations and treatment advice for optimal patient care to the clinician in real time. By intelligently combining data from physiological monitors and demographic data sources, the expert system can use these rules to assist in monitoring the patient. The knowledge authoring process is simplified by limiting connective relationships between rules. The application is designed to allow open collaboration between communities of clinicians to build a library of rules for clinical use. This design provides clinicians with a system for parameter surveillance and expert advice with a transparent pathway of reasoning. A usability evaluation demonstrated that anesthesiologists can rapidly develop useful rules for use in a predefined clinical scenario.
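
    One plausible shape for such a clinician-authored rule is a list of threshold conditions over monitored parameters plus advice text; the parameters, thresholds, and advice below are illustrative only and do not reflect the tool's actual rule format.

    ```python
    # Hypothetical clinical rule representation and evaluation (illustrative values only).

    rule = {
        "name": "possible hypovolemia",
        "conditions": [
            ("heart_rate", ">", 100),
            ("mean_arterial_pressure", "<", 60),
        ],
        "advice": "Consider fluid bolus; check for ongoing losses.",
    }

    OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

    def evaluate(rule, sample):
        fired = all(OPS[op](sample[param], threshold)
                    for param, op, threshold in rule["conditions"])
        return rule["advice"] if fired else None

    sample = {"heart_rate": 112, "mean_arterial_pressure": 55}
    print(evaluate(rule, sample))
    ```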

  19. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  20. Evolving from bioinformatics in-the-small to bioinformatics in-the-large.

    PubMed

    Parker, D Stott; Gorlick, Michael M; Lee, Christopher J

    2003-01-01

    We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
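
    The make-style dependency idea carries over directly: a target is built only after its prerequisites, and each step runs at most once. The sketch below uses Python only to stay consistent with the other examples in this collection, and the pipeline steps are hypothetical.

    ```python
    # Make-like dependency resolution over hypothetical pipeline steps.

    rules = {
        "alignments":  ([], lambda: print("run aligner")),
        "annotations": ([], lambda: print("fetch annotations")),
        "report":      (["alignments", "annotations"], lambda: print("assemble report")),
    }

    done = set()

    def build(target):
        prerequisites, action = rules[target]
        for prerequisite in prerequisites:
            build(prerequisite)  # satisfy dependencies first
        if target not in done:
            action()
            done.add(target)

    build("report")
    ```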

  1. Effects of 3D Printing Project-based Learning on Preservice Elementary Teachers' Science Attitudes, Science Content Knowledge, and Anxiety About Teaching Science

    NASA Astrophysics Data System (ADS)

    Novak, Elena; Wisdom, Sonya

    2018-05-01

    3D printing technology is a powerful educational tool that can promote integrative STEM education by connecting engineering, technology, and applications of science concepts. Yet, research on the integration of 3D printing technology in formal educational contexts is extremely limited. This study engaged preservice elementary teachers (N = 42) in a 3D Printing Science Project that modeled a science experiment in the elementary classroom on why things float or sink using 3D printed boats. The goal was to explore how collaborative 3D printing inquiry-based learning experiences affected preservice teachers' science teaching self-efficacy beliefs, anxiety toward teaching science, interest in science, perceived competence in K-3 technology and engineering science standards, and science content knowledge. The 3D printing project intervention significantly decreased participants' science teaching anxiety and improved their science teaching efficacy, science interest, and perceived competence in K-3 technological and engineering design science standards. Moreover, an analysis of students' project reflections and boat designs provided an insight into their collaborative 3D modeling design experiences. The study makes a contribution to the scarce body of knowledge on how teacher preparation programs can utilize 3D printing technology as a means of preparing prospective teachers to implement the recently adopted engineering and technology standards in K-12 science education.

  2. A prototype system for perinatal knowledge engineering using an artificial intelligence tool.

    PubMed

    Sokol, R J; Chik, L

    1988-01-01

    Though several perinatal expert systems are extant, the use of artificial intelligence has, as yet, had minimal impact in medical computing. In this evaluation of the potential of AI techniques in the development of a computer-based "Perinatal Consultant," a "top down" approach to the development of a perinatal knowledge base was taken, using as a source for such a knowledge base a 30-page manuscript of a chapter concerning high-risk pregnancy. The UNIX utility "style" was used to parse sentences and obtain key words and phrases, both as part of a natural language interface and to identify key perinatal concepts. Compared with the "gold standard" of sentences containing key facts as chosen by the experts, a semiautomated method using a nonmedical speller to identify key words and phrases in context functioned with a sensitivity of 79%, i.e., approximately 8 in 10 key sentences were detected as the basis for PROLOG rules and facts for the knowledge base. These encouraging results suggest that functional perinatal expert systems may well be expedited by using programming utilities in conjunction with AI tools and published literature.
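
    In a similar spirit, though not the authors' method (they used the UNIX "style" utility together with a non-medical speller), keyword-driven selection of candidate key sentences can be sketched as follows; the keyword list and text are invented.

    ```python
    # Toy keyword-based key-sentence selection (invented keywords and text).

    import re

    keywords = {"preeclampsia", "gestational", "hypertension", "risk"}

    text = ("Preeclampsia complicates many pregnancies. The weather was mild. "
            "Gestational hypertension raises maternal risk.")

    sentences = re.split(r"(?<=[.!?])\s+", text)
    key_sentences = [s for s in sentences
                     if any(k in s.lower() for k in keywords)]
    print(key_sentences)  # drops the sentence with no perinatal keywords
    ```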

  3. Tool for Constructing Data Albums for Significant Weather Events

    NASA Astrophysics Data System (ADS)

    Kulkarni, A.; Ramachandran, R.; Conover, H.; McEniry, M.; Goodman, H.; Zavodsky, B. T.; Braun, S. A.; Wilson, B. D.

    2012-12-01

    Case study analysis and climatology studies are common approaches used in Atmospheric Science research. Research based on case studies involves a detailed description of specific weather events, using data from different sources to characterize the physical processes in play for a given event. Climatology-based research tends to focus on the representativeness of a given event by studying the characteristics and distribution of a large number of events. Gathering relevant data and information for case studies and climatology analysis is tedious and time-consuming; current Earth Science data systems are not suited to assembling multi-instrument, multi-mission datasets around specific events. For example, in hurricane science, finding airborne or satellite data relevant to a given storm requires searching through web pages and data archives. Background information related to damages, deaths, and injuries requires extensive online searches for news reports and official storm summaries. We will present a knowledge synthesis engine to create curated "Data Albums" to support case study analysis and climatology studies. Building such a reusable and scalable knowledge synthesis engine poses several technological challenges. First, how to encode domain knowledge in a machine-usable form? This knowledge must capture what information and data resources are relevant and the semantic relationships between the various fragments of information and data. Second, how to extract semantic information from various heterogeneous sources, including unstructured texts, using the encoded knowledge? Finally, how to design a structured database from the encoded knowledge to store all information and to support querying? The structured database must allow both knowledge overviews of an event and the drill-down capability needed for detailed analysis. An application ontology driven framework is being used to design the knowledge synthesis engine. The knowledge synthesis engine is being applied to build a portal for hurricane case studies at the Global Hydrology and Resource Center (GHRC), a NASA Data Center. This portal will auto-generate Data Albums for specific hurricane events, compiling information from distributed resources such as NASA field campaign collections, relevant data sets, storm reports, pictures, videos and other useful sources.

  4. Artificial intelligence within the chemical laboratory.

    PubMed

    Winkel, P

    1994-01-01

    Various techniques within the area of artificial intelligence, such as expert systems and neural networks, may play a role in the problem-solving processes within the clinical biochemical laboratory. Neural network analysis provides a non-algorithmic approach to information processing, which results in the ability of the computer to form associations and to recognize patterns or classes among data. It belongs to the machine learning techniques, which also include probabilistic techniques such as discriminant function analysis and logistic regression, and information-theoretical techniques. These techniques may be used to extract knowledge from example patients to optimize decision limits and identify clinically important laboratory quantities. An expert system may be defined as a computer program that can give advice in a well-defined area of expertise and is able to explain its reasoning. Declarative knowledge consists of statements about logical or empirical relationships between things. Expert systems typically separate declarative knowledge residing in a knowledge base from the inference engine: an algorithm that dynamically directs and controls the system when it searches its knowledge base. A tool is an expert system without a knowledge base. The developer of an expert system uses a tool by entering knowledge into the system. Many, if not the majority, of the problems encountered at the laboratory level are procedural. A problem is procedural if it is possible to write up a step-by-step description of the expert's work or if it can be represented by a decision tree. To solve problems of this type only small expert system tools and/or conventional programming are required.(ABSTRACT TRUNCATED AT 250 WORDS)
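
    The separation described here, a declarative knowledge base operated on by an inference engine, can be illustrated with a toy forward-chaining loop; the laboratory facts and rules below are invented.

    ```python
    # Toy forward-chaining inference: the knowledge base (facts + rules) is kept
    # separate from the engine loop that applies the rules (invented example).

    facts = {"glucose_high", "fasting_sample"}
    rules = [
        ({"glucose_high", "fasting_sample"}, "suspect_diabetes"),
        ({"suspect_diabetes"}, "recommend_hba1c"),
    ]

    changed = True
    while changed:  # fire rules until no new facts are derived
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True

    print(sorted(facts))
    ```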

  5. System Level Uncertainty Assessment for Collaborative RLV Design

    NASA Technical Reports Server (NTRS)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources by both the government and industry, strategic decision makers need more than just traditional point designs, they need to be aware of the likelihood of these future designs to meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system level output metrics of interest for a RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
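
    A minimal sketch of the probabilistic idea: draw uncertain inputs from assumed distributions, push them through a surrogate model, and report the fraction of draws that meet a target. The distributions, surrogate cost model, and target below are invented and bear no relation to the ACRE-92 study values.

    ```python
    # Hypothetical Monte Carlo propagation of input uncertainty (illustrative numbers only).

    import random

    def cost_model(dry_mass_kg, engine_unit_cost):
        return 0.005 * dry_mass_kg + 2.0 * engine_unit_cost  # toy surrogate, $M

    TARGET = 130.0  # $M
    N = 10_000
    hits = 0
    for _ in range(N):
        dry_mass = random.gauss(20_000, 2_000)             # kg
        engine_cost = random.triangular(8.0, 15.0, 10.0)   # $M (low, high, mode)
        if cost_model(dry_mass, engine_cost) <= TARGET:
            hits += 1

    print(f"P(cost <= target) ~ {hits / N:.2f}")
    ```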

  6. United States Air Force Civil Engineering Additive Manufacturing Applications: Tools and Jigs

    DTIC Science & Technology

    designs for printing applications. The overall results push forward the Air Force's 3D printing knowledge while providing critical information for decision makers on this up-and-coming technology. The results indicate that 3D scanning technology will reach a point within the next 5 years where it can help foster the rapid build-up of 3D CE asset

  7. Alternate Reality Games as an Informal Learning Tool for Generating STEM Engagement among Underrepresented Youth: A Qualitative Evaluation of the Source

    ERIC Educational Resources Information Center

    Gilliam, Melissa; Jagoda, Patrick; Fabiyi, Camille; Lyman, Phoebe; Wilson, Claire; Hill, Brandon; Bouris, Alida

    2017-01-01

    This project developed and studied "The Source," an alternate reality game (ARG) designed to foster interest and knowledge related to science, technology, engineering, and math (STEM) among youth from populations underrepresented in STEM fields. ARGs are multiplayer games that engage participants across several media such as shared…

  8. Rhetorical Impact through Hedging Devices in the "Results and Discussion" Part of a Civil Engineering Research Article

    ERIC Educational Resources Information Center

    Khamesian, Minoo

    2015-01-01

    It is common knowledge that hedging devices as a rhetorical technique common in all persuasive writing are considerably important in scientific discourse, for they are tools which facilitate presenting claims or arguments in a polite, acceptable and respectful manner. In addition, they are discoursal resources available to a scientific writer's…

  9. An Alternative Method To Assess Student's Knowledge about the Concept of Limit in Engineering Teaching.

    ERIC Educational Resources Information Center

    Troncoso, Carlos; Lavalle, Andrea; Curia, Leopoldo; Daniele, Elaine; Chrobak, Ricardo

    The purpose of the present work is to show the evolution of topics or mathematical concepts that are both relevant and highly abstract. This report specifically describes the use of metacognitive tools. These include concept maps, the Gowin heuristic vee, and the clinical interview. They are efficient in showing…

  10. Validation of RetroPath, a computer-aided design tool for metabolic pathway engineering.

    PubMed

    Fehér, Tamás; Planson, Anne-Gaëlle; Carbonell, Pablo; Fernández-Castané, Alfred; Grigoras, Ioana; Dariy, Ekaterina; Perret, Alain; Faulon, Jean-Loup

    2014-11-01

    Metabolic engineering has succeeded in the biosynthesis of numerous commodity and high-value compounds. However, the choice of pathways and enzymes used for production has often been made ad hoc, or has required expert knowledge of the specific biochemical reactions. In order to rationalize the process of engineering producer strains, we developed the computer-aided design (CAD) tool RetroPath, which explores and enumerates metabolic pathways connecting the endogenous metabolites of a chassis cell to the target compound. To experimentally validate our tool, we constructed 12 top-ranked enzyme combinations producing the flavonoid pinocembrin, four of which displayed significant yields. Namely, our tool queried the enzymes found in metabolic databases based on their annotated and predicted activities. Next, it ranked pathways based on the predicted efficiency of the available enzymes, the toxicity of the intermediate metabolites and the calculated maximum product flux. To implement the top-ranking pathway, our procedure narrowed down a list of nine million possible enzyme combinations to 12, a number easily assembled and tested. One round of metabolic network optimization based on RetroPath output further increased pinocembrin titers 17-fold. In total, 12 out of the 13 enzymes tested in this work displayed a relative performance that was in accordance with their predicted scores. These results validate the ranking function of our CAD tool, and open the way to its utilization in the biosynthesis of novel compounds. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
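
    The ranking step can be illustrated with a toy scoring function that rewards predicted enzyme efficiency and maximum flux while penalizing intermediate toxicity; the weights, candidate pathways, and scoring form are invented and are not RetroPath's actual ranking function.

    ```python
    # Toy pathway ranking (invented candidates, weights, and scoring form).

    candidates = [
        {"name": "pathway_A", "enzyme_efficiency": 0.8, "toxicity": 0.2, "max_flux": 1.4},
        {"name": "pathway_B", "enzyme_efficiency": 0.6, "toxicity": 0.1, "max_flux": 2.0},
        {"name": "pathway_C", "enzyme_efficiency": 0.9, "toxicity": 0.7, "max_flux": 1.1},
    ]

    def score(pathway):
        return (0.4 * pathway["enzyme_efficiency"]
                - 0.3 * pathway["toxicity"]
                + 0.3 * pathway["max_flux"])

    for pathway in sorted(candidates, key=score, reverse=True):
        print(pathway["name"], round(score(pathway), 2))
    ```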

  11. Exploiting Expertise and Knowledge Sharing Online for the Benefit of NASA's GN&C Community of Practice

    NASA Technical Reports Server (NTRS)

    Topousis, Daria E.; Lebsock, Kenneth L.; Dennehy, Cornelius J.

    2010-01-01

    In 2004, NASA faced major knowledge sharing challenges due to geographically isolated field centers that inhibited engineers from sharing their experiences, expertise, ideas, and lessons learned. The necessity to collaborate on complex development projects and the reality of constrained project resources together drove the need for ensuring that personnel at all NASA centers had comparable skill sets and that engineers could find resources in a timely fashion. Mission failures and new directions for the Agency also demanded better collaborative tools for NASA's engineering workforce. In response to these needs, the online NASA Engineering Network (NEN) was formed by the NASA Office of the Chief Engineer to provide a multi-faceted system for overcoming geographic and cultural barriers. NEN integrates communities of practice with a cross-repository search and the Lessons Learned Information System. This paper describes the features of the GN&C engineering discipline CoP site which went live on NEN in May of 2008 as an online means of gathering input and guidance from practitioners. It allows GN&C discipline expertise captured at one field center to be shared in a collaborative way with the larger discipline CoP spread across the entire Agency. The site enables GN&C engineers to find the information they need quickly, to find solutions to questions from experienced engineers, and to connect with other practitioners regardless of geographic location, thus increasing the probability of project success.

  12. Robotic Refueling Mission

    NASA Image and Video Library

    2017-12-08

    Goddard's Ritsko Wins 2011 SAVE Award The winner of the 2011 SAVE Award is Matthew Ritsko, a Goddard financial manager. His tool lending library would track and enable sharing of expensive space-flight tools and hardware after projects no longer need them. This set of images represents the types of tools used at NASA. To read more go to: www.nasa.gov/topics/people/features/ritsko-save.html The engineering mockup of the Robotic Refueling Mission (RRM) module is currently on display within the press building at the Kennedy Space Center in Florida. The RRM mission is a joint effort between NASA and the Canadian Space Agency designed to demonstrate and test the tools, technologies, and techniques needed to robotically refuel satellites in space. Reporters have the opportunity to get a close-up view of the replica module and tools that are a part of the final shuttle mission payload. SSCO engineers test an RRM tool. To learn more about the RRM go to: ssco.gsfc.nasa.gov/ NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  13. Utilization of a radiology-centric search engine.

    PubMed

    Sharpe, Richard E; Sharpe, Megan; Siegel, Eliot; Siddiqui, Khan

    2010-04-01

    Internet-based search engines have become a significant component of medical practice. Physicians increasingly rely on information available from search engines as a means to improve patient care, provide better education, and enhance research. Specialized search engines have emerged to more efficiently meet the needs of physicians. Details about the ways in which radiologists utilize search engines have not been documented. The authors categorized every 25th search query in a radiology-centric vertical search engine by radiologic subspecialty, imaging modality, geographic location of access, time of day, use of abbreviations, misspellings, and search language. Musculoskeletal and neurologic imagings were the most frequently searched subspecialties. The least frequently searched were breast imaging, pediatric imaging, and nuclear medicine. Magnetic resonance imaging and computed tomography were the most frequently searched modalities. A majority of searches were initiated in North America, but all continents were represented. Searches occurred 24 h/day in converted local times, with a majority occurring during the normal business day. Misspellings and abbreviations were common. Almost all searches were performed in English. Search engine utilization trends are likely to mirror trends in diagnostic imaging in the region from which searches originate. Internet searching appears to function as a real-time clinical decision-making tool, a research tool, and an educational resource. A more thorough understanding of search utilization patterns can be obtained by analyzing phrases as actually entered as well as the geographic location and time of origination. This knowledge may contribute to the development of more efficient and personalized search engines.
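
    The kind of keyword-based categorization such a log analysis might use can be sketched as follows; the keyword lists and category names are illustrative and are not the authors' coding scheme.

    ```python
    # Toy categorization of search queries by modality keywords (illustrative lists only).

    MODALITY_KEYWORDS = {
        "MRI": ["mri", "magnetic resonance"],
        "CT": ["ct", "computed tomography"],
        "Ultrasound": ["ultrasound", "sonograph"],
    }

    def categorize(query):
        q = query.lower()
        hits = [name for name, keys in MODALITY_KEYWORDS.items()
                if any(k in q for k in keys)]
        return hits or ["uncategorized"]

    print(categorize("MRI knee meniscal tear"))      # ['MRI']
    print(categorize("splenic laceration grading"))  # ['uncategorized']
    ```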

  14. Knowledge engineering tools for reasoning with scientific observations and interpretations: a neural connectivity use case.

    PubMed

    Russ, Thomas A; Ramakrishnan, Cartic; Hovy, Eduard H; Bota, Mihail; Burns, Gully A P C

    2011-08-22

    We address the goal of curating observations from published experiments in a generalizable form; reasoning over these observations to generate interpretations and then querying this interpreted knowledge to supply the supporting evidence. We present web-application software as part of the 'BioScholar' project (R01-GM083871) that fully instantiates this process for a well-defined domain: using tract-tracing experiments to study the neural connectivity of the rat brain. The main contribution of this work is to provide the first instantiation of a knowledge representation for experimental observations called 'Knowledge Engineering from Experimental Design' (KEfED) based on experimental variables and their interdependencies. The software has three parts: (a) the KEfED model editor - a design editor for creating KEfED models by drawing a flow diagram of an experimental protocol; (b) the KEfED data interface - a spreadsheet-like tool that permits users to enter experimental data pertaining to a specific model; (c) a 'neural connection matrix' interface that presents neural connectivity as a table of ordinal connection strengths representing the interpretations of tract-tracing data. This tool also allows the user to view experimental evidence pertaining to a specific connection. BioScholar is built in Flex 3.5. It uses Persevere (a noSQL database) as a flexible data store and PowerLoom® (a mature First Order Logic reasoning system) to execute queries using spatial reasoning over the BAMS neuroanatomical ontology. We first introduce the KEfED approach as a general approach and describe its possible role as a way of introducing structured reasoning into models of argumentation within new models of scientific publication. We then describe the design and implementation of our example application: the BioScholar software. This is presented as a possible biocuration interface and supplementary reasoning toolkit for a larger, more specialized bioinformatics system: the Brain Architecture Management System (BAMS).
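
    The core KEfED idea, keeping a measurement tied to the experimental variables it depends on, can be sketched with plain data structures; the field names and example values below are hypothetical and do not reflect the BioScholar schema.

    ```python
    # Hypothetical sketch of an observation tied to its protocol variables (not the BioScholar schema).

    protocol = {
        "parameters": ["injection_site", "tracer", "survival_time_days"],
        "measurements": ["labeling_density"],
        "dependencies": {
            "labeling_density": ["injection_site", "tracer", "survival_time_days"],
        },
    }

    observation = {
        "injection_site": "central nucleus of the amygdala",
        "tracer": "PHAL",
        "survival_time_days": 14,
        "labeling_density": "moderate",
    }

    # Evidence query: report the measurement together with every variable it depends on.
    deps = protocol["dependencies"]["labeling_density"]
    print({key: observation[key] for key in deps + ["labeling_density"]})
    ```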

  15. Knowledge engineering tools for reasoning with scientific observations and interpretations: a neural connectivity use case

    PubMed Central

    2011-01-01

    Background We address the goal of curating observations from published experiments in a generalizable form; reasoning over these observations to generate interpretations and then querying this interpreted knowledge to supply the supporting evidence. We present web-application software as part of the 'BioScholar' project (R01-GM083871) that fully instantiates this process for a well-defined domain: using tract-tracing experiments to study the neural connectivity of the rat brain. Results The main contribution of this work is to provide the first instantiation of a knowledge representation for experimental observations called 'Knowledge Engineering from Experimental Design' (KEfED) based on experimental variables and their interdependencies. The software has three parts: (a) the KEfED model editor - a design editor for creating KEfED models by drawing a flow diagram of an experimental protocol; (b) the KEfED data interface - a spreadsheet-like tool that permits users to enter experimental data pertaining to a specific model; (c) a 'neural connection matrix' interface that presents neural connectivity as a table of ordinal connection strengths representing the interpretations of tract-tracing data. This tool also allows the user to view experimental evidence pertaining to a specific connection. BioScholar is built in Flex 3.5. It uses Persevere (a noSQL database) as a flexible data store and PowerLoom® (a mature First Order Logic reasoning system) to execute queries using spatial reasoning over the BAMS neuroanatomical ontology. Conclusions We first introduce the KEfED approach as a general approach and describe its possible role as a way of introducing structured reasoning into models of argumentation within new models of scientific publication. We then describe the design and implementation of our example application: the BioScholar software. This is presented as a possible biocuration interface and supplementary reasoning toolkit for a larger, more specialized bioinformatics system: the Brain Architecture Management System (BAMS). PMID:21859449

  16. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    PubMed

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from literature. An essential feature of these databases is the continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool which allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to improve and simplify annotation data imports of user provided data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain specific databases for metabolic engineering.
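
    The synonym-resolution step can be sketched as a lookup from user-supplied names to canonical internal identifiers before import, with unresolved names flagged for manual curation; the identifiers and mapping below are invented, and this is not CycTools' implementation.

    ```python
    # Invented synonym-resolution step ahead of an annotation import (not CycTools' code).

    synonym_to_id = {
        "zmlox1": "GENE-0001",
        "lipoxygenase1": "GENE-0001",
        "zmpal2": "GENE-0002",
    }

    incoming_annotations = [("ZmLOX1", "GO:0000001"), ("ZmPAL2", "GO:0000002")]

    resolved = []
    for name, go_term in incoming_annotations:
        internal_id = synonym_to_id.get(name.lower())
        if internal_id is None:
            print(f"unresolved identifier: {name}")  # flag for manual curation
        else:
            resolved.append((internal_id, go_term))

    print(resolved)
    ```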

  17. A Knowledge-Based and Model-Driven Requirements Engineering Approach to Conceptual Satellite Design

    NASA Astrophysics Data System (ADS)

    Dos Santos, Walter A.; Leonor, Bruno B. F.; Stephany, Stephan

    Satellite systems are becoming even more complex, making technical issues a significant cost driver. The increasing complexity of these systems makes requirements engineering activities both more important and difficult. Additionally, today's competitive pressures and other market forces drive manufacturing companies to improve the efficiency with which they design and manufacture space products and systems. This imposes a heavy burden on systems-of-systems engineering skills and particularly on requirements engineering which is an important phase in a system's life cycle. When this is poorly performed, various problems may occur, such as failures, cost overruns and delays. One solution is to underpin the preliminary conceptual satellite design with computer-based information reuse and integration to deal with the interdisciplinary nature of this problem domain. This can be attained by taking a model-driven engineering approach (MDE), in which models are the main artifacts during system development. MDE is an emergent approach that tries to address system complexity by the intense use of models. This work outlines the use of SysML (Systems Modeling Language) and a novel knowledge-based software tool, named SatBudgets, to deal with these and other challenges confronted during the conceptual phase of a university satellite system, called ITASAT, currently being developed by INPE and some Brazilian universities.

  18. Model-based reasoning for system and software engineering: The Knowledge From Pictures (KFP) environment

    NASA Technical Reports Server (NTRS)

    Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for a FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et. al. 1992). As the work has successfully progressed the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper continues by giving an overview of the graphical modeling objectives of the work, describing the three tools that now populate the KFP environment, briefly presenting a discussion of related work in the field, and by indicating future directions for the KFP environment.

  19. Tested Tools and Techniques for Promoting STEM Programming in Libraries: Fifteen Years of the Lunar and Planetary Institute's Explore Program

    NASA Astrophysics Data System (ADS)

    LaConte, K.; Shipp, S.; Shupla, C.; Shaner, A.; Buxner, S.; Canipe, M.; Jaksha, A.

    2015-11-01

    Libraries are evolving to serve the changing needs of their communities—and many now encompass science, technology, engineering, and mathematics (STEM) programming. For 15 years, the Lunar and Planetary Institute (LPI) has partnered with library staff to create over 100 hands-on Earth and space science and engineering activities. In-person and online librarian training has prepared a vibrant network of over 1000 informal educators. Program evaluation has shown that Explore! training increases participants' knowledge, and that participants actively use Explore! materials and feel more prepared to offer science and engineering experiences and more comfortable using related resources. Through training, participants become more committed to providing and advocating for science and engineering programming. Explore! serves as a model for effective product development and training practices for serving library staff, increasingly our partners in the advancement of STEM education. Specific approaches and tools that contributed to the success of Explore! are outlined here for adoption by community STEM experts—including professionals and hobbyists in STEM fields and STEM educators who are seeking to share their passion and experience with others through partnerships with libraries.

  20. ESA's tools for internal charging

    NASA Astrophysics Data System (ADS)

    Sorensen, J.; Rodgers, D. J.; Ryden, K. A.; Latham, P. M.; Wrenn, G. L.; Levy, L.; Panabiere, G.

    2000-06-01

    Electrostatic discharges, caused by bulk charging of spacecraft insulating materials, are a major cause of satellite anomalies. A quantitative knowledge of the charge build-up is essential in order to eliminate these problems in the design stage. This is a presentation of ESA's tools to assess whether a given structure is liable to experience electrostatic discharges or not. A study has been made of the physical phenomenon, and an engineering specification has been created to be used to assess a structure for potential discharge problems. The specification has been implemented in a new software DICTAT. The implementation of tests in dedicated facilities is an important part of the specification, and tests have been performed to validate the new tool.

  1. MARVEL: A knowledge-based productivity enhancement tool for real-time multi-mission and multi-subsystem spacecraft operations

    NASA Astrophysics Data System (ADS)

    Schwuttke, Ursula M.; Veregge, John R.; Angelino, Robert; Childs, Cynthia L.

    1990-10-01

    The Monitor/Analyzer of Real-time Voyager Engineering Link (MARVEL) is described. It is the first automation tool to be used in an online mode for telemetry monitoring and analysis in mission operations. MARVEL combines standard automation techniques with embedded knowledge base systems to simultaneously provide real time monitoring of data from subsystems, near real time analysis of anomaly conditions, and both real time and non-real time user interface functions. MARVEL is currently capable of monitoring the Computer Command Subsystem (CCS), Flight Data Subsystem (FDS), and Attitude and Articulation Control Subsystem (AACS) for both Voyager spacecraft, simultaneously, on a single workstation. The goal of MARVEL is to provide cost savings and productivity enhancement in mission operations and to reduce the need for constant availability of subsystem expertise.

  2. EMERGING POLLUTANTS, MASS SPECTROMETRY, AND ...

    EPA Pesticide Factsheets

    Historically fundamental to amassing our understanding of environmental processes and chemical pollution is the realm of mass spectrometry (MS) - the mainstay of analytical chemistry - the workhorse that supplies definitive data that environmental scientists and engineers rely upon for identifying molecular compositions (and ultimately structures) of chemicals. While the power of MS has long been visible to the practicing environmental chemist, it borders on obscurity to the lay public and many scientists. While MS has played a long, historic (and largely invisible) role in establishing our knowledge of environmental processes and pollution, what recognition it does enjoy is usually relegated to that of a tool. It is usually the relevance or significance of the knowledge acquired from the application of the tool that has ultimate meaning to the public and science at large - not how the data were acquired.

  3. Novel optical methodologies in studying mechanical signal transduction in mammalian cells

    NASA Technical Reports Server (NTRS)

    Stamatas, G. N.; McIntire, L. V.

    1999-01-01

    For the last three decades, evidence has been accumulating that some types of mammalian cells respond to their mechanically active environment by altering their morphology, growth rate, and metabolism. The study of such responses is very important in understanding physiological and pathological conditions ranging from bone formation to atherosclerosis. Obtaining this knowledge has been the goal of an active research area in bioengineering termed cell mechanotransduction. The advancement of optical methodologies used in cell biology research has provided the tools to elucidate cellular mechanisms that would otherwise be impossible to visualize. Combined with molecular biology techniques, they give engineers invaluable tools for understanding the chemical pathways involved in mechanotransduction. Herein we briefly review the current knowledge on mechanical signal transduction in mammalian cells, focusing on the application of novel optical techniques in the ongoing research.

  4. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experience, theories, typical examples and other related knowledge used in the FEA pre-processing stage were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, together with an integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool column is presented to demonstrate the validity of the system.

  5. An overview of expert systems. [artificial intelligence

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1982-01-01

    An expert system is defined and its basic structure is discussed. The knowledge base, the inference engine, and uses of expert systems are discussed. Architecture is considered, including choice of solution direction, reasoning in the presence of uncertainty, searching small and large search spaces, handling large search spaces by transforming them and by developing alternative or additional spaces, and dealing with time. Existing expert systems are reviewed. Tools for building such systems, construction, and knowledge acquisition and learning are discussed. Centers of research and funding sources are listed. The state-of-the-art, current problems, required research, and future trends are summarized.

  6. An Analysis of Factors that Influence the Success of Expeditionary Civil Engineer Hub-and-spoke Organizations

    DTIC Science & Technology

    2013-03-01

    planning tools, expeditionary airbase location optimization, knowledge transfer at deployment rotation turnover, exercises and evaluations, and others... laying airfield matting; constructing earth berms and dikes for fuel bladders or unsheltered aircraft; modifying existing facilities for alternate... providing all essential utilities; constructing earth berms and access roads for bomb dumps; constructing communication tower foundations

  7. The risk concept and its application in natural hazard risk management in Switzerland

    NASA Astrophysics Data System (ADS)

    Bründl, M.; Romang, H. E.; Bischof, N.; Rheinberger, C. M.

    2009-05-01

    Over the last ten years, a risk-based approach - termed the risk concept - has been introduced to the management of natural hazards in Switzerland. Large natural hazard events, new political initiatives and limited financial resources have led to the development and introduction of new planning instruments and software tools intended to support natural hazard engineers and planners in dealing effectively and efficiently with natural hazards. Our experience with these new instruments suggests an improved integration of the risk concept into the community of natural hazard engineers and planners. Important factors for the acceptance of these new instruments are the integration of end-users during the development process, the knowledge exchange between science, developers and end-users, as well as training and education courses for users. Further improvements require the maintenance of this knowledge exchange and a mindful adaptation of the instruments to case-specific circumstances.

  8. Integrating different knowledge sources and disciplines for practical applications in Forest and Agricultural Engineering

    NASA Astrophysics Data System (ADS)

    Guzmán, Gema; Castillo, Carlos; Taguas, Encarnación

    2013-04-01

    One of the aims of 'The Bologna Process' is to promote among students the acquisition of practical, social and creative skills to face real-life situations and to solve the difficulties they might find during their professional life. It involves an important change in the educational system, from a traditional approach focused on teaching towards a new one that encourages learning. In this context, university teaching implies the design of activities addressed to the dissemination of "know-how" to solve different problems associated with two technical disciplines: Forest and Agricultural Engineering. This study presents a preliminary experience in which a group of information and communication technologies (ICT), such as audiovisual resources (videos, reports and photo galleries), virtual visits to blogs and interactive activities, have been used to provide a comprehensive knowledge of the environmental and sociocultural components of the landscape in order to facilitate the decision-making process in the engineering project context. With these tools, the students must study and characterize all these aspects in order to justify the chosen solutions and the project design. This approach was followed in the analysis of the limiting factors of practical cases in projects about forestation, landscape restoration and hydrological planning. This communication shows how this methodology has been applied in Forest and Agricultural Engineering and the students' experience with these innovative tools. The use of ICTs provided a friendly framework that stimulated students' interest and made subjects more attractive, since it allowed them to assess the complex relationships between landscape, history and economy. Furthermore, this type of activity promotes interdisciplinary training and the acquisition of creative and autonomous skills, which in many cases are not included in the main objectives of the subjects.

  9. Engineering stem cells for future medicine.

    PubMed

    Ricotti, Leonardo; Menciassi, Arianna

    2013-03-01

    Despite their great potential in regenerative medicine applications, stem cells (especially pluripotent ones) currently show limited clinical success, partly due to a lack of biological knowledge, but also due to a lack of specific and advanced technological instruments able to overcome the current boundaries of stem cell functional maturation and safe/effective therapeutic delivery. This paper aims at describing recent insights, current limitations, and future horizons related to therapeutic stem cells, by analyzing the potential of different bioengineering disciplines in bringing stem cells toward a safe clinical use. First, we clarify how and why stem cells should be properly engineered, and what the challenges and benefits connected with this process could be in the near future. Second, we identify different routes toward stem cell differentiation and functional maturation, relying on chemical, mechanical, topographical, and direct/indirect physical stimulation. Third, we highlight how multiscale modeling could strongly support and optimize stem cell engineering. Finally, we focus on future robotic tools that could provide added value in translating basic biological knowledge into clinical applications, by developing ad hoc enabling technologies for stem cell delivery and control.

  10. An advanced search engine for patent analytics in medicinal chemistry.

    PubMed

    Pasche, Emilie; Gobeill, Julien; Teodoro, Douglas; Gaudinat, Arnaud; Vishnykova, Dina; Lovis, Christian; Ruch, Patrick

    2012-01-01

    Patent collections contain a substantial amount of medical-related knowledge, but existing tools were reported to lack useful functionalities. We present here the development of TWINC, an advanced search engine dedicated to patent retrieval in the domain of health and life sciences. Our tool embeds two search modes: an ad hoc search to retrieve relevant patents given a short query and a related patent search to retrieve similar patents given a patent. Both search modes rely on tuning experiments performed during several patent retrieval competitions. Moreover, TWINC is enhanced with interactive modules, such as chemical query expansion, which is of primary importance to cope with various ways of naming biomedical entities. While the related patent search showed promising performance, the ad hoc search produced fairly mixed results. Nonetheless, TWINC performed well during the Chemathlon task of the PatOlympics competition and experts appreciated its usability.

  11. Combinatorial genetic perturbation to refine metabolic circuits for producing biofuels and biochemicals.

    PubMed

    Kim, Hyo Jin; Turner, Timothy Lee; Jin, Yong-Su

    2013-11-01

    Recent advances in metabolic engineering have enabled microbial factories to compete with conventional processes for producing fuels and chemicals. Both rational and combinatorial approaches coupled with synthetic and systematic tools play central roles in metabolic engineering to create and improve a selected microbial phenotype. Compared to knowledge-based rational approaches, combinatorial approaches exploiting biological diversity and high-throughput screening have been demonstrated as more effective tools for improving various phenotypes of interest. In particular, identification of unprecedented targets to rewire metabolic circuits for maximizing yield and productivity of a target chemical has been made possible. This review highlights general principles and the features of the combinatorial approaches using various libraries to implement desired phenotypes for strain improvement. In addition, recent applications that harnessed the combinatorial approaches to produce biofuels and biochemicals will be discussed. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Sensor data validation and reconstruction. Phase 1: System architecture study

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer aided engineering (CAE) package; and conceptually designed an expert system based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell, and the C programming language.
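
    The abstract does not specify the correlations that were developed; as a hedged illustration of the general idea, the sketch below validates one sensor against a correlated parameter with a linear fit to historical data and reconstructs the reading when its residual is implausibly large. The parameter names, data, and threshold are hypothetical, not values from the SSME study.

    ```python
    # Illustrative sketch of correlation-based sensor validation and reconstruction:
    # fit a linear relation between two correlated parameters from historical test
    # data, flag readings whose residual is too large, and reconstruct them from
    # the correlated channel. Data and threshold are invented for illustration.
    import numpy as np

    # Hypothetical historical data: chamber pressure vs. correlated pump speed.
    pump_speed = np.array([90.0, 95.0, 100.0, 105.0, 110.0])
    chamber_p  = np.array([2712.0, 2845.0, 3001.0, 3155.0, 3295.0])

    slope, intercept = np.polyfit(pump_speed, chamber_p, 1)   # empirical correlation
    sigma = np.std(chamber_p - (slope * pump_speed + intercept))

    def validate_and_reconstruct(speed_reading, pressure_reading, k=3.0):
        """Return (is_valid, value): the raw value if plausible, else a reconstruction."""
        predicted = slope * speed_reading + intercept
        if abs(pressure_reading - predicted) <= k * sigma:
            return True, pressure_reading
        return False, predicted        # reconstruct from the correlated parameter

    if __name__ == "__main__":
        print(validate_and_reconstruct(102.0, 3060.0))   # plausible -> kept
        print(validate_and_reconstruct(102.0, 500.0))    # failed sensor -> reconstructed
    ```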

  13. A web-based rapid assessment tool for production publishing solutions

    NASA Astrophysics Data System (ADS)

    Sun, Tong

    2010-02-01

    Solution assessment is a critical first step in understanding and measuring the business process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task that requires substantial domain knowledge, collecting and understanding the specific customer operational context, defining validation scenarios and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report for various estimated performance metrics (e.g. throughput, turnaround time, resource utilization) and operational cost. By integrating the digital publishing workflow ontology and an activity-based costing model with a Petri-net based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side-by-side within their desired operational contexts, and provides a low-cost and rapid assessment for organizations before committing to any purchase. This tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with a proven quantitative simulation and estimation technology.
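
    The paper's simulation engine is Petri-net based; the following is a minimal token-game sketch of a two-step publishing workflow to illustrate the idea. The places, transitions, and workflow structure are hypothetical, not taken from the cited tool.

    ```python
    # Minimal Petri-net token game for a hypothetical two-step publishing workflow
    # (prepress -> print). Places hold tokens (jobs); a transition fires when all
    # of its input places hold a token. Structure and names are illustrative only.

    places = {"queued": 3, "prepress_done": 0, "printed": 0}

    transitions = [
        {"name": "prepress", "inputs": ["queued"], "outputs": ["prepress_done"]},
        {"name": "print",    "inputs": ["prepress_done"], "outputs": ["printed"]},
    ]

    def enabled(t):
        return all(places[p] > 0 for p in t["inputs"])

    def fire(t):
        for p in t["inputs"]:
            places[p] -= 1
        for p in t["outputs"]:
            places[p] += 1

    step = 0
    while any(enabled(t) for t in transitions):
        for t in transitions:
            if enabled(t):
                fire(t)
                step += 1
                print(f"step {step}: fired {t['name']}, marking = {places}")
    ```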

  14. A knowledge-based system design/information tool for aircraft flight control systems

    NASA Technical Reports Server (NTRS)

    Mackall, Dale A.; Allen, James G.

    1991-01-01

    Research aircraft have become increasingly dependent on advanced electronic control systems to accomplish program goals. These aircraft are integrating multiple disciplines to improve performance and satisfy research objectives. This integration is being accomplished through electronic control systems. Systems design methods and information management have become essential to program success. The primary objective of the system design/information tool for aircraft flight control is to help transfer flight control system design knowledge to the flight test community. By providing all of the design information and covering multiple disciplines in a structured, graphical manner, flight control systems can more easily be understood by the test engineers. This will provide the engineers with the information needed to thoroughly ground test the system and thereby reduce the likelihood of serious design errors surfacing in flight. The secondary objective is to apply structured design techniques to all of the design domains. By using the techniques in the top-level system design down through the detailed hardware and software designs, it is hoped that fewer design anomalies will result. The flight test experiences of three highly complex, integrated aircraft programs are reviewed: the X-29 forward swept wing; the advanced fighter technology integration (AFTI) F-16; and the highly maneuverable aircraft technology (HiMAT) program. Significant operating anomalies, and the design errors that caused them, are examined to help identify what functions a system design/information tool should provide to assist designers in avoiding errors.

  15. The application of SSADM to modelling the logical structure of proteins.

    PubMed

    Saldanha, J; Eccles, J

    1991-10-01

    A logical design that describes the overall structure of proteins, together with a more detailed design describing secondary and some supersecondary structures, has been constructed using the computer-aided software engineering (CASE) tool, Auto-mate. Auto-mate embodies the philosophy of the Structured Systems Analysis and Design Method (SSADM) which enables the logical design of computer systems. Our design will facilitate the building of large information systems, such as databases and knowledgebases in the field of protein structure, by the derivation of system requirements from our logical model prior to producing the final physical system. In addition, the study has highlighted the ease of employing SSADM as a formalism in which to conduct the transferral of concepts from an expert into a design for a knowledge-based system that can be implemented on a computer (the knowledge-engineering exercise). It has been demonstrated how SSADM techniques may be extended for the purpose of modelling the constituent Prolog rules. This facilitates the integration of the logical system design model with the derived knowledge-based system.

  16. Training mechanical engineering students to utilize biological inspiration during product development.

    PubMed

    Bruck, Hugh A; Gershon, Alan L; Golden, Ira; Gupta, Satyandra K; Gyger, Lawrence S; Magrab, Edward B; Spranklin, Brent W

    2007-12-01

    The use of bio-inspiration for the development of new products and devices requires new educational tools for students consisting of appropriate design and manufacturing technologies, as well as curriculum. At the University of Maryland, new educational tools have been developed that introduce bio-inspired product realization to undergraduate mechanical engineering students. These tools include the development of a bio-inspired design repository, a concurrent fabrication and assembly manufacturing technology, a series of undergraduate curriculum modules and a new senior elective in the bio-inspired robotics area. This paper first presents an overview of the two new design and manufacturing technologies that enable students to realize bio-inspired products, and describes how these technologies are integrated into the undergraduate educational experience. Then, the undergraduate curriculum modules are presented, which provide students with the fundamental design and manufacturing principles needed to support bio-inspired product and device development. Finally, an elective bio-inspired robotics project course is presented, which provides undergraduates with the opportunity to demonstrate the application of the knowledge acquired through the curriculum modules in their senior year using the new design and manufacturing technologies.

  17. The development of a digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey Lindsay

    Instructors in electrical and computer engineering and in computer science have developed innovative methods to teach digital logic circuits. These methods attempt to increase student learning, satisfaction, and retention. Although there are readily accessible and accepted means for measuring satisfaction and retention, there are no widely accepted means for assessing student learning. Rigorous assessment of learning is elusive because differences in topic coverage, curriculum and course goals, and exam content prevent direct comparison of two teaching methods when using tools such as final exam scores or course grades. Because of these difficulties, computing educators have issued a general call for the adoption of assessment tools to critically evaluate and compare the various teaching methods. Science, Technology, Engineering, and Mathematics (STEM) education researchers commonly measure students' conceptual learning to compare how much different pedagogies improve learning. Conceptual knowledge is often preferred because all engineering courses should teach a fundamental set of concepts even if they emphasize design or analysis to different degrees. Increasing conceptual learning is also important, because students who can organize facts and ideas within a consistent conceptual framework are able to learn new information quickly and can apply what they know in new situations. If instructors can accurately assess their students' conceptual knowledge, they can target instructional interventions to remedy common problems. To properly assess conceptual learning, several researchers have developed concept inventories (CIs) for core subjects in engineering sciences. CIs are multiple-choice assessment tools that evaluate how well a student's conceptual framework matches the accepted conceptual framework of a discipline or common faulty conceptual frameworks. We present how we created and evaluated the digital logic concept inventory (DLCI). We used a Delphi process to identify the important and difficult concepts to include on the DLCI. To discover and describe common student misconceptions, we interviewed students who had completed a digital logic course. Students vocalized their thoughts as they solved digital logic problems. We analyzed the interview data using a qualitative grounded theory approach. We have administered the DLCI at several institutions and have checked the validity, reliability, and bias of the DLCI with classical testing theory procedures. These procedures consisted of follow-up interviews with students, analysis of administration results with statistical procedures, and expert feedback. We discuss these results and present the DLCI's potential for providing a meaningful tool for comparing student learning at different institutions.

  18. Object-oriented design tools for supramolecular devices and biomedical nanotechnology.

    PubMed

    Lee, Stephen C; Bhalerao, Khaustaub; Ferrari, Mauro

    2004-05-01

    Nanotechnology provides multifunctional agents for in vivo use that increasingly blur the distinction between pharmaceuticals and medical devices. Realization of such therapeutic nanodevices requires multidisciplinary effort that is difficult for individual device developers to sustain, and identification of appropriate collaborations outside one's own field can itself be challenging. Further, as in vivo nanodevices become increasingly complex, their design will increasingly demand systems-level thinking. Systems engineering tools such as object-oriented analysis, object-oriented design (OOA/D) and the unified modeling language (UML) are applicable to nanodevices built from biological components, help logically manage the knowledge needed to design them, and help identify useful collaborative relationships for device designers. We demonstrate the utility of these systems engineering tools by using them to reverse engineer an existing molecular device (the bacmid molecular cloning system), and illustrate how object-oriented approaches identify fungible components (objects) in nanodevices in a way that facilitates design of families of related devices, rather than single inventions. We also explore the utility of object-oriented approaches for design of another class of therapeutic nanodevices, vaccines. While they are useful for design of current nanodevices, the power of systems design tools for biomedical nanotechnology will become increasingly apparent as the complexity and sophistication of in vivo nanosystems increases. The nested, hierarchical nature of object-oriented approaches allows treatment of devices as objects in higher-order structures, and so will facilitate concatenation of multiple devices into higher-order, higher-function nanosystems.
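
    As a hedged illustration of the object-oriented approach the authors advocate, the sketch below models hypothetical nanodevice components as classes whose composition yields a family of related devices; the class names, attributes, and example components are invented, not taken from the paper.

    ```python
    # Illustrative object-oriented decomposition of a hypothetical therapeutic
    # nanodevice into fungible components. Swapping one component object yields a
    # related device in the same family. All names and attributes are invented.
    from dataclasses import dataclass

    @dataclass
    class TargetingLigand:
        target: str            # e.g. a cell-surface receptor

    @dataclass
    class Payload:
        agent: str             # e.g. a drug or antigen

    @dataclass
    class Carrier:
        material: str          # e.g. a polymer or lipid shell

    @dataclass
    class Nanodevice:
        carrier: Carrier
        ligand: TargetingLigand
        payload: Payload

        def describe(self):
            return (f"{self.carrier.material} carrier, "
                    f"targets {self.ligand.target}, delivers {self.payload.agent}")

    if __name__ == "__main__":
        base = Nanodevice(Carrier("PLGA"), TargetingLigand("folate receptor"), Payload("doxorubicin"))
        # A related family member: same architecture, one swapped component.
        vaccine_like = Nanodevice(base.carrier, base.ligand, Payload("model antigen"))
        print(base.describe())
        print(vaccine_like.describe())
    ```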

  19. Novel Genetic Tools to Accelerate Our Understanding of Photosynthesis and Lipid Accumulation

    DTIC Science & Technology

    2014-08-20

    understanding of photosynthesis and lipid accumulation Martin C. Jonikas, Ph.D. Carnegie Institution for Science, Department of Plant Biology 260... knowledge of algal lipid metabolism and photosynthesis. Advances in our basic understanding of these processes will facilitate genetic engineering of... algae to improve lipid yields. Currently, one of the greatest roadblocks in the study of algal photosynthesis and lipid metabolism is the slow pace of

  20. Key Future Engineering Capabilities for Human Capital Retention

    NASA Astrophysics Data System (ADS)

    Sivich, Lorrie

    Projected record retirements of Baby Boomer generation engineers have been predicted to result in significant losses of mission-critical knowledge in space, national security, and future scientific ventures vital to high-technology corporations. No comprehensive review or analysis of engineering capabilities has been performed to identify threats related to the specific loss of mission-critical knowledge posed by the increasing retirement of tenured engineers. Archival data from a single diversified Fortune 500 aerospace manufacturing engineering company's engineering career database were analyzed to ascertain whether relationships linking future engineering capabilities, engineering disciplines, and years of engineering experience could be identified to define critical knowledge transfer models. Chi square, logistic, and linear regression analyses were used to map patterns of discipline-specific, mission-critical knowledge using archival data of engineers' perceptions of engineering capabilities, key developmental experiences, and knowledge learned from their engineering careers. The results from the study were used to document key engineering future capabilities. The results were then used to develop a proposed human capital retention plan to address specific key knowledge gaps of younger engineers as veteran engineers retire. The potential for social change from this study involves informing leaders of aerospace engineering corporations on how to build better quality mentoring or succession plans to fill the void of lost knowledge from retiring engineers. This plan can secure mission-critical knowledge for younger engineers for current and future product development and increased global competitiveness in the technology market.

  1. Decision support tool for diagnosing the source of variation

    NASA Astrophysics Data System (ADS)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, proper interpretation of CCPs and their associated SOV requires a highly skilled industrial practitioner. Lack of knowledge in process engineering will lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is a tool embedded in a CCP recognition scheme. The design methodology involves analysis of the relationships between geometrical features, manufacturing processes and CCPs. The DST contains information about CCPs, their possible root-cause errors, and descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful to industrial practitioners for effective troubleshooting.
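
    The abstract does not detail the recognition scheme; as a hedged sketch, the fragment below flags two classic Shewhart out-of-control patterns (a point beyond 3-sigma and a run of points on one side of the centre line), the kind of CCP evidence a DST would map to a source of variation such as tool wear. The limits and data are invented.

    ```python
    # Sketch of two simple Shewhart control chart pattern checks that a decision
    # support tool could map to likely sources of variation (e.g. tool wear).
    # Centre line, sigma, and sample data are invented for illustration.

    CENTRE, SIGMA = 10.00, 0.02      # hypothetical process mean and standard deviation

    def beyond_3sigma(x):
        return [i for i, v in enumerate(x) if abs(v - CENTRE) > 3 * SIGMA]

    def run_of_eight(x):
        """Indices ending a run of 8 consecutive points on one side of the centre line."""
        hits, run_sign, run_len = [], 0, 0
        for i, v in enumerate(x):
            sign = 1 if v > CENTRE else -1 if v < CENTRE else 0
            run_len = run_len + 1 if sign == run_sign and sign != 0 else 1
            run_sign = sign
            if run_len >= 8:
                hits.append(i)
        return hits

    if __name__ == "__main__":
        # Gradual upward drift such as tool wear might produce.
        data = [10.00, 10.01, 10.01, 10.02, 10.02, 10.03, 10.03, 10.04, 10.04, 10.09]
        print("points beyond 3-sigma:", beyond_3sigma(data))
        print("runs of 8 on one side:", run_of_eight(data))
    ```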

  2. CHO microRNA engineering is growing up: Recent successes and future challenges☆

    PubMed Central

    Jadhav, Vaibhav; Hackl, Matthias; Druz, Aliaksandr; Shridhar, Smriti; Chung, Cheng-Yu; Heffner, Kelley M.; Kreil, David P.; Betenbaugh, Mike; Shiloach, Joseph; Barron, Niall; Grillari, Johannes; Borth, Nicole

    2013-01-01

    microRNAs with their ability to regulate complex pathways that control cellular behavior and phenotype have been proposed as potential targets for cell engineering in the context of optimization of biopharmaceutical production cell lines, specifically of Chinese Hamster Ovary cells. However, until recently, research was limited by a lack of genomic sequence information on this industrially important cell line. With the publication of the genomic sequence and other relevant data sets for CHO cells since 2011, the doors have been opened for an improved understanding of CHO cell physiology and for the development of the necessary tools for novel engineering strategies. In the present review we discuss both knowledge on the regulatory mechanisms of microRNAs obtained from other biological models and proof of concepts already performed on CHO cells, thus providing an outlook of potential applications of microRNA engineering in production cell lines. PMID:23916872

  3. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Feyock, Stefan; Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    The purpose of this research effort is to investigate the benefits that might be derived from applying artificial intelligence tools in the area of conceptual design. Therefore, the emphasis is on the artificial intelligence aspects of conceptual design rather than the structural and optimization aspects. A prototype knowledge-based system, called STRUTEX, was developed to initially configure a structure to support point loads in two dimensions. This system combines the computer's numerical and symbolic processing with interactive problem solving aided by the user's vision, integrating a knowledge base interface and inference engine, a database interface, and graphics, while keeping the knowledge base and database files separate. The system writes a file which can be input into a structural synthesis system, which combines structural analysis and optimization.

  4. Making Sense of Rocket Science - Building NASA's Knowledge Management Program

    NASA Technical Reports Server (NTRS)

    Holm, Jeanne

    2002-01-01

    The National Aeronautics and Space Administration (NASA) has launched a range of KM activities-from deploying intelligent "know-bots" across millions of electronic sources to ensuring tacit knowledge is transferred across generations. The strategy and implementation focuses on managing NASA's wealth of explicit knowledge, enabling remote collaboration for international teams, and enhancing capture of the key knowledge of the workforce. An in-depth view of the work being done at the Jet Propulsion Laboratory (JPL) shows the integration of academic studies and practical applications to architect, develop, and deploy KM systems in the areas of document management, electronic archives, information lifecycles, authoring environments, enterprise information portals, search engines, experts directories, collaborative tools, and in-process decision capture. These systems, together, comprise JPL's architecture to capture, organize, store, and distribute key learnings for the U.S. exploration of space.

  5. The extracellular matrix: Structure, composition, age-related differences, tools for analysis and applications for tissue engineering.

    PubMed

    Kular, Jaspreet K; Basu, Shouvik; Sharma, Ram I

    2014-01-01

    The extracellular matrix is a structural support network made up of diverse proteins, sugars and other components. It influences a wide range of cellular processes including migration, wound healing and differentiation, all of which are of particular interest to researchers in the field of tissue engineering. Understanding the composition and structure of the extracellular matrix will aid in exploring the ways the extracellular matrix can be utilised in tissue engineering applications, especially as a scaffold. This review summarises the current knowledge of the composition, structure and functions of the extracellular matrix and introduces the effect of ageing on extracellular matrix remodelling and its contribution to cellular functions. Additionally, the current analytical technologies to study the extracellular matrix and extracellular matrix-related cellular processes are also reviewed.

  6. Distinctions between intelligent manufactured and constructed systems and a new discipline for intelligent infrastructure hypersystems

    NASA Astrophysics Data System (ADS)

    Aktan, A. Emin

    2003-08-01

    Although the interconnected systems nature of the infrastructures, and the complexity of interactions between their engineered, socio-technical and natural constituents have been recognized for some time, the principles of effectively operating, protecting and preserving such systems by taking full advantage of "modeling, simulations, optimization, control and decision making" tools developed by the systems engineering and operations research community have not been adequately studied or discussed by many engineers including the writer. Differential and linear equation systems, numerical and finite element modeling techniques, statistical and probabilistic representations are universal, however, different disciplines have developed their distinct approaches to conceptualizing, idealizing and modeling the systems they commonly deal with. The challenge is in adapting and integrating deterministic and stochastic, geometric and numerical, physics-based and "soft (data-or-knowledge based)", macroscopic or microscopic models developed by various disciplines for simulating infrastructure systems. There is a lot to be learned by studying how different disciplines have studied, improved and optimized the systems relating to various processes and products in their domains. Operations research has become a fifty-year old discipline addressing complex systems problems. Its mathematical tools range from linear programming to decision processes and game theory. These tools are used extensively in management and finance, as well as by industrial engineers for optimizing and quality control. Progressive civil engineering academic programs have adopted "systems engineering" as a focal area. However, most of the civil engineering systems programs remain focused on constructing and analyzing highly idealized, often generic models relating to the planning or operation of transportation, water or waste systems, maintenance management, waste management or general infrastructure hazards risk management. We further note that in the last decade there have been efforts for "agent-based" modeling of synthetic infrastructure systems by taking advantage of supercomputers at various DOE Laboratories. However, whether there is any similitude between such synthetic and actual systems needs investigating further.

  7. Rocksalt nitride metal/semiconductor superlattices: A new class of artificially structured materials

    NASA Astrophysics Data System (ADS)

    Saha, Bivas; Shakouri, Ali; Sands, Timothy D.

    2018-06-01

    Artificially structured materials in the form of superlattice heterostructures enable the search for exotic new physics and novel device functionalities, and serve as tools to push the fundamentals of scientific and engineering knowledge. Semiconductor heterostructures are the most celebrated and widely studied artificially structured materials, having led to the development of quantum well lasers, quantum cascade lasers, measurements of the fractional quantum Hall effect, and numerous other scientific concepts and practical device technologies. However, combining metals with semiconductors at the atomic scale to develop metal/semiconductor superlattices and heterostructures has remained a profoundly difficult scientific and engineering challenge. Though the potential applications of metal/semiconductor heterostructures could range from energy conversion to photonic computing to high-temperature electronics, materials challenges primarily had severely limited progress in this pursuit until very recently. In this article, we detail the progress that has taken place over the last decade to overcome the materials engineering challenges to grow high quality epitaxial, nominally single crystalline metal/semiconductor superlattices based on transition metal nitrides (TMN). The epitaxial rocksalt TiN/(Al,Sc)N metamaterials are the first pseudomorphic metal/semiconductor superlattices to the best of our knowledge, and their physical properties promise a new era in superlattice physics and device engineering.

  8. Model-based diagnostics for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.

    1991-01-01

    An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
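
    Constraint suspension, the model-based technique mentioned above, can be illustrated with a small sketch: each component's expected behavior is a constraint, and suspending one component at a time reveals which suspension makes the model consistent with the observations. The three-component power-path example below is invented for illustration and is not the testbed model.

    ```python
    # Minimal constraint-suspension sketch for model-based diagnosis. Each
    # component contributes a constraint relating observed quantities; a component
    # is a fault candidate if suspending its constraint makes the remaining model
    # consistent with the observations. The toy power-string model is invented.

    def consistent(obs, suspended=None):
        checks = {
            "source":    obs["bus_v"] == obs["source_v"],          # bus carries source voltage
            "regulator": obs["reg_v"] == min(obs["bus_v"], 28.0),  # regulator clamps at 28 V
            "load":      obs["load_ok"] == (obs["reg_v"] > 24.0),  # load needs > 24 V
        }
        return all(ok for name, ok in checks.items() if name != suspended)

    def diagnose(obs):
        if consistent(obs):
            return []                                              # no fault detected
        return [c for c in ("source", "regulator", "load") if consistent(obs, suspended=c)]

    if __name__ == "__main__":
        # Regulator output is low and the load is off: suspending "regulator" explains it.
        print(diagnose({"source_v": 32.0, "bus_v": 32.0, "reg_v": 20.0, "load_ok": False}))
    ```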

  9. Developing an Environmental Decision Support System for Stream Management: the STREAMES Experience

    NASA Astrophysics Data System (ADS)

    Riera, J.; Argerich, A.; Comas, J.; Llorens, E.; Martí, E.; Godé, L.; Pargament, D.; Puig, M.; Sabater, F.

    2005-05-01

    Transferring research knowledge to stream managers is crucial for scientifically sound management. Environmental decision support systems are advocated as an effective means to accomplish this. STREAMES (STream REAach Management: an Expert System) is a decision-tree-based EDSS prototype developed within the context of a European project as a tool to assist water managers in the diagnosis of problems, detection of causes, and selection of management strategies for coping with stream degradation issues related mostly to excess nutrient availability. STREAMES was developed by a team of scientists, water managers, and experts in knowledge engineering. Although the tool focuses on management at the stream reach scale, it also incorporates a mass-balance catchment nutrient emission model and a simple GIS module. We will briefly present the prototype and share our experience in its development. Emphasis will be placed on the process of knowledge acquisition, the design process, the pitfalls and benefits of the communication between scientists and managers, and the potential for future development of STREAMES, particularly in the context of the EU Water Framework Directive.

  10. Electrical stimulation: a novel tool for tissue engineering.

    PubMed

    Balint, Richard; Cassidy, Nigel J; Cartmell, Sarah H

    2013-02-01

    New advances in tissue engineering are being made through the application of different types of electrical stimuli to influence cell proliferation and differentiation. Developments made in the last decade have allowed us to improve the structure and functionality of tissue-engineered products through the use of growth factors, hormones, drugs, physical stimuli, bioreactor use, and two-dimensional (2-D) and three-dimensional (3-D) artificial extracellular matrices (with various material properties and topography). Another potential type of stimulus is electricity, which is important in the physiology and development of the majority of all human tissues. Despite its great potential, its role in tissue regeneration and its ability to influence cell migration, orientation, proliferation, and differentiation has rarely been considered in tissue engineering. This review highlights the importance of endogenous electrical stimulation, gathering the current knowledge on its natural occurrence and role in vivo, discussing the novel methods of delivering this stimulus and examining its cellular and tissue level effects, while evaluating how the technique could benefit the tissue engineering discipline in the future.

  11. A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes

    NASA Technical Reports Server (NTRS)

    Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw

    2004-01-01

    There are continuous needs for engineering organizations to improve their design process. Current state-of-the-art techniques use computational simulations to predict design performance and optimize it through advanced design methods. These tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at the organization level, beyond the individual level. The next set of gains in process improvement will come from improving the effective use of computers and software within a whole organization, not just for an individual. The architecture takes advantage of state-of-the-art capabilities to produce a Web-based system to carry engineering design into the future. To illustrate deployment of the architecture, a case study for implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. Another example, rolling out a design process for Design for Six Sigma, is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.

  12. Methodology to build medical ontology from textual resources.

    PubMed

    Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine

    2006-01-01

    In the medical field, it is now established that the maintenance of unambiguous thesauri goes through ontologies. Our research task is to help pneumologists code acts and diagnoses with software that represents medical knowledge through a domain ontology. In this paper, we describe our general methodology, aimed at knowledge engineers, for building various types of medical ontologies based on terminology extraction from texts. The hypothesis is to apply natural language processing tools to textual patient discharge summaries to develop the resources needed to build an ontology in pneumology. Results indicate that the joint use of distributional analysis and lexico-syntactic patterns performed satisfactorily for building such ontologies.
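
    The lexico-syntactic patterns mentioned above can be illustrated with a minimal sketch: a Hearst-style regular expression that proposes hypernym/hyponym candidate pairs from free text. The pattern and the example sentence are illustrative only and are not the project's actual resources.

    ```python
    # Minimal Hearst-style lexico-syntactic pattern: "X such as Y1, Y2 and Y3"
    # suggests that Y1..Y3 are candidate subclasses of X for an ontology draft.
    # The regular expression and example sentence are illustrative only.
    import re

    PATTERN = re.compile(
        r"(?P<hypernym>\w+(?:\s+\w+){0,2})\s+such as\s+(?P<hyponyms>[\w\s,-]+)",
        re.IGNORECASE,
    )

    def extract_pairs(text):
        pairs = []
        for m in PATTERN.finditer(text):
            hypernym = m.group("hypernym").strip()
            hyponyms = re.split(r",\s*|\s+and\s+", m.group("hyponyms"))
            pairs.extend((hypernym, h.strip()) for h in hyponyms if h.strip())
        return pairs

    if __name__ == "__main__":
        sentence = "Obstructive diseases such as asthma, emphysema and chronic bronchitis."
        print(extract_pairs(sentence))
    ```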

  13. Space Weather Monitoring for ISS Space Environments Engineering and Crew Auroral Observations

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Pettit, Donald R.; Hartman, William A.

    2012-01-01

    The awareness of potentially significant impacts of space weather on space- and ground-based technological systems has generated a strong desire in many sectors of government and industry to effectively transform knowledge and understanding of the variable space environment into useful tools and applications for use by those entities responsible for systems that may be vulnerable to space weather impacts. Essentially, effectively transitioning science knowledge to useful applications relevant to space weather has become important. This talk will present proven methodologies that have been demonstrated to be effective, and how in the current environment those can be applied to space weather transition efforts.

  14. Search engine as a diagnostic tool in difficult immunological and allergologic cases: is Google useful?

    PubMed

    Lombardi, C; Griffiths, E; McLeod, B; Caviglia, A; Penagos, M

    2009-07-01

    Web search engines are an important tool in communication and diffusion of knowledge. Among these, Google appears to be the most popular one: in August 2008, it accounted for 87% of all web searches in the UK, compared with Yahoo's 3.3%. Google's value as a diagnostic guide in general medicine was recently reported. The aim of this comparative cross-sectional study was to evaluate whether searching Google with disease-related terms was effective in the identification and diagnosis of complex immunological and allergic cases. Forty-five case reports were randomly selected by an independent observer from peer-reviewed medical journals. Clinical data were presented separately to three investigators, blinded to the final diagnoses. Investigator A was a Consultant with an expert knowledge in Internal Medicine and Allergy (IM&A) and basic computing skills. Investigator B was a Registrar in IM&A. Investigator C was a Research Nurse. Both Investigators B and C were familiar with computers and search engines. For every clinical case presented, each investigator independently carried out an Internet search using Google to provide a final diagnosis. Their results were then compared with the published diagnoses. Correct diagnoses were provided in 30/45 (66%) cases, 39/45 (86%) cases, and in 29/45 (64%) cases by investigator A, B, and C, respectively. All of the three investigators achieved the correct diagnosis in 19 cases (42%), and all of them failed in two cases. This Google-based search was useful to identify an appropriate diagnosis in complex immunological and allergic cases. Computing skills may help to get better results.

  15. Knowledge Engineering and Education.

    ERIC Educational Resources Information Center

    Lopez, Antonio M., Jr.; Donlon, James

    2001-01-01

    Discusses knowledge engineering, computer software, and possible applications in the field of education. Highlights include the distinctions between data, information, and knowledge; knowledge engineering as a subfield of artificial intelligence; knowledge acquisition; data mining; ontology development for subject terms; cognitive apprentices; and…

  16. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
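
    The kind of rule-base screening described above can be sketched cheaply: the fragment below flags rules whose conditions can never be satisfied by the declared fact vocabulary and pairs of rules that fire on the same conditions with different conclusions (potential conflicts). This is only a hedged illustration of the general idea, not the cited tool's formal technique; the vocabulary and rules are invented.

    ```python
    # Small static checks over a rule base: (1) rules that mention facts outside the
    # declared vocabulary can never fire; (2) rules with identical conditions but
    # different conclusions are potential conflicts. Rules are invented examples.

    VOCABULARY = {"valve_open", "pressure_high", "pump_on"}

    RULES = [
        ("r1", {"valve_open", "pressure_high"}, "reduce_flow"),
        ("r2", {"valve_open", "pressure_hgh"}, "open_relief"),     # typo: never fires
        ("r3", {"valve_open", "pressure_high"}, "increase_flow"),  # conflicts with r1
    ]

    def never_fires(rules, vocabulary):
        return [name for name, cond, _ in rules if not cond <= vocabulary]

    def conflicting(rules):
        hits = []
        for i, (n1, c1, a1) in enumerate(rules):
            for n2, c2, a2 in rules[i + 1:]:
                if c1 == c2 and a1 != a2:
                    hits.append((n1, n2))
        return hits

    if __name__ == "__main__":
        print("rules that can never fire:", never_fires(RULES, VOCABULARY))
        print("potentially conflicting rules:", conflicting(RULES))
    ```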

  17. Enterprise resource planning (ERP) implementation using the value engineering methodology and Six Sigma tools

    NASA Astrophysics Data System (ADS)

    Leu, Jun-Der; Lee, Larry Jung-Hsing

    2017-09-01

    Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.

  18. Institutional Case-Based Study on the Effect of Research Methods on Project Work in the Curriculum of Mechanical Engineering Programmes in Ghanaian Polytechnics

    ERIC Educational Resources Information Center

    Baffour-Awuah, Emmanuel

    2015-01-01

    Preparing students for Project Work (PROJ 1 and PROJ 2) require them to go through Research Methods (RE) as part of the curriculum though it takes the centre stage of the entire preparation process. Knowledge of the relationships between the two could be a useful tool in improving the performance of students in the former. The purpose of the case…

  19. Human Systems Integration Competency Development for Navy Systems Commands

    DTIC Science & Technology

    2012-09-01

    cognizance of Applied Engineering/Psychology relative to knowledge engineering, training, teamwork, user interface design and decision sciences. KSA... requirements (as required).

  20. Data Mining and Knowledge Discovery tools for exploiting big Earth-Observation data

    NASA Astrophysics Data System (ADS)

    Espinoza Molina, D.; Datcu, M.

    2015-04-01

    The continuous increase in the size of the archives and in the variety and complexity of Earth-Observation (EO) sensors require new methodologies and tools that allow the end-user to access a large image repository, to extract and to infer knowledge about the patterns hidden in the images, to retrieve dynamically a collection of relevant images, and to support the creation of emerging applications (e.g.: change detection, global monitoring, disaster and risk management, image time series, etc.). In this context, we are concerned with providing a platform for data mining and knowledge discovery content from EO archives. The platform's goal is to implement a communication channel between Payload Ground Segments and the end-user who receives the content of the data coded in an understandable format associated with semantics that is ready for immediate exploitation. It will provide the user with automated tools to explore and understand the content of highly complex images archives. The challenge lies in the extraction of meaningful information and understanding observations of large extended areas, over long periods of time, with a broad variety of EO imaging sensors in synergy with other related measurements and data. The platform is composed of several components such as 1.) ingestion of EO images and related data providing basic features for image analysis, 2.) query engine based on metadata, semantics and image content, 3.) data mining and knowledge discovery tools for supporting the interpretation and understanding of image content, 4.) semantic definition of the image content via machine learning methods. All these components are integrated and supported by a relational database management system, ensuring the integrity and consistency of Terabytes of Earth Observation data.
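
    As a hedged sketch of the metadata-based query engine component backed by a relational database, the fragment below stores a few invented EO scene records in SQLite and retrieves those matching a sensor and cloud-cover constraint; the schema, sensor names, and values are illustrative only.

    ```python
    # Illustrative metadata query over a tiny invented EO catalogue: the kind of
    # relational backbone the platform describes, reduced to one table and one
    # query. Schema, sensor names, and values are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE scenes (
                        scene_id TEXT, sensor TEXT, acquired TEXT, cloud_cover REAL)""")
    conn.executemany(
        "INSERT INTO scenes VALUES (?, ?, ?, ?)",
        [
            ("S1", "SAR-X", "2014-05-01", 0.0),
            ("S2", "OPT-A", "2014-05-02", 12.5),
            ("S3", "OPT-A", "2014-06-10", 78.0),
        ],
    )

    rows = conn.execute(
        "SELECT scene_id, acquired FROM scenes WHERE sensor = ? AND cloud_cover < ?",
        ("OPT-A", 20.0),
    ).fetchall()
    print(rows)    # -> [('S2', '2014-05-02')]
    ```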

  1. Exploring Mission Concepts with the JPL Innovation Foundry A-Team

    NASA Technical Reports Server (NTRS)

    Ziemer, John K.; Ervin, Joan; Lang, Jared

    2013-01-01

    The JPL Innovation Foundry has established a new approach for exploring, developing, and evaluating early concepts called the A-Team. The A-Team combines innovative collaborative methods with subject matter expertise and analysis tools to help mature mission concepts. Science, implementation, and programmatic elements are all considered during an A-Team study. Methods are grouped by Concept Maturity Level (CML), from 1 through 3, including idea generation and capture (CML 1), initial feasibility assessment (CML 2), and trade space exploration (CML 3). Methods used for each CML are presented, and the key team roles are described from two points of view: innovative methods and technical expertise. A-Team roles for providing innovative methods include the facilitator, study lead, and assistant study lead. A-Team roles for providing technical expertise include the architect, lead systems engineer, and integration engineer. In addition to these key roles, each A-Team study is uniquely staffed to match the study topic and scope including subject matter experts, scientists, technologists, flight and instrument systems engineers, and program managers as needed. Advanced analysis and collaborative engineering tools (e.g. cost, science traceability, mission design, knowledge capture, study and analysis support infrastructure) are also under development for use in A-Team studies and will be discussed briefly. The A-Team facilities provide a constructive environment for innovative ideas from all aspects of mission formulation to eliminate isolated studies and come together early in the development cycle when they can provide the biggest impact. This paper provides an overview of the A-Team, its study processes, roles, methods, tools and facilities.

  2. a Conceptual Framework for Virtual Geographic Environments Knowledge Engineering

    NASA Astrophysics Data System (ADS)

    You, Lan; Lin, Hui

    2016-06-01

    VGE geographic knowledge refers to the abstract and repeatable geo-information that is related to geo-science problems, geographical phenomena and geographical laws supported by VGE. This includes expert experiences, evolution rules, simulation processes and prediction results in VGE. This paper proposes a conceptual framework for VGE knowledge engineering in order to effectively manage and use geographic knowledge in VGE. Our approach relies on previously well-established theories of knowledge engineering and VGE. The main contributions of this paper are the following: (1) clear definitions of the concepts of VGE knowledge and VGE knowledge engineering; (2) features that distinguish VGE knowledge from common knowledge; (3) a geographic knowledge evolution process that helps users rapidly acquire knowledge in VGE; and (4) a conceptual framework for VGE knowledge engineering that provides a supporting methodological system for building an intelligent VGE. This conceptual framework systematically describes the related VGE knowledge theories and key technologies. This will promote the rapid transformation from geodata to geographic knowledge, and further reduce the gap between data explosion and knowledge absence.

  3. Engineering and evaluating drug delivery particles in microfluidic devices.

    PubMed

    Björnmalm, Mattias; Yan, Yan; Caruso, Frank

    2014-09-28

    The development of new and improved particle-based drug delivery is underpinned by an enhanced ability to engineer particles with high fidelity and integrity, as well as increased knowledge of their biological performance. Microfluidics can facilitate these processes through the engineering of spatiotemporally highly controlled environments using designed microstructures in combination with physical phenomena present at the microscale. In this review, we discuss microfluidics in the context of addressing key challenges in particle-based drug delivery. We provide an overview of how microfluidic devices can: (i) be employed to engineer particles, by providing highly controlled interfaces, and (ii) be used to establish dynamic in vitro models that mimic in vivo environments for studying the biological behavior of engineered particles. Finally, we discuss how the flexible and modular nature of microfluidic devices provides opportunities to create increasingly realistic models of the in vivo milieu (including multi-cell, multi-tissue and even multi-organ devices), and how ongoing developments toward commercialization of microfluidic tools are opening up new opportunities for the engineering and evaluation of drug delivery particles. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Didactic tools for understanding respiratory physiology

    NASA Astrophysics Data System (ADS)

    Donnelly Kehoe, P.; Bratovich, C.; Perrone, Ms; Mendez Castells, L.

    2007-11-01

    The challenges in Bioengineering are not only the application of engineering knowledge to the measurement of physiological variables, but also the simulation of biological systems. Experience has shown that the physiology of the respiratory system involves a set of concepts that cannot be effectively taught without the help of a group of didactic tools that contribute to the measurement of characteristic specific variables and to the simulation of the system itself. This article describes a series of tools designed to optimize the teaching of the respiratory system, including the use of spirometers and software developed entirely by undergraduate Bioengineering students from Universidad Nacional de Entre Rios (UNER). The impact these resources have had on the understanding of the topic, and how each of them has facilitated the students' interpretation of the concepts, is also discussed.
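
    The abstract does not describe the software internals; as a hedged illustration of the kind of computation such a didactic spirometry tool might perform, the sketch below integrates a sampled expiratory flow signal to obtain volume and reports FVC and FEV1. The sampling rate and flow values are invented.

    ```python
    # Illustrative spirometry calculation: integrate a sampled expiratory flow
    # signal (L/s) to volume (L), then report FVC and FEV1. The flow trace and
    # sampling rate are invented; real teaching software would use measured data.

    SAMPLE_RATE = 10                      # samples per second (hypothetical)
    flow = [0.0, 3.5, 4.0, 3.2, 2.5, 2.0, 1.6, 1.2, 1.0, 0.8,    # first second
            0.6, 0.5, 0.4, 0.3, 0.2, 0.15, 0.1, 0.05, 0.0, 0.0]  # second second

    dt = 1.0 / SAMPLE_RATE
    volume, total = [], 0.0
    for f in flow:                        # simple rectangle-rule integration of flow
        total += f * dt
        volume.append(total)

    fvc = volume[-1]                      # forced vital capacity: total exhaled volume
    fev1 = volume[SAMPLE_RATE - 1]        # volume exhaled during the first second
    print(f"FVC  = {fvc:.2f} L")
    print(f"FEV1 = {fev1:.2f} L  (FEV1/FVC = {fev1 / fvc:.0%})")
    ```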

  5. Protein Design for Nanostructural Engineering: General Aspects.

    PubMed

    Grove, Tijana Z; Cortajarena, Aitziber L

    2016-01-01

    This chapter aims to introduce the main challenges in the field of protein design for engineering of nanostructures and functional materials. First, we introduce proteins and illustrate the key characteristics that open many possibilities for the use of proteins in nanotechnology. Then, we describe the current state of the art of nanopatterning techniques and the actual needs of the emerging field of nanotechnology to develop new tools in order to achieve precise control and manipulation of elements at the nanoscale. In this sense, the increasing knowledge of protein science and advances in protein design allow to tackle current challenges such as the design of nanodevices, nanopatterned surfaces, and nanomachines. This book highlights the recent progresses of protein nanotechnology over the last decade and emphasizes the power of protein engineering through illustrative examples of protein based-assemblies and their potential applications.

  6. Nanoscale Engineering of Designer Cellulosomes.

    PubMed

    Gunnoo, Melissabye; Cazade, Pierre-André; Galera-Prat, Albert; Nash, Michael A; Czjzek, Mirjam; Cieplak, Marek; Alvarez, Beatriz; Aguilar, Marina; Karpol, Alon; Gaub, Hermann; Carrión-Vázquez, Mariano; Bayer, Edward A; Thompson, Damien

    2016-07-01

    Biocatalysts showcase the upper limit obtainable for high-speed molecular processing and transformation. Efforts to engineer functionality in synthetic nanostructured materials are guided by the increasing knowledge of evolving architectures, which enable controlled molecular motion and precise molecular recognition. The cellulosome is a biological nanomachine, which, as a fundamental component of the plant-digestion machinery from bacterial cells, has a key potential role in the successful development of environmentally-friendly processes to produce biofuels and fine chemicals from the breakdown of biomass waste. Here, the progress toward so-called "designer cellulosomes", which provide an elegant alternative to enzyme cocktails for lignocellulose breakdown, is reviewed. Particular attention is paid to rational design via computational modeling coupled with nanoscale characterization and engineering tools. Remaining challenges and potential routes to industrial application are put forward. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  8. Avoiding Human Error in Mission Operations: Cassini Flight Experience

    NASA Technical Reports Server (NTRS)

    Burk, Thomas A.

    2012-01-01

    Operating spacecraft is a never-ending challenge and the risk of human error is ever-present. Many missions have been significantly affected by human error on the part of ground controllers. The Cassini mission at Saturn has not been immune to human error, but Cassini operations engineers use tools and follow processes that find and correct most human errors before they reach the spacecraft. What is needed are skilled engineers with good technical knowledge, good interpersonal communications, quality ground software, regular peer reviews, up-to-date procedures, as well as careful attention to detail and the discipline to test and verify all commands that will be sent to the spacecraft. Two areas of special concern are changes to flight software and response to in-flight anomalies. The Cassini team has a lot of practical experience in all these areas and they have found that well-trained engineers with good tools who follow clear procedures can catch most errors before they get into command sequences to be sent to the spacecraft. Finally, having a robust and fault-tolerant spacecraft that allows ground controllers excellent visibility of its condition is the most important way to ensure human error does not compromise the mission.

  9. Portable inference engine: An extended CLIPS for real-time production systems

    NASA Technical Reports Server (NTRS)

    Le, Thach; Homeier, Peter

    1988-01-01

    The present C-Language Integrated Production System (CLIPS) architecture has not been optimized to deal with the constraints of real-time production systems. Matching in CLIPS is based on the Rete Net algorithm, whose assumption of working memory stability might fail to be satisfied in a system subject to real-time dataflow. Further, the CLIPS forward-chaining control mechanism with a predefined conflict resolution strategy may not effectively focus the system's attention on situation-dependent current priorities, or appropriately address different kinds of knowledge which might appear in a given application. Portable Inference Engine (PIE) is a production system architecture based on CLIPS which attempts to create a more general tool while addressing the problems of real-time expert systems. Features of the PIE design include a modular knowledge base, a modified Rete Net algorithm, a bi-directional control strategy, and multiple user-defined conflict resolution strategies. Problems associated with real-time applications are analyzed and an explanation is given for how the PIE architecture addresses these problems.
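
    As an illustration only (PIE's actual implementation is not shown in this record), the following Python sketch conveys the general idea of a forward-chaining production system with a pluggable, user-defined conflict-resolution strategy; all rule names and facts are hypothetical.

    ```python
    # Minimal forward-chaining production system with a pluggable
    # conflict-resolution strategy; names and structure are hypothetical.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        name: str
        condition: Callable[[set], bool]   # tests the working memory
        action: Callable[[set], None]      # may add facts to working memory
        priority: int = 0                  # used by priority-based strategies

    def highest_priority_first(conflict_set):
        """Conflict resolution: fire the highest-priority matching rule."""
        return max(conflict_set, key=lambda r: r.priority)

    def run(rules, working_memory, resolve=highest_priority_first, max_cycles=20):
        for _ in range(max_cycles):
            conflict_set = [r for r in rules if r.condition(working_memory)]
            if not conflict_set:
                break
            rule = resolve(conflict_set)                 # user-supplied strategy decides
            rule.action(working_memory)
            rules = [r for r in rules if r is not rule]  # refraction: each rule fires once

    wm = {"sensor:temperature-high"}
    rules = [
        Rule("raise-alarm",
             condition=lambda wm: "sensor:temperature-high" in wm,
             action=lambda wm: wm.add("alarm:overheat"),
             priority=10),
        Rule("log-reading",
             condition=lambda wm: "sensor:temperature-high" in wm,
             action=lambda wm: wm.add("log:reading-recorded"),
             priority=1),
    ]
    run(rules, wm)
    print(wm)   # the high-priority alarm rule fires before the logging rule
    ```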

  10. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.
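
    The sketch below is a toy illustration of heuristic, pattern-driven extraction of rule-like assertions from free text with an optional topic filter, in the spirit of the extraction process described above; the patterns and sample text are invented and are not KAM's actual heuristics.

    ```python
    # Toy heuristic library: each entry maps a text pattern to a kind of extract.
    import re

    HEURISTICS = [
        # "If X, then Y."  ->  a rule-like extract
        (re.compile(r"if (.+?),? then (.+?)\.", re.I), "rule"),
        # "X causes Y."    ->  a causal relation
        (re.compile(r"(\w[\w\s]*?) causes (\w[\w\s]*?)\.", re.I), "cause"),
    ]

    def extract(text, topic=None):
        """Return (kind, groups) tuples; an optional topic keyword filters extracts."""
        results = []
        for pattern, kind in HEURISTICS:
            for match in pattern.finditer(text):
                if topic and topic.lower() not in match.group(0).lower():
                    continue
                results.append((kind, match.groups()))
        return results

    sample = ("If the pump pressure drops, then the backup valve opens. "
              "Vibration causes bearing wear.")
    for kind, args in extract(sample):
        print(kind, args)
    ```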

  11. Impacts of Permafrost on Infrastructure and Ecosystem Services

    NASA Astrophysics Data System (ADS)

    Trochim, E.; Schuur, E.; Schaedel, C.; Kelly, B. P.

    2017-12-01

    The Study of Environmental Arctic Change (SEARCH) program developed knowledge pyramids as a tool for advancing scientific understanding and making this information accessible for decision makers. Knowledge pyramids are being used to synthesize, curate and disseminate knowledge of changing land ice, sea ice, and permafrost in the Arctic. Each pyramid consists of a one- to two-page summary brief in broadly accessible language and literature organized by levels of detail, including syntheses and scientific building blocks. Three knowledge pyramids have been produced related to permafrost, covering carbon, infrastructure, and ecosystem services. Each brief answers key questions with high societal relevance framed in policy-relevant terms. The knowledge pyramids concerning infrastructure and ecosystem services were developed in collaboration with researchers specializing in the specific topic areas in order to identify the most pertinent issues and accurately communicate information for integration into policy and planning. For infrastructure, the main issue was the need to build consensus in the engineering and science communities for developing improved methods for incorporating data applicable to building infrastructure on permafrost. In ecosystem services, permafrost provides critical landscape properties which affect basic human needs including fuel and drinking water availability, access to hunting and harvest, and fish and wildlife habitat. Translating these broad and complex topics necessitated a systematic and iterative approach to identifying key issues and relating them succinctly to the best state-of-the-art research. The development of the knowledge pyramids fostered collaboration and synthesis across distinct research and engineering communities. The knowledge pyramids also provide a solid basis for policy development, and the format allows the content to be regularly updated as the research community advances.

  12. A novel knowledge-based system for interpreting complex engineering drawings: theory, representation, and implementation.

    PubMed

    Lu, Tong; Tai, Chiew-Lan; Yang, Huafei; Cai, Shijie

    2009-08-01

    We present a novel knowledge-based system to automatically convert real-life engineering drawings to content-oriented high-level descriptions. The proposed method divides the complex interpretation process into two parts: knowledge representation and knowledge-based interpretation. We propose a new hierarchical descriptor-based knowledge representation method to organize the various types of engineering objects and their complex high-level relations. The descriptors are defined using an Extended Backus-Naur Form (EBNF), facilitating modification and maintenance. When interpreting a set of related engineering drawings, the knowledge-based interpretation system first constructs an EBNF-tree from the knowledge representation file, then searches for potential engineering objects guided by a depth-first order of the nodes in the EBNF-tree. Experimental results and comparisons with other interpretation systems demonstrate that our knowledge-based system is accurate and robust for high-level interpretation of complex real-life engineering projects.
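
    The paper's EBNF grammar is not reproduced in this record; as a rough sketch of descriptor-guided interpretation, the following fragment represents engineering objects as hierarchical descriptors and checks for them in depth-first order over recognized primitives. The descriptors and the primitive list are hypothetical.

    ```python
    # Descriptor-tree-guided interpretation sketch: high-level objects are
    # defined in terms of sub-objects or graphic primitives and searched for
    # depth-first, loosely mirroring the EBNF-tree idea described above.
    DESCRIPTORS = {
        # object        : required sub-objects or primitives (hypothetical)
        "dimension":      ["arrowhead", "line", "arrowhead", "text"],
        "bolted_joint":   ["circle", "circle", "hatching"],
    }

    def find_object(name, primitives):
        """Depth-first check whether an object's descriptor is satisfied
        by the primitives recognized in the drawing."""
        parts = DESCRIPTORS.get(name)
        if parts is None:                       # leaf: a raw graphic primitive
            return primitives.count(name) > 0
        remaining = list(primitives)
        for part in parts:                      # depth-first over sub-parts
            if part in DESCRIPTORS:
                if not find_object(part, remaining):
                    return False
            elif part in remaining:
                remaining.remove(part)
            else:
                return False
        return True

    # Primitives that a (hypothetical) low-level vectorizer extracted:
    drawing = ["line", "arrowhead", "arrowhead", "text", "circle"]
    print(find_object("dimension", drawing))     # True
    print(find_object("bolted_joint", drawing))  # False
    ```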

  13. Two implementations of the Expert System for the Flight Analysis System (ESFAS) project

    NASA Technical Reports Server (NTRS)

    Wang, Lui

    1988-01-01

    A comparison is made between the two most sophisticated expert system building tools, the Automated Reasoning Tool (ART) and the Knowledge Engineering Environment (KEE). The same problem domain (ESFAS) was used in making the comparison. The Expert System for the Flight Analysis System (ESFAS) acts as an intelligent front end for the Flight Analysis System (FAS). FAS is a complex, configuration-controlled set of interrelated processors (FORTRAN routines) which will be used by the Mission Planning and Analysis Div. (MPAD) to design and analyze Shuttle and potential Space Station missions. Implementations of ESFAS are described. The two versions represent very different programming paradigms; ART uses rules and KEE uses objects. Due to the tools' philosophical differences, KEE is implemented using a depth-first traversal algorithm, whereas ART uses a user-directed traversal method. Either tool could be used to solve this particular problem.

  14. Process for selecting engineering tools : applied to selecting a SysML tool.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.

  15. A Search Engine That's Aware of Your Needs

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Internet research can be compared to trying to drink from a firehose. Such a wealth of information is available that even the simplest inquiry can sometimes generate tens of thousands of leads, more information than most people can handle, and more burdensome than most can endure. Like everyone else, NASA scientists rely on the Internet as a primary search tool. Unlike the average user, though, NASA scientists perform some pretty sophisticated, involved research. To help manage the Internet and to allow researchers at NASA to gain better, more efficient access to the wealth of information, the Agency needed a search tool that was more refined and intelligent than the typical search engine. Partnership: NASA funded Stottler Henke, Inc., of San Mateo, California, a cutting-edge software company, with a Small Business Innovation Research (SBIR) contract to develop the Aware software for searching through the vast stores of knowledge quickly and efficiently. The partnership was through NASA's Ames Research Center.

  16. Guidance, navigation, and control subsystem equipment selection algorithm using expert system methods

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1991-01-01

    Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than that which is currently achievable through the use of a standard database system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are presented, the details of the design are characterized, and an example of its implementation is demonstrated.

  17. Multi-Stage Hybrid Rocket Conceptual Design for Micro-Satellites Launch using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Kitagawa, Yosuke; Kitagawa, Koki; Nakamiya, Masaki; Kanazaki, Masahiro; Shimada, Toru

    The multi-objective genetic algorithm (MOGA) is applied to the multi-disciplinary conceptual design problem for a three-stage launch vehicle (LV) with a hybrid rocket engine (HRE). MOGA is an optimization tool used for multi-objective problems. The parallel coordinate plot (PCP), which is a data mining method, is employed in the post-process in MOGA for design knowledge discovery. A rocket that can deliver observing micro-satellites to the sun-synchronous orbit (SSO) is designed. It consists of an oxidizer tank containing liquid oxidizer, a combustion chamber containing solid fuel, a pressurizing tank and a nozzle. The objective functions considered in this study are to minimize the total mass of the rocket and to maximize the ratio of the payload mass to the total mass. To calculate the thrust and the engine size, the regression rate is estimated based on an empirical model for a paraffin (FT-0070) propellant. Several non-dominated solutions are obtained using MOGA, and design knowledge is discovered for the present hybrid rocket design problem using a PCP analysis. As a result, substantial knowledge on the design of an LV with an HRE is obtained for use in space transportation.
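
    A small illustration of the non-dominated filtering that underlies reporting Pareto-optimal designs for the two objectives named above (minimize total mass, maximize the payload-to-total-mass ratio); the candidate designs below are placeholders, not results from the study.

    ```python
    # Pareto (non-dominated) filtering step for a multi-objective GA population.
    def dominates(a, b):
        """Design a dominates b if it is no worse in both objectives
        and strictly better in at least one (mass minimized, ratio maximized)."""
        no_worse = a["mass"] <= b["mass"] and a["ratio"] >= b["ratio"]
        better = a["mass"] < b["mass"] or a["ratio"] > b["ratio"]
        return no_worse and better

    def pareto_front(designs):
        return [d for d in designs
                if not any(dominates(other, d) for other in designs)]

    population = [                              # made-up candidate designs
        {"name": "A", "mass": 12.0, "ratio": 0.010},
        {"name": "B", "mass": 14.0, "ratio": 0.014},
        {"name": "C", "mass": 13.0, "ratio": 0.009},   # dominated by A
    ]
    print([d["name"] for d in pareto_front(population)])   # ['A', 'B']
    ```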

  18. Systematic technology transfer from biology to engineering.

    PubMed

    Vincent, Julian F V; Mann, Darrell L

    2002-02-15

    Solutions to problems move only very slowly between different disciplines. Transfer can be greatly speeded up with suitable abstraction and classification of problems. Russian researchers working on the TRIZ (Teoriya Resheniya Izobretatelskikh Zadatch) method for inventive problem solving have identified systematic means of transferring knowledge between different scientific and engineering disciplines. With over 1500 person-years of effort behind it, TRIZ represents the biggest study of human creativity ever conducted, whose aim has been to establish a system into which all known solutions can be placed, classified in terms of function. At present, the functional classification structure covers nearly 3 000 000 of the world's successful patents and large proportions of the known physical, chemical and mathematical knowledge-base. Additional tools are the identification of factors which prevent the attainment of new technology, leading directly to a system of inventive principles which will resolve the impasse, a series of evolutionary trends of development, and a system of methods for effecting change in a system (Su-fields). As yet, the database contains little biological knowledge despite early recognition by the instigator of TRIZ (Genrich Altshuller) that one day it should. This is illustrated by natural systems evolved for thermal stability and the maintenance of cleanliness.

  19. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    PubMed

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
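
    As a hedged illustration of the kind of synthesis-constraint screening such a tool performs (BOOST's actual rule set and API are not shown in this record), the following sketch flags a sequence whose GC content or homopolymer runs fall outside assumed limits; the thresholds are illustrative, not BOOST's.

    ```python
    # Illustrative pre-synthesis screen against two common synthesis
    # constraints: overall GC content and homopolymer run length.
    import re

    def synthesis_violations(seq, gc_min=0.25, gc_max=0.65, max_homopolymer=8):
        seq = seq.upper()
        problems = []
        gc = (seq.count("G") + seq.count("C")) / len(seq)
        if not gc_min <= gc <= gc_max:
            problems.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
        run = re.search(r"(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((max_homopolymer,) * 4), seq)
        if run:
            problems.append(f"homopolymer run of length {len(run.group(0))} at {run.start()}")
        return problems

    # A toy sequence with an over-long poly-A stretch:
    print(synthesis_violations("ATGC" * 20 + "A" * 10))
    ```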

  20. A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring

    NASA Technical Reports Server (NTRS)

    Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.

    1992-01-01

    In this system we develop a fuzzy case-based reasoner that can build a case representation for several past detected anomalies, and we develop case retrieval methods, based on fuzzy sets, that can be used to index a relevant case when a new problem (case) is presented. The choice of fuzzy sets is justified by the uncertain data. The new problem can be solved using knowledge of the model along with the old cases. This system can then be used to generalize the knowledge from previous cases and use this generalization to refine the existing model definition. This in turn can help to detect failures using the model-based algorithms.
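
    The following toy sketch illustrates fuzzy-similarity case retrieval of the general kind described: each stored anomaly case is scored against a new observation with a triangular membership function and the best match is retrieved. The features, tolerances, and case data are invented, not actual engine data.

    ```python
    # Toy fuzzy case retrieval over made-up anomaly cases.
    def triangular(diff, tolerance):
        """Membership = 1 when diff is 0, falling linearly to 0 at +/- tolerance."""
        return max(0.0, 1.0 - abs(diff) / tolerance)

    def similarity(case, query, tolerances):
        scores = [triangular(case[k] - query[k], tolerances[k]) for k in tolerances]
        return min(scores)          # conservative (min) aggregation of memberships

    cases = [
        {"name": "coolant-leak",  "chamber_pressure": 0.78, "turbine_temp": 0.92},
        {"name": "injector-clog", "chamber_pressure": 0.55, "turbine_temp": 0.60},
    ]
    tolerances = {"chamber_pressure": 0.30, "turbine_temp": 0.40}
    query = {"chamber_pressure": 0.74, "turbine_temp": 0.88}

    best = max(cases, key=lambda c: similarity(c, query, tolerances))
    print(best["name"], round(similarity(best, query, tolerances), 2))
    ```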

  1. Re-Engineering the Tropical Rainfall Measuring Mission (TRMM) Satellite Utilizing Goddard Space Flight Center (GSFC) Mission Services Center (GMSEC) Middleware Based Technology to Enable Lights Out Operations and Autonomous Re-Dump of Lost Telemetry Data

    NASA Technical Reports Server (NTRS)

    Marius, Julio L.; Busch, Jim

    2008-01-01

    The Tropical Rainfall Measuring Mission (TRMM) spacecraft was launched in November of 1997 in order to obtain unique three-dimensional radar cross-sectional observations of cloud structures, with particular interest in hurricanes. The TRMM mission life was recently extended with current estimates that operations will continue through the 2012-2013 timeframe. Faced with this extended mission profile, the project has embarked on a technology refresh and re-engineering effort. TRMM has recently implemented a re-engineering effort to expand a middleware-based messaging architecture to enable fully redundant, lights-out flight operations. The middleware approach is based on the Goddard Mission Services Evolution Center (GMSEC) architecture, tools and associated open-source Applications Programming Interface (API). Middleware-based messaging systems are useful in spacecraft operations and automation systems because private node-based knowledge (such as that within a telemetry and command system) can be broadcast on the middleware messaging bus and hence enable collaborative decisions to be made by multiple subsystems. In this fashion, private data is made public and distributed within the local area network, and multiple nodes can remain synchronized with other nodes. This concept is useful in a fully redundant architecture whereby one node monitors the processing of the 'prime' node so that in the event of a failure the backup node can assume operations of the prime without loss of state knowledge. This paper will review and present the experiences, architecture, approach and lessons learned of the TRMM re-engineering effort centered on the GMSEC middleware architecture and tool suite. Relevant information will be presented that relates to the dual redundant parallel nature of the Telemetry and Command (T and C) and Front-End systems and how these systems can interact over a middleware bus to achieve autonomous operations, including autonomous commanding to recover missing science data during the same spacecraft contact.

  2. How Engineers Negotiate Domain Boundaries in a Complex, Interdisciplinary Engineering Project

    NASA Technical Reports Server (NTRS)

    Panther, Grace; Montfort, Devlin; Pirtle, Zachary

    2017-01-01

    Engineering educators have an essential role in preparing engineers to work in a complex, interdisciplinary workforce. While much engineering education focuses on teaching students to develop disciplinary expertise in specific engineering domains, there is a strong need to teach engineers about the knowledge that they develop or use in their work (Bucciarelli, 1994; Allenby & Sarewitz, 2011; Frodeman, 2013). The purpose of this research is to gain a better understanding of the knowledge systems of practicing engineers through observations of their practices such that the insights learned can guide future education efforts. Using an example from a complex and interdisciplinary engineering project, this paper presents a case study overviewing the types of epistemological (knowledge-acquiring or knowledge-using) complexities that engineers navigate. Specifically, we looked at a discussion of the thermal design of a CubeSat that occurred during an engineering review at NASA. We analyzed the review using a framework that we call 'peak events', or pointed discussions between reviewers, project engineers, and managers. We examined the dialog within peak events to identify the ways that knowledge was brought to bear, highlighting discussions of uncertainty and the boundaries of knowledge claims. We focus on one example discussion surrounding the thermal design of the CubeSat, which provides a particularly thorough example of a knowledge system since the engineers present explained, justified, negotiated, and defended knowledge within a social setting. Engineering students do not get much practice or instruction in explicitly negotiating knowledge systems and epistemic standards in this way. We highlight issues that should matter to engineering educators, such as the need to discuss what level of uncertainty is sufficient and the need to negotiate boundaries of system responsibility. Although this analysis is limited to a single discussion or 'peak event', our case shows that this type of discussion can occur in engineering and suggests that it could be important for future engineering education research.

  3. Preparing the NDE engineers of the future: Education, training, and diversity

    NASA Astrophysics Data System (ADS)

    Holland, Stephen D.

    2017-02-01

    As quantitative NDE has matured and entered the mainstream, it has created an industry need for engineers who can select, evaluate, and qualify NDE techniques to satisfy quantitative engineering requirements. NDE as a field is cross-disciplinary with major NDE techniques relying on a broad spectrum of physics disciplines including fluid mechanics, electromagnetics, mechanical waves, and high energy physics. An NDE engineer needs broad and deep understanding of the measurement physics across modalities, a general engineering background, and familiarity with shop-floor practices and tools. While there are a wide range of certification and training programs worldwide for NDE technicians, there are few programs aimed at engineers. At the same time, substantial demographic shifts are underway with many experienced NDE engineers and technicians nearing retirement, and with new generations coming from much more diverse backgrounds. There is a need for more and better education opportunities for NDE engineers. Both teaching and learning NDE engineering are inherently challenging because of the breadth and depth of knowledge required. At the same time, sustaining the field in a more diverse era will require broadening participation of previously underrepresented groups. The QNDE 2016 conference in Atlanta, GA included a session on NDE education, training, and diversity. This paper summarizes the outcomes and discussion from this session.

  4. Study of Design Knowledge Capture (DKC) schemes implemented in magnetic bearing applications

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A design knowledge capture (DKC) scheme was implemented using frame-based techniques. The objective of such a system is to capture not only the knowledge which describes a design, but also that which explains how the design decisions were reached. These knowledge types were labelled definitive and explanatory, respectively. Examination of the design process helped determine what knowledge to retain and at what stage that knowledge is used. A discussion of frames resulted in the recognition of their value to knowledge representation and organization. The FORMS frame system was used as a basis for further development, and for examples using magnetic bearing design. The specific contributions made by this research include: determination that frame-based systems provide a useful methodology for management and application of design knowledge; definition of specific user interface requirements (a window-based browser); specification of syntax for DKC commands; and demonstration of the feasibility of DKC by application to existing designs. It was determined that design knowledge capture could become an extremely valuable engineering tool for complicated, long-life systems, but that further work was needed, particularly the development of a graphic, window-based interface.
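
    A minimal sketch of the frame idea described above, separating definitive slots (what the design is) from explanatory slots (why a decision was made), with inheritance from a parent frame; the slot names and magnetic-bearing values are illustrative and this is not the FORMS system itself.

    ```python
    # Frame-style record for design knowledge capture: definitive vs. explanatory slots.
    class Frame:
        def __init__(self, name, parent=None):
            self.name = name
            self.parent = parent
            self.definitive = {}     # describes the design itself
            self.explanatory = {}    # records the rationale behind decisions

        def get(self, slot):
            """Look up a definitive slot, inheriting from parent frames."""
            if slot in self.definitive:
                return self.definitive[slot]
            return self.parent.get(slot) if self.parent else None

    bearing = Frame("magnetic-bearing")
    bearing.definitive["suspension"] = "active, 5-axis"

    radial = Frame("radial-actuator", parent=bearing)
    radial.definitive["air-gap-mm"] = 0.5
    radial.explanatory["air-gap-mm"] = ("chosen to bound touchdown loads; "
                                        "see trade study TS-12 (hypothetical)")

    print(radial.get("suspension"))          # inherited from the parent frame
    print(radial.explanatory["air-gap-mm"])  # captured design rationale
    ```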

  5. Design knowledge capture for a corporate memory facility

    NASA Technical Reports Server (NTRS)

    Boose, John H.; Shema, David B.; Bradshaw, Jeffrey M.

    1990-01-01

    Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information results in problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, the problem of building an intelligent, active corporate memory facility was studied; such a facility would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for the Space Station Freedom. Tools are demonstrated and extended which help automate and document engineering trade studies, and another tool is being developed to help designers interactively explore design alternatives and constraints.

  6. Knowledge Preservation and Web-tools

    NASA Technical Reports Server (NTRS)

    Moreman, Douglas; Dyer, John; Ahmad, Rashed

    1998-01-01

    We propose a library of "netbooks" as part of a national effort, preserving the wisdom of the early Space Program. NASA is losing its rocket scientists who designed the great systems of the past. Few new systems of similar ambition are being built; much of the expertise that took us to the Moon is evaporating. With retiring NASA designers, we work to preserve something of the expertise of these individuals, developed at great national cost. We show others the tools that make preservation easy and cheap. Retiring engineers and scientists can be coached into speaking (without charge) into recording devices about ideas not widely appreciated but of potential future value. Transcripts of the recordings and the audio itself are combined (cheaply) in netbooks accessible via a standard web-browser (free). Selected netbooks are indexed into a rapidly searchable system, an electronic Library. We recruit support in establishing a standards committee for that Library. The system is to be a model for access by the blind as well as for preservation of important, technical knowledge.

  7. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of the algorithms or software being developed. DESTA represents a computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tool support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software, in contrast to conventional Computer Aided Design (CAD) systems which provide low-level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high-level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  8. FTDD973: A multimedia knowledge-based system and methodology for operator training and diagnostics

    NASA Technical Reports Server (NTRS)

    Hekmatpour, Amir; Brown, Gary; Brault, Randy; Bowen, Greg

    1993-01-01

    FTDD973 (973 Fabricator Training, Documentation, and Diagnostics) is an interactive multimedia knowledge-based system and methodology for computer-aided training and certification of operators, as well as tool and process diagnostics, in IBM's CMOS SGP fabrication line (building 973). FTDD973 is an example of what can be achieved with modern multimedia workstations. Knowledge-based systems, hypertext, hypergraphics, high-resolution images, audio, motion video, and animation are technologies that in synergy can be far more useful than each by itself. FTDD973's modular and object-oriented architecture is also an example of how improvements in software engineering are finally making it possible to combine many software modules into one application. FTDD973 is developed in ExperMedia/2, an OS/2 multimedia expert system shell for domain experts.

  9. A Knowledge Engineering Approach to Analysis and Evaluation of Construction Schedules

    DTIC Science & Technology

    1990-02-01

    software engineering discipline focusing on constructing KBSs. It is an incremental and cyclical process that requires the interaction of a domain expert(s)...the U.S. Army Corps of Engineers; and (3) the project management software developer, represented by Pinnell Engineering, Inc. Since the primary...the programming skills necessary to convert the raw knowledge into a form a computer can understand. knowledge engineering: the software engineering

  10. Knowledge translation in rehabilitation engineering research and development: a knowledge ecosystem framework.

    PubMed

    Chau, Tom; Moghimi, Saba; Popovic, Milos R

    2013-01-01

    Rehabilitation engineering is concerned with technology innovations and technology-mediated treatments for the improvement of quality of care and quality of life of individuals with disability. Unlike many other fields of health research, the knowledge translation (KT) cycle of rehabilitation engineering research and development (R&D) is often considered incomplete until a technology product or technology-facilitated therapy is available to target clientele. As such, the KT journey of rehabilitation engineering R&D is extremely challenging, necessarily involving knowledge exchange among numerous players across multiple sectors. In this article, we draw on recent literature about the knowledge trichotomy in technology-based rehabilitation R&D and propose a knowledge ecosystem to frame the rehabilitation engineering KT process from need to product. Identifying the principal process of the ecosystem as one of knowledge flow, we elucidate the roles of repository and networked knowledge, identify key consumers and producers in a trinity of communities of practice, and draw on knowledge management literature to describe different knowledge flows. The article concludes with instantiations of this knowledge ecosystem for 2 local rehabilitation engineering research-development-commercialization endeavors. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  11. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBS's, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.
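
    As a rough illustration of the consistency and completeness checking mentioned above (not the contractors' actual techniques), the following sketch flags rules whose conditions nothing can establish and pairs of rules that draw contradictory conclusions from overlapping conditions; the rule base is invented.

    ```python
    # Toy static checks over a rule base: completeness and consistency.
    rules = [
        {"if": {"pressure_low"},                "then": "open_backup_valve"},
        {"if": {"pressure_low", "valve_stuck"}, "then": "not open_backup_valve"},
        {"if": {"gyro_drift"},                  "then": "switch_to_backup_gyro"},
    ]
    observable_inputs = {"pressure_low", "valve_stuck"}   # facts sensors can supply

    derivable = observable_inputs | {r["then"] for r in rules}

    # Completeness: a condition no input or rule conclusion can ever establish.
    for r in rules:
        missing = r["if"] - derivable
        if missing:
            print("unreachable rule:", r["then"], "needs", missing)

    # Consistency: two rules with overlapping conditions and opposite conclusions.
    for a in rules:
        for b in rules:
            if a["then"] == "not " + b["then"] and a["if"] >= b["if"]:
                print("potential conflict:", a["then"], "vs", b["then"])
    ```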

  12. Automation of the Environmental Control and Life Support System

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, J. Ray

    1990-01-01

    The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective includes capturing ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: 1) analyze and document the baselined ECLS system, 2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline, and 3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary in achieving minimal manpower support for ECLSS operations. In addition, Phase I included development of an advanced software life cycle in which automated knowledge acquisition, engineering, verification, and testing tools will be used in the development of the software; in this way we prepare for Phases II and III, the development and integration phases, respectively. These tools can capture ECLSS development knowledge for future use, support the development of more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.

  13. No wisdom in the crowd: genome annotation in the era of big data - current status and future prospects.

    PubMed

    Danchin, Antoine; Ouzounis, Christos; Tokuyasu, Taku; Zucker, Jean-Daniel

    2018-07-01

    Science and engineering rely on the accumulation and dissemination of knowledge to make discoveries and create new designs. Discovery-driven genome research rests on knowledge passed on via gene annotations. In response to the deluge of sequencing big data, standard annotation practice employs automated procedures that rely on majority rules. We argue this hinders progress through the generation and propagation of errors, leading investigators into blind alleys. More subtly, this inductive process discourages the discovery of novelty, which remains essential in biological research and reflects the nature of biology itself. Annotation systems, rather than being repositories of facts, should be tools that support multiple modes of inference. By combining deduction, induction and abduction, investigators can generate hypotheses when accurate knowledge is extracted from model databases. A key stance is to depart from 'the sequence tells the structure tells the function' fallacy, placing function first. We illustrate our approach with examples of critical or unexpected pathways, using MicroScope to demonstrate how tools can be implemented following the principles we advocate. We end with a challenge to the reader. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  14. Kwf-Grid workflow management system for Earth science applications

    NASA Astrophysics Data System (ADS)

    Tran, V.; Hluchy, L.

    2009-04-01

    In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within "Knowledge-based Workflow System for Grid Applications" under the 6th Framework Programme. The workflow management system is intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge that is contained in the information by means of intelligent agents; and finally reuse the joined knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g. GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing the system to manage and execute gLite jobs in the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite will allow EGEE users to use the system and benefit from its advanced features. The system has been initially tested and evaluated with applications from the ES clusters.

  15. Hybrid semi-parametric mathematical systems: bridging the gap between systems biology and process engineering.

    PubMed

    Teixeira, Ana P; Carinhas, Nuno; Dias, João M L; Cruz, Pedro; Alves, Paula M; Carrondo, Manuel J T; Oliveira, Rui

    2007-12-01

    Systems biology is an integrative science that aims at the global characterization of biological systems. Huge amounts of data regarding gene expression, protein activity and metabolite concentrations are collected by designing systematic genetic or environmental perturbations. The challenge is then to integrate such data in a global model in order to provide a global picture of the cell. The analysis of these data is largely dominated by nonparametric modelling tools. In contrast, classical bioprocess engineering has been primarily founded on first-principles models, but it has systematically overlooked the details of the embedded biological system. The full complexity of biological systems is currently assumed by systems biology, and this knowledge can now be taken up by engineers to decide how to optimally design and operate their processes. This paper discusses possible methodologies for the integration of systems biology and bioprocess engineering, with emphasis on applications involving animal cell cultures. At the mathematical systems level, the discussion is focused on hybrid semi-parametric systems as a way to bridge systems biology and bioprocess engineering.
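
    A minimal sketch of the hybrid semi-parametric idea under stated assumptions: a first-principles biomass and substrate balance (the parametric part) whose specific growth rate is supplied by a data-driven estimator (the nonparametric part, here a simple interpolation over made-up observations); this is not the authors' model.

    ```python
    # Hybrid semi-parametric sketch: dX/dt = mu(S) * X with a learned mu(S).
    def mu_nonparametric(S, table=((0.1, 0.05), (1.0, 0.30), (5.0, 0.45))):
        """Piecewise-linear estimate of specific growth rate from observations."""
        pts = sorted(table)
        if S <= pts[0][0]: return pts[0][1]
        if S >= pts[-1][0]: return pts[-1][1]
        for (s0, m0), (s1, m1) in zip(pts, pts[1:]):
            if s0 <= S <= s1:
                return m0 + (m1 - m0) * (S - s0) / (s1 - s0)

    def simulate(X0, S0, yield_coeff=0.5, dt=0.1, steps=50):
        """Euler integration of the parametric balances with the learned rate."""
        X, S = X0, S0
        for _ in range(steps):
            mu = mu_nonparametric(S)
            X += mu * X * dt                  # biomass balance (first principles)
            S -= (mu / yield_coeff) * X * dt  # substrate balance (first principles)
            S = max(S, 0.0)
        return X, S

    print(simulate(X0=0.1, S0=5.0))
    ```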

  16. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.

  17. A review on powder-based additive manufacturing for tissue engineering: selective laser sintering and inkjet 3D printing

    PubMed Central

    Shirazi, Seyed Farid Seyed; Gharehkhani, Samira; Mehrali, Mehdi; Yarmand, Hooman; Metselaar, Hendrik Simon Cornelis; Adib Kadri, Nahrizul; Osman, Noor Azuan Abu

    2015-01-01

    Since most starting materials for tissue engineering are in powder form, using powder-based additive manufacturing methods is attractive and practical. The principal point of employing additive manufacturing (AM) systems is to fabricate parts with arbitrary geometrical complexity with relatively minimal tooling cost and time. Selective laser sintering (SLS) and inkjet 3D printing (3DP) are two powerful and versatile AM techniques which are applicable to powder-based material systems. Hence, the latest state of knowledge available on the use of AM powder-based techniques in tissue engineering and their effect on mechanical and biological properties of fabricated tissues and scaffolds must be updated. Determining the effective setup of parameters, developing improved biocompatible/bioactive materials, and improving the mechanical/biological properties of laser sintered and 3D printed tissues are the three main concerns which have been investigated in this article. PMID:27877783

  18. A review on powder-based additive manufacturing for tissue engineering: selective laser sintering and inkjet 3D printing.

    PubMed

    Shirazi, Seyed Farid Seyed; Gharehkhani, Samira; Mehrali, Mehdi; Yarmand, Hooman; Metselaar, Hendrik Simon Cornelis; Adib Kadri, Nahrizul; Osman, Noor Azuan Abu

    2015-06-01

    Since most starting materials for tissue engineering are in powder form, using powder-based additive manufacturing methods is attractive and practical. The principal point of employing additive manufacturing (AM) systems is to fabricate parts with arbitrary geometrical complexity with relatively minimal tooling cost and time. Selective laser sintering (SLS) and inkjet 3D printing (3DP) are two powerful and versatile AM techniques which are applicable to powder-based material systems. Hence, the latest state of knowledge available on the use of AM powder-based techniques in tissue engineering and their effect on mechanical and biological properties of fabricated tissues and scaffolds must be updated. Determining the effective setup of parameters, developing improved biocompatible/bioactive materials, and improving the mechanical/biological properties of laser sintered and 3D printed tissues are the three main concerns which have been investigated in this article.

  19. Streptomyces species: Ideal chassis for natural product discovery and overproduction.

    PubMed

    Liu, Ran; Deng, Zixin; Liu, Tiangang

    2018-05-28

    There is considerable interest in mining organisms for new natural products (NPs) and in improving methods to overproduce valuable NPs. Because of the rapid development of tools and strategies for metabolic engineering and the markedly increased knowledge of the biosynthetic pathways and genetics of NP-producing organisms, genome mining and overproduction of NPs can be dramatically accelerated. In particular, Streptomyces species have been proposed as suitable chassis organisms for NP discovery and overproduction because of their many unique characteristics not shared with yeast, Escherichia coli, or other microorganisms. In this review, we summarize the methods for genome sequencing, gene cluster prediction, and gene editing in Streptomyces, as well as metabolic engineering strategies for NP overproduction and approaches for generating new products. Finally, two strategies for utilizing Streptomyces as the chassis for NP discovery and overproduction are emphasized. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  20. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  1. MOORE: A prototype expert system for diagnosing spacecraft problems

    NASA Technical Reports Server (NTRS)

    Howlin, Katherine; Weissert, Jerry; Krantz, Kerry

    1988-01-01

    MOORE is a rule-based, prototype expert system that assists in diagnosing operational Tracking and Data Relay Satellite (TDRS) problems. It is intended to assist spacecraft engineers at the TDRS ground terminal in troubleshooting problems that are not readily solved with routine procedures and without expert counsel. An additional goal of the prototype system is to develop in-house expert system and knowledge engineering skills. The prototype system diagnoses antenna pointing and earth pointing problems that may occur within the TDRS Attitude Control System (ACS). Plans include expansion to fault isolation of problems in the most critical subsystems of the TDRS spacecraft. Long-term benefits are anticipated with use of an expert system during future TDRS programs, including increased mission support time, reduced problem-solving time, and retained expert knowledge and experience. Phase 2 of the project is intended to provide NASA the necessary expertise and capability to define requirements, evaluate proposals, and monitor the development progress of a highly competent expert system for NASA's Tracking and Data Relay Satellite. Phase 2 also envisions addressing two unexplored applications for expert systems: spacecraft integration and test (I and T), and support to launch activities. The concept, goals, domain, tools, knowledge acquisition, developmental approach, and design of the expert system are described, along with an explanation of how NASA obtained the knowledge and capability to develop the system in-house without assistance from outside consultants. Future plans will also be presented.

  2. System Maturity and Architecture Assessment Methods, Processes, and Tools

    DTIC Science & Technology

    2012-03-02

    Ramirez-Marquez, D. Nowicki, A. Deshmukh, and M. Sarfaraz. Development of Systems Engineering Maturity Models and Management Tools. Systems Engineering Research Center Final Technical...

  3. Building an ontology of pulmonary diseases with natural language processing tools using textual corpora.

    PubMed

    Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine

    2007-01-01

    Pathologies and acts are classified in thesauri to help physicians to code their activity. In practice, the use of thesauri is not sufficient to reduce variability in coding and thesauri are not suitable for computer processing. We think the automation of the coding task requires a conceptual modeling of medical items: an ontology. Our task is to help lung specialists code acts and diagnoses with software that represents the medical knowledge of the specialty concerned by means of an ontology. The objective of the reported work was to build an ontology of pulmonary diseases dedicated to the coding process. To carry out this objective, we develop a precise methodological process for the knowledge engineer in order to build various types of medical ontologies. This process is based on the need to express precisely in natural language the meaning of each concept using differential semantics principles. A differential ontology is a hierarchy of concepts and relationships organized according to their similarities and differences. Our main research hypothesis is to apply natural language processing tools to corpora to develop the resources needed to build the ontology. We consider two corpora, one composed of patient discharge summaries and the other being a teaching book. We propose to combine two approaches to enrich the ontology building: (i) a method which consists of building terminological resources through distributional analysis and (ii) a method based on the observation of corpus sequences in order to reveal semantic relationships. Our ontology currently includes 1550 concepts and the software implementing the coding process is still under development. Results show that the proposed approach is operational and indicates that the combination of these methods and the comparison of the resulting terminological structures give interesting clues to a knowledge engineer for the building of an ontology.
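
    The fragment below is a toy illustration of the distributional-analysis idea: terms that share contexts in a corpus are candidate siblings for the ontology. The corpus and terms are invented; this is not the authors' NLP pipeline.

    ```python
    # Terms with overlapping context profiles are candidate ontology siblings.
    from collections import Counter

    corpus = ("chronic asthma treated with inhaled steroids . "
              "chronic bronchitis treated with antibiotics . "
              "acute asthma treated with bronchodilators .").split()

    def contexts(term, window=1):
        """Counter of words appearing within `window` tokens of `term`."""
        ctx = Counter()
        for i, tok in enumerate(corpus):
            if tok == term:
                ctx.update(corpus[max(0, i - window):i] + corpus[i + 1:i + 1 + window])
        return ctx

    def shared_contexts(a, b):
        """Size of the overlap between the two terms' context profiles."""
        return sum((contexts(a) & contexts(b)).values())

    print(shared_contexts("asthma", "bronchitis"))  # shared contexts: candidate siblings
    print(shared_contexts("asthma", "steroids"))    # no shared contexts
    ```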

  4. Industrial Adoption of Model-Based Systems Engineering: Challenges and Strategies

    NASA Astrophysics Data System (ADS)

    Maheshwari, Apoorv

    As design teams become more globally integrated, one of the biggest challenges is to communicate efficiently across the team. The increasing complexity and multi-disciplinary nature of the products are also making it difficult to keep track of all the information generated during the design process by these global team members. Systems engineers have identified Model-Based Systems Engineering (MBSE) as a possible solution, in which the emphasis is placed on the application of visual modeling methods and best practices to systems engineering (SE) activities from the beginning of the conceptual design phase through to the end of the product lifecycle. Despite several advantages, there are multiple challenges restricting the adoption of MBSE by industry. We mainly consider the following two challenges: a) industry perceives MBSE as just a diagramming tool and does not see much value in MBSE; b) industrial adopters are skeptical about whether products developed using an MBSE approach will be accepted by the regulatory bodies. To provide counter-evidence to the former challenge, we developed a generic framework for translation from an MBSE tool (Systems Modeling Language, SysML) to an analysis tool (Agent-Based Modeling, ABM). The translation is demonstrated using a simplified air traffic management problem and provides an example of potentially significant value: the ability to use MBSE representations directly in an analysis setting. For the latter challenge, we are developing a reference model that uses SysML to represent a generic infusion pump and the SE process for planning, developing, and obtaining regulatory approval of a medical device. This reference model demonstrates how regulatory requirements can be captured effectively through model-based representations. We will present another case study at the end, where we apply the knowledge gained from both case studies to a UAV design problem.

  5. Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design

    NASA Technical Reports Server (NTRS)

    Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.

    1991-01-01

    Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six-degrees-of-freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists, and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Macintosh HyperCard to serve as a knowledge capture tool and data store.
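
    A small sketch of the checklist-automation pattern described above: each check compares a simulation output against a documented tolerance and keeps its rationale attached to the result, so the "why" travels with the "what". The parameters, limits, and rationale strings are hypothetical.

    ```python
    # Checklist-style QA automation sketch with rationale carried alongside each check.
    checks = [
        # (parameter,           limit (abs), rationale)
        ("meco_velocity_error", 1.5, "velocity residual must stay within guidance margin"),
        ("attitude_rate_error", 0.2, "rate error bounded by control authority at MECO"),
    ]

    simulation_output = {"meco_velocity_error": 0.7, "attitude_rate_error": 0.35}

    def run_qa(output, checks):
        report = []
        for name, limit, why in checks:
            value = output[name]
            status = "PASS" if abs(value) <= limit else "FAIL"
            report.append((status, name, value, limit, why))
        return report

    for status, name, value, limit, why in run_qa(simulation_output, checks):
        print(f"{status}: {name} = {value} (limit {limit}) -- {why}")
    ```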

  6. Crosscutting Development- EVA Tools and Geology Sample Acquisition

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  7. Systems Engineering Knowledge Asset (SEKA) Management for Higher Performing Engineering Teams: People, Process and Technology toward Effective Knowledge-Workers

    ERIC Educational Resources Information Center

    Shelby, Kenneth R., Jr.

    2013-01-01

    Systems engineering teams' value-creation for enterprises is slower than possible due to inefficiencies in communication, learning, common knowledge collaboration and leadership conduct. This dissertation outlines the surrounding people, process and technology dimensions for higher performing engineering teams. It describes a true experiment…

  8. Retraining the Modern Civil Engineer.

    ERIC Educational Resources Information Center

    Priscoli, Jerome Delli

    1983-01-01

    Discusses why modern engineering requires social science and the nature of planning. After these conceptual discussions, 12 practical tools which social science brings to engineering are reviewed. A tested approach to training engineers in these tools is then described. Tools include institutional analysis, policy profiling, and other impact…

  9. Empirical relationship between electrical resistivity and geotechnical parameters: A case study of Federal University of Technology campus, Akure SW, Nigeria

    NASA Astrophysics Data System (ADS)

    Akintorinwa, O. J.; Oluwole, S. T.

    2018-06-01

    For several decades, geophysical prospecting methods coupled with geotechnical analysis have become increasingly useful in evaluating the subsurface for both pre- and post-engineering investigations. Shallow geophysical tools are often used alongside geotechnical methods to evaluate subsurface soil for engineering studies, providing information that may include the subsurface lithology and layer thicknesses, the competence of the bedrock and the depth to its upper interface, and the competence of the materials that make up the overburden, especially the shallow section that hosts the foundations of engineering structures (Aina et al., 1996; Adewumi and Olorunfemi, 2005; Idornigie et al., 2006). This information helps engineers correctly locate and design the foundations of engineering structures. It also serves as a guide to the choice of design and suitable materials needed for road construction (Akinlabi and Adeyemi, 2014). Lack of knowledge of subsurface properties may lead to the failure of engineering structures. It is therefore of great importance to carry out a pre-construction investigation of a proposed site in order to ascertain the fitness of the host earth material.
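
    To make the notion of an "empirical relationship" concrete, the sketch below fits a power-law curve between resistivity and a geotechnical parameter by least squares. It is a minimal sketch assuming synthetic placeholder values; the numbers and the functional form are not taken from the Akure study.

```python
# Illustrative sketch only: fitting an empirical power-law relationship of the
# form N = a * rho**b between apparent electrical resistivity (rho, ohm-m) and
# a geotechnical parameter such as SPT N-value. The values below are synthetic
# placeholders, not data from the Akure study.
import numpy as np

rho = np.array([55.0, 120.0, 210.0, 340.0, 500.0])   # apparent resistivity, ohm-m
N   = np.array([9.0, 14.0, 19.0, 25.0, 31.0])        # SPT blow counts (synthetic)

# Linearize with logarithms: log N = log a + b * log rho, then least-squares fit.
b, log_a = np.polyfit(np.log(rho), np.log(N), 1)
a = np.exp(log_a)
print(f"N ~ {a:.2f} * rho^{b:.2f}")

# Use the fitted curve to estimate N at a new resistivity value.
print("predicted N at 150 ohm-m:", round(a * 150.0**b, 1))
```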

  10. GeneView: a comprehensive semantic search engine for PubMed.

    PubMed

    Thomas, Philippe; Starlinger, Johannes; Vowinkel, Alexander; Arzt, Sebastian; Leser, Ulf

    2012-07-01

    Research results are primarily published in scientific literature, and curation efforts cannot keep up with the rapid growth of published literature. A wealth of knowledge thus remains hidden in large text repositories like MEDLINE. Consequently, life scientists have to spend a great amount of time searching for specific information. The enormous ambiguity among names of biomedical objects such as genes, chemicals, and diseases often produces search results that are overly large and unspecific. We present GeneView, a semantic search engine for biomedical knowledge. GeneView is built upon a comprehensively annotated version of PubMed abstracts and openly available PubMed Central full texts. This semi-structured representation of biomedical texts enables a number of features that extend classical search engines. For instance, users may search for entities using unique database identifiers, or they may rank documents by the number of specific mentions they contain. Annotation is performed by a multitude of state-of-the-art text-mining tools for recognizing mentions from 10 entity classes and for identifying protein-protein interactions. GeneView currently contains annotations for >194 million entities from 10 classes for ∼21 million citations with 271,000 full text bodies. GeneView can be searched at http://bc3.informatik.hu-berlin.de/.
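
    The mention-count ranking mentioned above can be illustrated with a few lines of code. This is a toy sketch assuming an invented annotation table; it is not GeneView's data model or API.

```python
# Illustrative sketch only: ranking abstracts by the number of mentions of a
# queried entity, one of the features a GeneView-style semantic search exposes.
# The toy "annotations" below are invented, not GeneView data.
from collections import Counter

# document id -> list of (entity_class, normalized_id) annotations
annotations = {
    "PMID:1": [("gene", "EGFR"), ("gene", "EGFR"), ("disease", "D011230")],
    "PMID:2": [("gene", "EGFR")],
    "PMID:3": [("chemical", "C0040")],
}

def rank_by_entity(query_id, docs):
    counts = Counter({doc: sum(1 for _, eid in ents if eid == query_id)
                      for doc, ents in docs.items()})
    return [doc for doc, n in counts.most_common() if n > 0]

print(rank_by_entity("EGFR", annotations))   # ['PMID:1', 'PMID:2']
```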

  11. Supporting interoperability of collaborative networks through engineering of a service-based Mediation Information System (MISE 2.0)

    NASA Astrophysics Data System (ADS)

    Benaben, Frederick; Mu, Wenxin; Boissel-Dallier, Nicolas; Barthe-Delanoe, Anne-Marie; Zribi, Sarah; Pingaud, Herve

    2015-08-01

    The Mediation Information System Engineering project is currently finishing its second iteration (MISE 2.0). The main objective of this scientific project is to provide any emerging collaborative situation with methods and tools to deploy a Mediation Information System (MIS). MISE 2.0 aims at defining and designing a service-based platform, dedicated to initiating and supporting the interoperability of collaborative situations among potential partners. This MISE 2.0 platform implements a model-driven engineering approach to the design of a service-oriented MIS dedicated to supporting the collaborative situation. This approach is structured in three layers, each providing their own key innovative points: (i) the gathering of individual and collaborative knowledge to provide appropriate collaborative business behaviour (key point: knowledge management, including semantics, exploitation and capitalisation), (ii) deployment of a mediation information system able to computerise the previously deduced collaborative processes (key point: the automatic generation of collaborative workflows, including connection with existing devices or services) (iii) the management of the agility of the obtained collaborative network of organisations (key point: supervision of collaborative situations and relevant exploitation of the gathered data). MISE covers business issues (through BPM), technical issues (through an SOA) and agility issues of collaborative situations (through EDA).

  12. Integration Framework of Process Planning based on Resource Independent Operation Summary to Support Collaborative Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo

    2004-06-01

    In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
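
    The paper's prototype is implemented in Java and XML; the sketch below uses Python purely for brevity to illustrate what an AND/OR process-plan graph and its linearization look like. It is a greedy toy that simplifies the paper's hybrid heuristic/linear-programming approach, and all node names and costs are invented.

```python
# Illustrative sketch only: a tiny AND/OR process-plan graph and a greedy
# linearization that keeps the cheapest OR branch. Node names and costs are
# invented; the paper uses a hybrid heuristic/linear-programming approach.

graph = {
    "part":   ("AND", ["pocket", "hole"]),              # both children required
    "pocket": ("OR",  ["mill_pocket", "edm_pocket"]),   # choose one alternative
    "hole":   ("OR",  ["drill", "bore"]),
    "mill_pocket": ("LEAF", 4.0), "edm_pocket": ("LEAF", 9.0),
    "drill":       ("LEAF", 1.5), "bore":       ("LEAF", 2.5),
}

def linearize(node):
    """Return (ordered list of operations, total cost) for a node."""
    kind, data = graph[node]
    if kind == "LEAF":
        return [node], data
    if kind == "AND":                      # concatenate all required branches
        ops, cost = [], 0.0
        for child in data:
            child_ops, child_cost = linearize(child)
            ops += child_ops
            cost += child_cost
        return ops, cost
    # OR node: keep the cheapest alternative
    return min((linearize(child) for child in data), key=lambda oc: oc[1])

print(linearize("part"))   # (['mill_pocket', 'drill'], 5.5)
```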

  13. A knowledge-based design framework for airplane conceptual and preliminary design

    NASA Astrophysics Data System (ADS)

    Anemaat, Wilhelmus A. J.

    The goal of the work described herein is to develop the second generation of Advanced Aircraft Analysis (AAA) into an object-oriented structure which can be used in different environments. One such environment is the third generation of AAA with its own user interface; the other environment, with the same AAA methods (i.e., the knowledge), is the AAA-AML program. AAA-AML automates the initial airplane design process using current AAA methods in combination with AMRaven methodologies for dependency tracking and knowledge management, using the TechnoSoft Adaptive Modeling Language (AML). This leads to the following benefits: (1) Reduced design time: computer-aided design methods can reduce design and development time and replace tedious hand calculations. (2) Better product through improved design: more alternative designs can be evaluated in the same time span, which can lead to improved quality. (3) Reduced design cost: less training and fewer calculation errors yield substantial savings in design time and related cost. (4) Improved efficiency: the design engineer can avoid technically correct but irrelevant calculations on incomplete or out-of-sync information, particularly if the process enables robust geometry earlier. Although numerous advancements in knowledge-based design have been developed for detailed design, no integrated knowledge-based conceptual and preliminary airplane design system currently exists. The third-generation AAA methods have been tested over a ten-year period on many different airplane designs, and using AAA methods will be shown to yield significant time savings. The AAA-AML system will be exercised and tested using 27 existing airplanes ranging from single-engine propeller airplanes and business jets to airliners, UAVs, and fighters. Data from the various sizing methods will be compared with AAA results to validate these methods. One new design, a Light Sport Aircraft (LSA), will be developed as an exercise in using the tool to design a new airplane. Using these tools will show an improvement in efficiency over using separate programs, owing to the automatic recalculation triggered by any change of input data. The direct visual feedback of 3D geometry in AAA-AML will lead to quicker resolution of problems than conventional methods.

  14. Transnational Discourses of Knowledge and Learning in Professional Work: Examples from Computer Engineering

    ERIC Educational Resources Information Center

    Nerland, Monika

    2010-01-01

    Taking a Foucauldian framework as its point of departure, this paper discusses how transnational discourses of knowledge and learning operate in the profession of computer engineering and form a certain logic through which modes of being an engineer are regulated. Both the knowledge domain of computer engineering and its related labour market is…

  15. Autonomous, In-Flight Crew Health Risk Management for Exploration-Class Missions: Leveraging the Integrated Medical Model for the Exploration Medical System Demonstration Project

    NASA Technical Reports Server (NTRS)

    Butler, D. J.; Kerstman, E.; Saile, L.; Myers, J.; Walton, M.; Lopez, V.; McGrath, T.

    2011-01-01

    The Integrated Medical Model (IMM) captures organizational knowledge across the space medicine, training, operations, engineering, and research domains. IMM uses this knowledge in the context of a mission and crew profile to forecast risks to crew health and mission success. The IMM establishes a quantified, statistical relationship among medical conditions, risk factors, available medical resources, and crew health and mission outcomes. These relationships may provide an appropriate foundation for developing an in-flight medical decision support tool that helps optimize the use of medical resources and assists in overall crew health management by an autonomous crew with extremely limited interactions with ground support personnel and no chance of resupply.
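
    The quantified relationship between condition probabilities, resource use, and mission outcomes can be illustrated with a toy Monte Carlo forecast. This is a minimal sketch assuming invented conditions, probabilities, and kit sizes; it is not the IMM model or its data.

```python
# Illustrative sketch only: a toy Monte Carlo forecast in the spirit of the
# IMM, relating per-mission incidence probabilities and resource use to the
# chance of exhausting a medical kit. Conditions, probabilities and kit size
# are invented placeholders, not IMM data.
import random

conditions = {              # condition: (probability per mission, kit units used)
    "space motion sickness":   (0.60, 1),
    "back pain":               (0.30, 1),
    "urinary tract infection": (0.05, 3),
}
KIT_UNITS = 3

def one_mission(rng):
    used = sum(units for p, units in conditions.values() if rng.random() < p)
    return used > KIT_UNITS          # True if the kit is exhausted

rng = random.Random(42)
trials = 100_000
risk = sum(one_mission(rng) for _ in range(trials)) / trials
print(f"estimated probability of exhausting the kit: {risk:.3f}")
```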

  16. User observations on information sharing (corporate knowledge and lessons learned)

    NASA Technical Reports Server (NTRS)

    Montague, Ronald A.; Gregg, Lawrence A.; Martin, Shirley A.; Underwood, Leroy H.; Mcgee, John M.

    1993-01-01

    The sharing of 'corporate knowledge' and lessons learned in the NASA aerospace community has been identified by Johnson Space Center survey participants as a desirable tool. The concept of the program is based on creating a user friendly information system that will allow engineers, scientists, and managers at all working levels to share their information and experiences with other users irrespective of location or organization. The survey addresses potential end uses for such a system and offers some guidance on the development of subsequent processes to ensure the integrity of the information shared. This system concept will promote sharing of information between NASA centers, between NASA and its contractors, between NASA and other government agencies, and perhaps between NASA and institutions of higher learning.

  17. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1992-05-01

    Report documentation keywords: methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering. Only fragments of the abstract are recoverable: "...and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be... evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key..."

  18. Strategic Integration of Multiple Bioinformatics Resources for System Level Analysis of Biological Networks.

    PubMed

    D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia

    2017-01-01

    Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform, Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) ( http://lynx.cri.uchicago.edu ), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.
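
    One of the analyses named above, enrichment analysis, is commonly based on a hypergeometric test. The sketch below shows such a test in its simplest form; the gene counts are invented and the code is not Lynx's API.

```python
# Illustrative sketch only: a basic hypergeometric enrichment test of the kind
# used in gene-set enrichment analysis. The gene counts below are invented;
# this is not Lynx's API or its algorithms.
from math import comb

def enrichment_p(k, n, K, N):
    """P(X >= k) when drawing n genes from N, of which K are in the pathway."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# 8 of 40 user genes fall in a 200-gene pathway, out of a 20,000-gene background.
print(f"p = {enrichment_p(8, 40, 200, 20000):.2e}")
```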

  19. Recent advances in systems metabolic engineering tools and strategies.

    PubMed

    Chae, Tong Un; Choi, So Young; Kim, Je Woong; Ko, Yoo-Sung; Lee, Sang Yup

    2017-10-01

    Metabolic engineering has been playing increasingly important roles in developing microbial cell factories for the production of various chemicals and materials to achieve a sustainable chemical industry. Nowadays, many tools and strategies are available for performing systems metabolic engineering that allows systems-level metabolic engineering in more sophisticated and diverse ways by adopting rapidly advancing methodologies and tools of systems biology, synthetic biology and evolutionary engineering. As an outcome, development of more efficient microbial cell factories has become possible. Here, we review recent advances in systems metabolic engineering tools and strategies together with accompanying application examples. In addition, we describe how these tools and strategies work together in simultaneous and synergistic ways to develop novel microbial cell factories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. A Framework for Creating a Function-based Design Tool for Failure Mode Identification

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Knowledge of potential failure modes during design is critical for the prevention of failures. Currently, industries use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis, or Failure Modes, Effects and Criticality Analysis (FMECA), as well as knowledge and experience, to determine potential failure modes. When new products are being developed, there is often a lack of sufficient knowledge of potential failure modes and/or a lack of sufficient experience to identify all failure modes. This gives rise to a situation in which engineers are unable to extract maximum benefit from the above procedures. This work describes a function-based failure identification methodology, which acts as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration, as well as enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.
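
    The "storehouse" idea described above can be pictured as a lookup from product functions to historically observed failure modes. The sketch below is a minimal illustration with invented function/failure pairs; it is not the paper's catalog or data.

```python
# Illustrative sketch only: a function-based failure-mode "storehouse" that
# returns historically observed failure modes for the functions in a new
# design. The function/failure pairs below are invented examples.
from collections import Counter

history = [                         # (product function, observed failure mode)
    ("convert electrical energy", "open circuit"),
    ("convert electrical energy", "insulation breakdown"),
    ("transmit torque", "fatigue fracture"),
    ("transmit torque", "fatigue fracture"),
    ("store liquid", "seal leakage"),
]

def likely_failure_modes(functions):
    modes = Counter(mode for fn, mode in history if fn in functions)
    return modes.most_common()

print(likely_failure_modes({"transmit torque", "store liquid"}))
# [('fatigue fracture', 2), ('seal leakage', 1)]
```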

  1. Technology to improve quality and accountability.

    PubMed

    Kay, Jonathan

    2006-01-01

    A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.

  2. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently, software engineers are the essential integrating factors tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools rather than the tools empowering the engineers. Key issues in software engineering are quality, managing the software engineering process, and productivity. A strategy for addressing these issues is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.

  3. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly, with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the middleware in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.
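
    The client-mediator-server idea can be shown in miniature: a mediator maps a unified query onto each system's own field names and merges the results. The sketch below is an illustrative toy; the system names, fields, and records are invented and are unrelated to the CASE tools discussed.

```python
# Illustrative sketch only: a toy mediator that resolves a unified query
# against two heterogeneous sources and merges the results. All field names
# and records are invented for illustration.

HIS = [{"pat_id": "123", "pat_name": "DOE^JANE"}]            # hospital IS records
RIS = [{"patient": "123", "exam": "CHEST XR", "acc": "A9"}]  # radiology IS records

MAPPINGS = {   # unified field -> per-source field name
    "HIS": {"patient_id": "pat_id", "name": "pat_name"},
    "RIS": {"patient_id": "patient", "exam": "exam", "accession": "acc"},
}
SOURCES = {"HIS": HIS, "RIS": RIS}

def mediated_query(patient_id):
    merged = {"patient_id": patient_id}
    for src, records in SOURCES.items():
        fmap = MAPPINGS[src]
        for rec in records:
            if rec.get(fmap["patient_id"]) == patient_id:
                for unified, local in fmap.items():
                    if unified != "patient_id" and local in rec:
                        merged[unified] = rec[local]
    return merged

print(mediated_query("123"))
# {'patient_id': '123', 'name': 'DOE^JANE', 'exam': 'CHEST XR', 'accession': 'A9'}
```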

  4. Design mentoring tool.

    DOT National Transportation Integrated Search

    2011-01-01

    In 2004 a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers mentoring new engineers in the INDOT design process and improve their technical competency. This approach saves se...

  5. Microplastics in the environment: What can we learn from a decade of engineered nanoparticle fate and risk assessment?

    NASA Astrophysics Data System (ADS)

    Hüffer, T.; Praetorius, A.; Wagner, S.; von der Kammer, F.; Hofmann, T.

    2016-12-01

    The field of environmental fate and risk assessment is frequently dominated by "hot topics" of emerging contaminants; in recent years for example pharmaceuticals, nanomaterials or, most recently, microplastics. Since no emerging pollutant is entirely new, a careful assessment of existing knowledge on related substances can help us direct our research efforts and employ the limited resources in a more efficient way. Crucial questions on the environmental implications of microplastics, for example the need for analytical tools, adequate protocols to study their fate, or the effects of aging and a risk assessment based thereon remain largely unanswered. Over the last decade, the field of environmental implications of engineered nanoparticles (ENPs) has been facing similar challenges. The goal of this contribution is to suggest a road-map to pursue the risk assessment of microplastics based on our experience in one decade in ENPs research. We highlight how to avoid potential dead-ends in microplastics research. We also illustrate that cross-linking other research fields, especially polymer chemistry and material sciences, may facilitate filling the urgent knowledge gaps.

  6. Users manual for an expert system (HSPEXP) for calibration of the hydrological simulation program; Fortran

    USGS Publications Warehouse

    Lumb, A.M.; McCammon, R.B.; Kittle, J.L.

    1994-01-01

    Expert system software was developed to assist less experienced modelers with calibration of a watershed model and to provide the interaction between the modeler and the modeling process that mathematical optimization alone does not offer. A prototype was developed with artificial intelligence software tools, a knowledge engineer, and two domain experts. The manual procedures used by the domain experts were identified, and the prototype was then coded by the knowledge engineer. The expert system consists of a set of hierarchical rules designed to guide the calibration of the model through a systematic evaluation of model parameters. When the prototype was completed and tested, it was rewritten for portability and operational use and was named HSPEXP. The watershed model Hydrological Simulation Program--Fortran (HSPF) is used in the expert system. This report is the user's manual for HSPEXP and contains a discussion of the concepts, along with detailed steps and examples for using the software. The system has been tested on watersheds in the States of Washington and Maryland; the system correctly identified the model parameters to be adjusted, and the adjustments led to improved calibration.
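
    The flavor of HSPEXP's hierarchical calibration rules can be suggested with two toy checks: correct the total water balance first, then look at storm volumes. The thresholds, error measures, and suggested HSPF parameters (LZSN, UZSN) below are placeholders, not the actual expert-system rule base.

```python
# Illustrative sketch only: two toy calibration rules in the spirit of a
# hierarchical calibration advisor. Thresholds, error measures and the
# suggested HSPF parameters are placeholders, not HSPEXP's rule base.

def percent_error(sim, obs):
    return 100.0 * (sum(sim) - sum(obs)) / sum(obs)

def calibration_advice(sim_annual, obs_annual, sim_storm, obs_storm):
    advice = []
    total_err = percent_error(sim_annual, obs_annual)
    if abs(total_err) > 10:                      # rule 1: total water balance first
        direction = "increase" if total_err > 0 else "decrease"
        advice.append(f"{direction} LZSN (total runoff error {total_err:+.1f}%)")
    else:                                        # rule 2: only then look at storms
        storm_err = percent_error(sim_storm, obs_storm)
        if abs(storm_err) > 15:
            direction = "increase" if storm_err > 0 else "decrease"
            advice.append(f"{direction} UZSN (storm volume error {storm_err:+.1f}%)")
    return advice or ["no adjustment suggested"]

print(calibration_advice([520, 480], [450, 430], [60, 70], [65, 72]))
```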

  7. A comparative analysis of user preference-based and existing knowledge management systems attributes in the aerospace industry

    NASA Astrophysics Data System (ADS)

    Varghese, Nishad G.

    Knowledge management (KM) exists in various forms throughout organizations. Process documentation, training courses, and experience sharing are examples of KM activities performed daily. The goal of KM systems (KMS) is to provide a tool set which serves to standardize the creation, sharing, and acquisition of business critical information. Existing literature provides numerous examples of targeted evaluations of KMS, focusing on specific system attributes. This research serves to bridge the targeted evaluations with an industry-specific, holistic approach. The user preferences of aerospace employees in engineering and engineering-related fields were compared to profiles of existing aerospace KMS based on three attribute categories: technical features, system administration, and user experience. The results indicated there is a statistically significant difference between aerospace user preferences and existing profiles in the user experience attribute category, but no statistically significant difference in the technical features and system administration attribute categories. Additional analysis indicated in-house developed systems exhibit higher technical features and user experience ratings than commercial-off-the-shelf (COTS) systems.

  8. The Humanistic Side of Engineering: Considering Social Science and Humanities Dimensions of Engineering in Education and Research

    ERIC Educational Resources Information Center

    Hynes, Morgan; Swenson, Jessica

    2013-01-01

    Mathematics and science knowledge/skills are most commonly associated with engineering's pre-requisite knowledge. Our goals in this paper are to argue for a more systematic inclusion of social science and humanities knowledge in the introduction of engineering to K-12 students. As part of this argument, we present a construct for framing the…

  9. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering, civil engineering, and structural engineering as a profession within civil engineering have faced and continue to face an emerging need for "Raising the Bar" of preparedness of young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge should require young structural engineers to have at least a Masters-Level education. This study focuses on Masters-Level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of the young engineer's preparation beyond the undergraduate program for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer which identifies and recognizes the needs of the profession, along with the profession's expectations of how those needs can be achieved in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the initial five years of experience following completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge which will be useful to several groups seeking to better ensure the preparedness of future young structural engineers at the Masters-Level.

  10. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  11. Space shuttle main engine anomaly data and inductive knowledge based systems: Automated corporate expertise

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1987-01-01

    Progress is reported on the development of SCOTTY, an expert knowledge-based system to automate the analysis procedure following test firings of the Space Shuttle Main Engine (SSME). The integration of a large-scale relational data base system, a computer graphics interface for experts and end-user engineers, potential extension of the system to flight engines, application of the system for training of newly-hired engineers, technology transfer to other engines, and the essential qualities of good software engineering practices for building expert knowledge-based systems are among the topics discussed.

  12. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.
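
    The combination of a setpoint controller with a transient limiter, which TTECTrA automates, can be pictured with a toy loop. The gains, limits, and first-order "engine" below are invented for illustration and are not the NASA tool's design.

```python
# Illustrative sketch only: a setpoint controller combined with a transient
# limiter, in the spirit of what TTECTrA automates. Gains, limits and the toy
# first-order "engine" are invented, not the NASA tool's design.

def simulate(setpoint=1.0, steps=200, dt=0.02):
    speed, fuel, integ = 0.0, 0.0, 0.0
    kp, ki = 2.0, 1.5
    max_fuel_rate = 0.05                           # limiter on fuel-flow change per step
    for _ in range(steps):
        err = setpoint - speed
        integ += err * dt
        fuel_cmd = kp * err + ki * integ           # PI controller
        fuel_cmd = max(min(fuel_cmd, 1.2), 0.0)    # saturation limits
        rate = max(min(fuel_cmd - fuel, max_fuel_rate), -max_fuel_rate)
        fuel += rate                               # rate limiter
        speed += dt * (fuel - speed) * 3.0         # toy first-order engine response
    return speed

print(f"final speed: {simulate():.3f}")
```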

  13. Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples

    NASA Astrophysics Data System (ADS)

    Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.

    2012-12-01

    The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool that can put numbers to, i.e. quantify, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology for verifying the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.

  14. Using Authentic Science in the Classroom: NASA's Coordinated Efforts to Enhance STEM Education

    NASA Astrophysics Data System (ADS)

    Lawton, B.; Schwerin, T.; Low, R.

    2015-11-01

    A key NASA education goal is to attract and retain students in science, technology, engineering, and mathematics (STEM) disciplines. When teachers engage students in the examination of authentic data derived from NASA satellite missions, they simultaneously build 21st century technology skills as well as core content knowledge about the Earth and space. In this session, we highlight coordinated efforts by NASA Science Mission Directorate (SMD) Education and Public Outreach (EPO) programs to enhance educator accessibility to data resources, distribute state-of-the-art data tools, and expand pathways for educators to find and use data resources. The group discussion explores how NASA SMD EPO efforts can further improve teacher access to authentic NASA data, identifies the types of tools and lessons most requested by the community, and explores how communication and collaboration between product developers and classroom educators using data tools and products can be enhanced.

  15. Advancing secondary metabolite biosynthesis in yeast with synthetic biology tools.

    PubMed

    Siddiqui, Michael S; Thodey, Kate; Trenchard, Isis; Smolke, Christina D

    2012-03-01

    Secondary metabolites are an important source of high-value chemicals, many of which exhibit important pharmacological properties. These valuable natural products are often difficult to synthesize chemically and are commonly isolated through inefficient extractions from natural biological sources. As such, they are increasingly targeted for production by biosynthesis from engineered microorganisms. The budding yeast species Saccharomyces cerevisiae has proven to be a powerful microorganism for heterologous expression of biosynthetic pathways. S. cerevisiae's usefulness as a host organism is owed in large part to the wealth of knowledge accumulated over more than a century of intense scientific study. Yet many challenges are currently faced in engineering yeast strains for the biosynthesis of complex secondary metabolites. However, synthetic biology is advancing the development of new tools for constructing, controlling, and optimizing complex metabolic pathways in yeast. Here, we review how the coupling between yeast biology and synthetic biology is advancing the use of S. cerevisiae as a microbial host for the construction of secondary metabolic pathways. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  16. Design mentoring tool : [technical summary].

    DOT National Transportation Integrated Search

    2011-01-01

    In 2004 a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers mentoring new engineers in the INDOT design process and improve their technical competency. This approach saves seni...

  17. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
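
    The four standard PROforma objects can be pictured as simple task classes enacted in sequence. The sketch below is a toy rendering of that description; it is not the PROforma language, its syntax, or its semantics.

```python
# Illustrative sketch only: the four PROforma task classes (decision, plan,
# action, enquiry) rendered as toy Python dataclasses with a naive enactment
# loop. This mirrors the description above; it is not the PROforma language.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Enquiry:                 # request data from the user or record
    name: str
    ask: Callable[[], dict]

@dataclass
class Decision:                # choose a candidate based on gathered data
    name: str
    choose: Callable[[dict], str]

@dataclass
class Action:                  # an external act, e.g. prescribe or refer
    name: str
    do: Callable[[dict], None]

@dataclass
class Plan:                    # an ordered container of other tasks
    name: str
    tasks: List[object] = field(default_factory=list)

def enact(plan: Plan):
    data = {}
    for task in plan.tasks:
        if isinstance(task, Enquiry):
            data.update(task.ask())
        elif isinstance(task, Decision):
            data[task.name] = task.choose(data)
        elif isinstance(task, Action):
            task.do(data)
    return data

plan = Plan("chest pain pathway", [
    Enquiry("history", lambda: {"troponin_high": True}),
    Decision("triage", lambda d: "refer" if d["troponin_high"] else "observe"),
    Action("notify", lambda d: print("decision:", d["triage"])),
])
print(enact(plan))
```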

  18. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life-cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper reviews the current configuration of these tools and provides information on future milestones and directions.

  19. OWLing Clinical Data Repositories With the Ontology Web Language

    PubMed Central

    Pastor, Xavier; Lozano, Esther

    2014-01-01

    Background: The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. Objective: The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. Methods: We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Results: Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. Conclusions: OntoCRF is a complete framework to build data repositories with a solid relational storage. Driven by ontologies, OntoCRF deals with complexity and change more flexibly and efficiently than traditional systems and does not require highly skilled technical staff, which facilitates the engineering of clinical software systems. PMID:25599697

  20. OWLing Clinical Data Repositories With the Ontology Web Language.

    PubMed

    Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther

    2014-08-01

    The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. OntoCRF is a complete framework to build data repositories with a solid relational storage. Driven by ontologies, OntoCRF deals with complexity and change more flexibly and efficiently than traditional systems and does not require highly skilled technical staff, which facilitates the engineering of clinical software systems.

  1. T.R.I.C.K.-Tire/Road Interaction Characterization & Knowledge - A tool for the evaluation of tire and vehicle performances in outdoor test sessions

    NASA Astrophysics Data System (ADS)

    Farroni, Flavio

    2016-05-01

    The most powerful engine, the most sophisticated aerodynamic devices, or the most complex control systems will not improve vehicle performance if the forces exchanged with the road are not optimized through proper employment and knowledge of the tires. The vehicle's interface with the ground consists of a few small contact patches, each about as wide as a palm, through which the tire/road interaction forces are exchanged. It is therefore clear that optimizing tire behavior is a key factor in defining the best setup of the whole vehicle. Nowadays, people and companies in the automotive sector are looking for the optimal way to model and understand tire behavior in both experimental and simulation environments. The studies carried out and the tool developed herein demonstrate a new approach to tire characterization and vehicle simulation procedures, enabling the reproduction of the dynamic response of a tire from dedicated track sessions carried out with the aim of employing the vehicle as a moving laboratory. The final product, the TRICK tool (Tire/Road Interaction Characterization and Knowledge), comprises a vehicle model which processes experimental signals acquired from the vehicle CAN bus and from additional sideslip-angle estimation instrumentation. The output of the tool is a set of additional "virtual telemetry" channels, based on the time history of the acquired signals and containing force and slip estimations, useful for characterizing tire interaction. TRICK results can be integrated with the physical models developed by the Vehicle Dynamics UniNa research group, providing a multitude of working solutions and constituting an ideal instrument for the prediction and simulation of real tire dynamics.
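
    The derivation of "virtual telemetry" channels from logged signals can be illustrated with a slip-ratio and single-track slip-angle estimate. The sketch below assumes invented signal names, wheel radius, and geometry; it is not the actual TRICK formulation.

```python
# Illustrative sketch only: turning logged vehicle-CAN-style signals into the
# kind of "virtual telemetry" channels described (longitudinal slip ratio and a
# single-track front slip-angle estimate). Signal names, wheel radius and
# geometry are invented placeholders, not the actual TRICK formulation.
from math import atan2, degrees

R_WHEEL = 0.33      # effective rolling radius, m (assumed)
A_FRONT = 1.2       # CG-to-front-axle distance, m (assumed)

def virtual_telemetry(sample):
    vx, vy = sample["vx"], sample["vy"]           # body velocities, m/s
    yaw_rate = sample["yaw_rate"]                 # rad/s
    omega = sample["wheel_speed_fl"]              # front-left wheel speed, rad/s
    slip_ratio = (omega * R_WHEEL - vx) / max(vx, 0.1)
    front_slip_deg = sample["steer_deg"] - degrees(atan2(vy + yaw_rate * A_FRONT, vx))
    return {"slip_ratio": slip_ratio, "front_slip_angle_deg": front_slip_deg}

sample = {"vx": 25.0, "vy": 0.4, "yaw_rate": 0.12,
          "wheel_speed_fl": 77.0, "steer_deg": 2.0}
print(virtual_telemetry(sample))
```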

  2. Investigating Knowledge Creation Technology in an Engineering Course

    ERIC Educational Resources Information Center

    Jalonen, Satu; Lakkala, Minna; Paavola, Sami

    2011-01-01

    The aim of the present study was to examine the technological affordances of a web-based collaborative learning technology, Knowledge Practices Environment (KPE), for supporting different dimensions of knowledge creation processes. KPE was used by engineering students in a practically oriented undergraduate engineering course. The study…

  3. Knowledge Engineering (Or, Catching Black Cats in Dark Rooms).

    ERIC Educational Resources Information Center

    Ruyle, Kim E.

    1993-01-01

    Discusses knowledge engineering, its relationship to artificial intelligence, and possible applications to developing expert systems, job aids, and technical training. The educational background of knowledge engineers is considered; the role of subject matter experts is described; and examples of flow charts, lists, and pictorial representations…

  4. Computational fluid dynamics: An engineering tool?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D., Jr.

    1982-06-01

    Computational fluid dynamics in general, and time dependent finite difference techniques in particular, are examined from the point of view of direct engineering applications. Examples are given of the supersonic blunt body problem and gasdynamic laser calculations, where such techniques are clearly engineering tools. In addition, Navier-Stokes calculations of chemical laser flows are discussed as an example of a near engineering tool. Finally, calculations of the flowfield in a reciprocating internal combustion engine are offered as a promising future engineering application of computational fluid dynamics.
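
    The simplest relative of the time-dependent finite-difference techniques discussed is an explicit scheme for the 1-D heat equation, sketched below. Grid size, diffusivity, and initial condition are arbitrary illustrative choices.

```python
# Illustrative sketch only: an explicit time-dependent finite-difference
# solution of the 1-D heat equation, the simplest relative of the time-marching
# schemes discussed. Grid size, diffusivity and initial condition are arbitrary.
import numpy as np

nx, alpha, dx, dt = 51, 1.0, 0.02, 1e-4          # stable: alpha*dt/dx**2 = 0.25
u = np.zeros(nx)
u[nx // 2] = 1.0                                  # initial hot spot

for _ in range(500):                              # march forward in time
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

print(f"peak temperature after 500 steps: {u.max():.4f}")
```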

  5. Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis

    NASA Astrophysics Data System (ADS)

    Gluhih, I. N.; Akhmadulin, R. K.

    2017-07-01

    One of the pressing directions for improving the efficiency of production processes and enterprise management is the creation and use of corporate knowledge bases. The article proposes the concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in a PO CKB, the use of a case-based reasoning approach is encouraged. Under this approach, the content of a case as a knowledge base component has been defined; based on a situation tree, a PO CKB knowledge model has been developed in which knowledge about typical situations, as well as specific examples of situations and solutions, is represented. A generalized structural chart of a problem-oriented corporate knowledge base and possible modes of its operation are suggested. The resulting models allow corporate knowledge bases to be created and used to support decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms "situation" and "solution" adopted in the work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It is suggested that the developed models be used to build corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
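
    The case-based reasoning approach described above can be illustrated with nearest-case retrieval over a tiny case base. The situation attributes, weights, and cases below are invented examples, not content from the article.

```python
# Illustrative sketch only: weighted nearest-case retrieval over a tiny
# problem-oriented case base, in the spirit of the case-based reasoning
# approach described. Attributes, weights and cases are invented examples.

cases = [
    {"situation": {"symptom": "pressure drop", "system": "pump", "season": "winter"},
     "solution": "inspect suction line for icing; check strainer"},
    {"situation": {"symptom": "pressure drop", "system": "pipeline", "season": "summer"},
     "solution": "survey pipeline segment for leakage"},
    {"situation": {"symptom": "overheating", "system": "pump", "season": "summer"},
     "solution": "check cooling circuit and bearing lubrication"},
]

WEIGHTS = {"symptom": 0.5, "system": 0.3, "season": 0.2}   # attribute importance

def similarity(case_situation, query):
    return sum(w for k, w in WEIGHTS.items() if case_situation.get(k) == query.get(k))

def retrieve(query):
    return max(cases, key=lambda c: similarity(c["situation"], query))

query = {"symptom": "pressure drop", "system": "pump", "season": "summer"}
print(retrieve(query)["solution"])   # closest case: the pump pressure-drop one
```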

  6. A case study of the knowledge transfer practices from the perspectives of highly experienced engineers in the aerospace industry

    NASA Astrophysics Data System (ADS)

    Martin, Deloris

    Purpose. The purpose of this study was to describe the existing knowledge transfer practices in selected aerospace companies as perceived by highly experienced engineers retiring from the company. Specifically it was designed to investigate and describe (a) the processes and procedures used to transfer knowledge, (b) the systems that encourage knowledge transfer, (c) the impact of management actions on knowledge transfer, and (d) constraining factors that might impede knowledge transfer. Methodology. A descriptive case study was the methodology applied in this study. Qualitative data were gathered from highly experienced engineers from 3 large aerospace companies in Southern California. A semistructured interview was conducted face-to-face with each participant in a private or semiprivate, non-workplace setting to obtain each engineer's perspectives on his or her company's current knowledge transfer practices. Findings. The participants in this study preferred to transfer knowledge using face-to-face methods, one-on-one, through actual troubleshooting and problem-solving scenarios. Managers in these aerospace companies were observed as having knowledge transfer as a low priority; they tend not to promote knowledge transfer among their employees. While mentoring is the most common knowledge transfer system these companies offer, it is not the preferred method of knowledge transfer among the highly experienced engineers. Job security and schedule pressures are the top constraints that impede knowledge transfer between the highly experienced engineers and their coworkers. Conclusions. The study data support the conclusion that the highly experienced engineers in the study's aerospace companies would more likely transfer their knowledge to those remaining in the industry if the transfer could occur face-to-face with management support and acknowledgement of their expertise and if their job security is not threatened. The study also supports the conclusion that managers should be responsible for the leadership in developing a knowledge-sharing culture and rewarding those who do share. Recommendations. It is recommended that a quantitative study of highly experienced engineers in aerospace be conducted to determine the degree to which knowledge-sharing methods, processes, and procedures may be effective in capturing their knowledge. It is also recommended that a replication of this study be undertaken to include the perspectives of first-line managers on developing a knowledge-sharing culture for the aerospace industry.

  7. The responsibilities of engineers.

    PubMed

    Smith, Justin; Gardoni, Paolo; Murphy, Colleen

    2014-06-01

    Knowledge of the responsibilities of engineers is the foundation for answering ethical questions about the work of engineers. This paper defines the responsibilities of engineers by considering what constitutes the nature of engineering as a particular form of activity. Specifically, this paper focuses on the ethical responsibilities of engineers qua engineers. Such responsibilities refer to the duties acquired in virtue of being a member of a group. We examine the practice of engineering, drawing on the idea of practices developed by philosopher Alasdair MacIntyre, and show how the idea of a practice is important for identifying and justifying the responsibilities of engineers. To demonstrate the contribution that knowledge of the responsibilities of engineers makes to engineering ethics, a case study from structural engineering is discussed. The discussion of the failure of the Sleipner A Platform off the coast of Norway in 1991 demonstrates how the responsibilities of engineers can be derived from knowledge of the nature of engineering and its context.

  8. An Engineering Innovation Tool: Providing Science Educators a Picture of Engineering in Their Classroom

    ERIC Educational Resources Information Center

    Ross, Julia Myers; Peterman, Karen; Daugherty, Jenny L.; Custer, Rodney L.

    2018-01-01

    An Engineering Innovation Tool was designed to support science teachers as they navigate the opportunities and challenges the inclusion of engineering affords by providing a useful tool to be used within the professional development environment and beyond. The purpose of this manuscript is to share the design, development and substance of the tool…

  9. Simulation-Based e-Learning Tools for Science, Engineering, and Technology Education (SimBeLT)

    NASA Astrophysics Data System (ADS)

    Davis, Doyle V.; Cherner, Y.

    2006-12-01

    The focus of Project SimBeLT is the research, development, testing, and dissemination of a new type of simulation-based integrated e-learning set of modules for two-year college technical and engineering curricula in the areas of thermodynamics, fluid physics, and fiber optics that can also be used in secondary schools and four-year colleges. A collection of sophisticated virtual labs is the core component of the SimBeLT modules. These labs will be designed to enhance the understanding of technical concepts and underlying fundamental principles of these topics, as well as to master certain performance based skills online. SimBeLT software will help educators to meet the National Science Education Standard that "learning science and technology is something that students do, not something that is done to them". A major component of Project SimBeLT is the development of multi-layered technology-oriented virtual labs that realistically mimic workplace-like environments. Dynamic data exchange between simulations will be implemented and links with instant instructional messages and data handling tools will be realized. A second important goal of Project SimBeLT labs is to bridge technical skills and scientific knowledge by enhancing the teaching and learning of specific scientific or engineering subjects. SimBeLT builds upon research and outcomes of interactive teaching strategies and tools developed through prior NSF funding (http://webphysics.nhctc.edu/compact/index.html) (Project SimBeLT is partially supported by a grant from the National Science Foundation DUE-0603277)

  10. EM-31 RETRIEVAL KNOWLEDGE CENTER MEETING REPORT: MOBILIZE AND DISLODGE TANK WASTE HEELS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fellinger, A.

    2010-02-16

    The Retrieval Knowledge Center sponsored a meeting in June 2009 to review challenges and gaps to retrieval of tank waste heels. The facilitated meeting was held at the Savannah River Research Campus with personnel broadly representing tank waste retrieval knowledge at Hanford, Savannah River, Idaho, and Oak Ridge. This document captures the results of this meeting. In summary, it was agreed that the challenges to retrieval of tank waste heels fell into two broad categories: (1) mechanical heel waste retrieval methodologies and equipment and (2) understanding and manipulating the heel waste (physical, radiological, and chemical characteristics) to support retrieval options and subsequent processing. Recent successes and lessons from deployments of the Sand and Salt Mantis vehicles as well as retrieval of C-Area tanks at Hanford were reviewed. Suggestions to address existing retrieval approaches that utilize a limited set of tools and techniques are included in this report. The meeting found that there had been very little effort to improve or integrate the multiple proven or new techniques and tools available into a menu of available methods for rapid insertion into baselines. It is recommended that focused developmental efforts continue in the two areas underway (low-level mixing evaluation and pumping slurries with large solid materials) and that projects to demonstrate new/improved tools be launched to outfit tank farm operators with the needed tools to complete tank heel retrievals effectively and efficiently. This document describes the results of a meeting held on June 3, 2009 at the Savannah River Site in South Carolina to identify technology gaps and potential technology solutions to retrieving high-level waste (HLW) heels from waste tanks within the complex of sites run by the U.S. Department of Energy (DOE). The meeting brought together personnel with extensive tank waste retrieval knowledge from DOE's four major waste sites - Hanford, Savannah River, Idaho, and Oak Ridge. The meeting was arranged by the Retrieval Knowledge Center (RKC), which is a technology development project sponsored by the Office of Technology Innovation & Development - formerly the Office of Engineering and Technology - within the DOE Office of Environmental Management (EM).

  11. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey

    PubMed Central

    Vasconcelos, Hemerson Bruno da Silva; Woods, David John

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292

  12. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey.

    PubMed

    Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.

  13. Thrust Area Report, Engineering Research, Development and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R. T.

    1997-02-01

    The mission of the Engineering Research, Development, and Technology Program at Lawrence Livermore National Laboratory (LLNL) is to develop the knowledge base, process technologies, specialized equipment, tools and facilities to support current and future LLNL programs. Engineering's efforts are guided by a strategy that results in dual benefit: first, in support of Department of Energy missions, such as national security through nuclear deterrence; and second, in enhancing the nation's economic competitiveness through our collaboration with U.S. industry in pursuit of the most cost-effective engineering solutions to LLNL programs. To accomplish this mission, the Engineering Research, Development, and Technology Program has two important goals: (1) identify key technologies relevant to LLNL programs where we can establish unique competencies, and (2) conduct high-quality research and development to enhance our capabilities and establish ourselves as the world leaders in these technologies. To focus Engineering's efforts, technology thrust areas are identified and technical leaders are selected for each area. The thrust areas are comprised of integrated engineering activities, staffed by personnel from the nine electronics and mechanical engineering divisions, and from other LLNL organizations. This annual report, organized by thrust area, describes Engineering's activities for fiscal year 1996. The report provides timely summaries of objectives, methods, and key results from eight thrust areas: Computational Electronics and Electromagnetics; Computational Mechanics; Microtechnology; Manufacturing Technology; Materials Science and Engineering; Power Conversion Technologies; Nondestructive Evaluation; and Information Engineering. Readers desiring more information are encouraged to contact the individual thrust area leaders or authors. 198 refs., 206 figs., 16 tabs.

  14. Submarine pipeline on-bottom stability. Volume 2: Software and manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, largely because of research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach for weight coating design, which can be used with confidence because the tools have been developed based on full scale and near full scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. These include: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of the software are also included in Volume two.

  15. Toward systems metabolic engineering of Aspergillus and Pichia species for the production of chemicals and biofuels.

    PubMed

    Caspeta, Luis; Nielsen, Jens

    2013-05-01

    Recently genome sequence data have become available for Aspergillus and Pichia species of industrial interest. This has stimulated the use of systems biology approaches for large-scale analysis of the molecular and metabolic responses of Aspergillus and Pichia under defined conditions, which has resulted in much new biological information. Case-specific contextualization of this information has been performed using comparative and functional genomic tools. Genomics data are also the basis for constructing genome-scale metabolic models, and these models have helped in the contextualization of knowledge on the fundamental biology of Aspergillus and Pichia species. Furthermore, with the availability of these models, the engineering of Aspergillus and Pichia is moving from traditional approaches, such as random mutagenesis, to a systems metabolic engineering approach. Here we review the recent trends in systems biology of Aspergillus and Pichia species, highlighting the relevance of these developments for systems metabolic engineering of these organisms for the production of hydrolytic enzymes, biofuels and chemicals from biomass. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. New frontiers in design synthesis

    NASA Technical Reports Server (NTRS)

    Goldin, D. S.; Venneri, S. L.; Noor, A. K.

    1999-01-01

    The Intelligent Synthesis Environment (ISE), which is one of the major strategic technologies under development at NASA centers and the University of Virginia, is described. One of the major objectives of ISE is to significantly enhance the rapid creation of innovative affordable products and missions. ISE uses a synergistic combination of leading-edge technologies, including high performance computing, high capacity communications and networking, human-centered computing, knowledge-based engineering, computational intelligence, virtual product development, and product information management. The environment will link scientists, design teams, manufacturers, suppliers, and consultants who participate in the mission synthesis as well as in the creation and operation of the aerospace system. It will radically advance the process by which complex science missions are synthesized, and high-tech engineering systems are designed, manufactured and operated. The five major components critical to ISE are human-centered computing, infrastructure for distributed collaboration, rapid synthesis and simulation tools, life cycle integration and validation, and cultural change in both the engineering and science creative process. The five components and their subelements are described. Related U.S. government programs are outlined and the future impact of ISE on engineering research and education is discussed.

  17. Efficient Results in Semantic Interoperability for Health Care. Findings from the Section on Knowledge Representation and Management.

    PubMed

    Soualmia, L F; Charlet, J

    2016-11-10

    To summarize excellent current research in the field of Knowledge Representation and Management (KRM) within the health and medical care domain. We provide a synopsis of the 2016 IMIA selected articles as well as a related synthetic overview of the current and future field activities. A first step of the selection was performed through MEDLINE querying with a list of MeSH descriptors completed by a list of terms adapted to the KRM section. The second step of the selection was completed by the two section editors who separately evaluated the set of 1,432 articles. The third step of the selection consisted of a collective work that merged the evaluation results to retain 15 articles for peer-review. The selection and evaluation process of this Yearbook's section on Knowledge Representation and Management has yielded four excellent and interesting articles regarding semantic interoperability for health care by gathering heterogeneous sources (knowledge and data) and auditing ontologies. In the first article, the authors present a solution based on standards and Semantic Web technologies to access distributed and heterogeneous datasets in the domain of breast cancer clinical trials. The second article describes a knowledge-based recommendation system that relies on ontologies and Semantic Web rules in the context of dietary management of chronic diseases. The third article is related to concept recognition and text mining to derive a model of common human diseases and a phenotypic network of common diseases. In the fourth article, the authors highlight the need for auditing SNOMED CT. They propose to use a crowd-based method for ontology engineering. The current research activities further illustrate the continuous convergence of Knowledge Representation and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care by proposing solutions to cope with the problem of semantic interoperability. Indeed, there is a need for powerful tools able to manage and interpret complex, large-scale and distributed datasets and knowledge bases, but also a need for user-friendly tools developed for clinicians in their daily practice.

  18. Materials science tools for regenerative medicine

    NASA Astrophysics Data System (ADS)

    Richardson, Wade Nicholas

    Regenerative therapies originating from recent technological advances in biology could revolutionize medicine in the coming years. In particular, the advent of human pluripotent stem cells (hPSCs), with their ability to become any cell in the adult body, has opened the door to an entirely new way of treating disease. However, currently these medical breakthroughs remain only a promise. To make them a reality, new tools must be developed to surmount the new technical hurdles that have arisen from the dramatic departure from convention that this field represents. The collected work presented in this dissertation covers several projects that seek to apply the skills and knowledge of materials science to this tool-building effort. The work is divided into three chapters. The first deals with our work to apply Raman spectroscopy, a tool widely used for materials characterization, to degeneration in cartilage. We have shown that Raman can effectively distinguish the matrix material of healthy and diseased tissue. The second area of work covered is the development of a new confocal image analysis for studying hPSC colonies that are chemically confined to uniform growth regions. This tool has important application in understanding the heterogeneity that may slow the development of hPSC-based treatments, as well as in the use of such confinement in the eventual large-scale manufacture of hPSCs for therapeutic use. Third, the use of structural templating in tissue engineering scaffolds is detailed. We have utilized templating to tailor scaffold structures for engineering of constructs mimicking two tissues: cartilage and lung. The work described here represents several important early steps towards large goals in regenerative medicine. These tools show a great deal of potential for accelerating progress in a field that seems on the cusp of helping a great many people with otherwise incurable disease.

  19. Risk Management Implementation Tool

    NASA Technical Reports Server (NTRS)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to assess continually what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach the staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. With these steps and the various methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertise and advocate the use of RMIT at each NASA center.

  20. LIS Professionals as Knowledge Engineers.

    ERIC Educational Resources Information Center

    Poulter, Alan; And Others

    1994-01-01

    Considers the role of library and information science professionals as knowledge engineers. Highlights include knowledge acquisition, including personal experience, interviews, protocol analysis, observation, multidimensional sorting, printed sources, and machine learning; knowledge representation, including production rules and semantic nets;…

  1. Biomaterial-mesenchymal stem cell constructs for immunomodulation in composite tissue engineering.

    PubMed

    Hanson, Summer; D'Souza, Rena N; Hematti, Peiman

    2014-08-01

    Cell-based treatments are being developed as a novel approach for the treatment of many diseases in an effort to repair injured tissues and regenerate lost tissues. Interest in the potential use of multipotent progenitor or stem cells has grown significantly in recent years, specifically the use of mesenchymal stem cells (MSCs), for tissue engineering in combination with extracellular matrix-based scaffolds. An area that warrants further attention is the local or systemic host responses toward the implanted cell-biomaterial constructs. Such immunological responses could play a major role in determining the clinical efficacy of the therapeutic device or biomaterials used. MSCs, due to their unique immunomodulatory properties, hold great promise in tissue engineering as they not only directly participate in tissue repair and regeneration but also modulate the host foreign body response toward the engineered constructs. The purpose of this review was to summarize the current state of knowledge and applications of MSC-biomaterial constructs as a potential immunoregulatory tool in tissue engineering. Better understanding of the interactions between biomaterials and cells could translate to the development of clinically relevant and novel cell-based therapeutics for tissue reconstruction and regenerative medicine.

  2. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  3. Debugging expert systems using a dynamically created hypertext network

    NASA Technical Reports Server (NTRS)

    Boyle, Craig D. B.; Schuette, John F.

    1991-01-01

    The labor-intensive nature of expert system writing and debugging motivated this study. The hypothesis is that a hypertext-based debugging tool is easier and faster to use than one traditional tool, the graphical execution trace. HESDE (Hypertext Expert System Debugging Environment) uses hypertext nodes and links to represent the objects and their relationships created during the execution of a rule-based expert system. HESDE operates transparently on top of the CLIPS (C Language Integrated Production System) rule-based system environment and is used during the knowledge base debugging process. During the execution process HESDE builds an execution trace. The facts, rules, and values used are automatically stored in a hypertext network for each execution cycle. After the execution process, the knowledge engineer may access and browse the hypertext network created. The network may be viewed in terms of rules, facts, and values. An experiment was conducted to compare HESDE with a graphical debugging environment. Subjects were given representative tasks. For speed and accuracy, in eight of the eleven tasks given to subjects, HESDE was significantly better.
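
    As an aside, the trace-to-network recording described above can be sketched very compactly. The following Python fragment is purely illustrative and is not the HESDE or CLIPS implementation; the node kinds, link labels, and rule names are invented. It shows one way an execution trace could be stored as nodes and links per inference cycle and then browsed from any node.

        # Illustrative sketch (not the actual HESDE/CLIPS code): recording an
        # execution trace as a hypertext-style network of rule and fact nodes
        # linked per inference cycle, which a browser can then navigate.

        class TraceNetwork:
            def __init__(self):
                self.nodes = {}   # node id -> {"kind": ..., "label": ...}
                self.links = []   # (source, target, relation, cycle)

            def add_node(self, node_id, kind):
                self.nodes.setdefault(node_id, {"kind": kind, "label": node_id})

            def record_firing(self, cycle, rule, matched_facts, asserted_facts):
                """Store one rule firing: the facts it matched and the facts it asserted."""
                self.add_node(rule, "rule")
                for fact in matched_facts:
                    self.add_node(fact, "fact")
                    self.links.append((fact, rule, "matched-by", cycle))
                for fact in asserted_facts:
                    self.add_node(fact, "fact")
                    self.links.append((rule, fact, "asserts", cycle))

            def neighbours(self, node_id):
                """Browse outward from one node, as a hypertext viewer would."""
                return [(s, r, t) for (s, t, r, _) in self.links if node_id in (s, t)]

        trace = TraceNetwork()
        trace.record_firing(1, "check-pressure", ["pressure-high"], ["open-valve"])
        trace.record_firing(2, "log-action", ["open-valve"], ["event-logged"])
        print(trace.neighbours("open-valve"))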

  4. Automated payload experiment tool feasibility study

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Clark, James; Delugach, Harry; Hammons, Charles; Logan, Julie; Provancha, Anna

    1991-01-01

    To achieve an environment less dependent on the flow of paper, automated techniques of data storage and retrieval must be utilized. The prototype under development seeks to demonstrate the ability of a knowledge-based, hypertext computer system. This prototype is concerned with the logical links between two primary NASA support documents, the Science Requirements Document (SRD) and the Engineering Requirements Document (ERD). Once developed, the final system should have the ability to guide a principal investigator through the documentation process in a more timely and efficient manner, while supplying more accurate information to the NASA payload developer.

  5. Plowshare

    DOE R&D Accomplishments Database

    Teller, E.

    1963-02-04

    The purpose of this lecture is to give an impression of the main characteristic feature of Plowshare: its exceedingly wide applicability throughout fields of economic or scientific interest. If one wants to find the right applications, knowledge of the nuclear tool is not enough. One needs to have a thorough familiarity with the materials, with the processes, with all of science, with all the economics on our globe and maybe beyond. A survey is presented of all aspects of peaceful applications of nuclear explosives: earth moving, large-scale chemical and mining engineering, and scientific experiments. (D.L.C.)

  6. The potential application of the blackboard model of problem solving to multidisciplinary design

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1989-01-01

    Problems associated with the sequential approach to multidisciplinary design are discussed. A blackboard model is suggested as a potential tool for implementing the multilevel decomposition approach to overcome these problems. The blackboard model serves as a global database for the solution with each discipline acting as a knowledge source for updating the solution. With this approach, it is possible for engineers to improve the coordination, communication, and cooperation in the conceptual design process, allowing them to achieve a more optimal design from an interdisciplinary standpoint.
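
    A minimal sketch of the blackboard pattern described above may help make it concrete. The example below is hypothetical and is not the cited implementation; the disciplines, variable names, and formulas are invented. Each knowledge source reads the shared solution and posts updates until nothing changes.

        # Minimal blackboard sketch: a shared solution that discipline-specific
        # knowledge sources read and update until a fixed point is reached.
        # All numbers and relations here are invented for illustration.

        blackboard = {"wing_area": 30.0, "weight": None, "required_lift": None}

        def structures(bb):
            """Structures discipline: estimate weight from wing area."""
            update = {"weight": 120.0 * bb["wing_area"]}
            return update if update["weight"] != bb["weight"] else {}

        def aerodynamics(bb):
            """Aerodynamics discipline: lift requirement follows the weight estimate."""
            if bb["weight"] is None:
                return {}
            update = {"required_lift": 1.1 * bb["weight"]}
            return update if update["required_lift"] != bb["required_lift"] else {}

        knowledge_sources = [structures, aerodynamics]

        changed = True
        while changed:
            changed = False
            for source in knowledge_sources:
                updates = source(blackboard)
                if updates:
                    blackboard.update(updates)
                    changed = True

        print(blackboard)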

  7. An investigation of multitasking information behavior and the influence of working memory and flow

    NASA Astrophysics Data System (ADS)

    Alexopoulou, Peggy; Hepworth, Mark; Morris, Anne

    2015-02-01

    This study explored the multitasking information behaviour of Web users and how this is influenced by working memory, flow and Personal, Artefact and Task characteristics, as described in the PAT model. The research was exploratory, using a pragmatic, mixed method approach. Thirty university students participated: 10 psychologists, 10 accountants and 10 mechanical engineers. The data collection tools used were: pre and post questionnaires, a working memory test, a flow state scale test, audio-visual data, web search logs, think aloud data, observation, and the critical decision method. All participants searched for information on the Web on four topics: two for which they had prior knowledge and two for which they did not. Perception of task complexity was found to be related to working memory. People with low working memory reported a significant increase in task complexity after they had completed information searching tasks for which they had no prior knowledge; this was not the case for tasks with prior knowledge. Regarding flow and task complexity, the results confirmed the suggestion of the PAT model (Finneran and Zhang, 2003), which proposed that a complex task can lead to anxiety and low flow levels as well as to perceived challenge and high flow levels. However, the results did not confirm the suggestion of the PAT model regarding the characteristics of web search systems, especially perceived vividness. All participants experienced high vividness. According to the PAT model, however, only people with high flow should experience high levels of vividness. Flow affected the degree of change in the participants' knowledge: people with high flow gained more knowledge on tasks without prior knowledge than people with low flow. Furthermore, accountants felt that tasks without prior knowledge were less complex at the end of the web seeking procedure than did psychologists and mechanical engineers. Finally, the three disciplines appeared to differ in multitasking information behaviour characteristics such as queries, web search sessions and opened tabs/windows.

  8. Metabolic Engineering of Oleaginous Yeasts for Production of Fuels and Chemicals.

    PubMed

    Shi, Shuobo; Zhao, Huimin

    2017-01-01

    Oleaginous yeasts have been increasingly explored for production of chemicals and fuels via metabolic engineering. Particularly, there is a growing interest in using oleaginous yeasts for the synthesis of lipid-related products due to their high lipogenesis capability, robustness, and ability to utilize a variety of substrates. Most of the metabolic engineering studies in oleaginous yeasts focused on Yarrowia, which already has plenty of genetic engineering tools. However, recent advances in systems biology and synthetic biology have provided new strategies and tools to engineer those oleaginous yeasts that have naturally high lipid accumulation but lack genetic tools, such as Rhodosporidium, Trichosporon, and Lipomyces. This review highlights recent accomplishments in metabolic engineering of oleaginous yeasts and recent advances in the development of genetic engineering tools in oleaginous yeasts within the last 3 years.

  9. Education on electrical phenomena involved in electroporation-based therapies and treatments: a blended learning approach.

    PubMed

    Čorović, Selma; Mahnič-Kalamiza, Samo; Miklavčič, Damijan

    2016-04-07

    Electroporation-based applications require multidisciplinary expertise and collaboration of experts with different professional backgrounds in engineering and science. Beginning in 2003, an international scientific workshop and postgraduate course, Electroporation-Based Technologies and Treatments (EBTT), has been organized at the University of Ljubljana to facilitate transfer of knowledge from leading experts to researchers, students and newcomers in the field of electroporation. In this paper we present one of the integral parts of EBTT: an e-learning practical work we developed to complement delivery of knowledge via lectures and laboratory work, thus providing a blended learning approach on electrical phenomena involved in electroporation-based therapies and treatments. The learning effect was assessed via a pre- and post e-learning examination test composed of 10 multiple choice questions (i.e. items). The e-learning practical work session and both of the e-learning examination tests were carried out after the live EBTT lectures and other laboratory work. Statistical analysis was performed to compare and evaluate the learning effect measured in two groups of students: (1) electrical engineers and (2) natural scientists (i.e. medical doctors, biologists and chemists) undergoing the e-learning practical work in the 2011-2014 academic years. Item analysis was performed to assess the difficulty of each item of the examination test. The results of our study show that the total score on the post examination test significantly improved and the item difficulty in both experimental groups decreased. The natural scientists reached the same level of knowledge (no statistical difference in total post-examination test score) on the post-course test as the electrical engineers did, although the engineers started with a statistically higher total pre-test examination score, as expected. The main objective of this study was to investigate whether the educational content presented in the e-learning practical work enhanced the knowledge that students with different professional backgrounds had acquired via lectures during EBTT. We compared the learning effect assessed in two experimental groups undergoing the e-learning practical work: electrical engineers and natural scientists. The same level of knowledge on the post-course examination was reached in both groups. The results indicate that our e-learning platform, supported by a blended learning approach, provides an effective learning tool for populations with mixed professional backgrounds and thus plays an important role in bridging the gap between scientific domains involved in electroporation-based technologies and treatments.

  10. An investigation of constraint-based component-modeling for knowledge representation in computer-aided conceptual design

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1990-01-01

    Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. The use of symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitates such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function- and subroutine-calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both the geometry and physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane. The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
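
    The multi-directional reading of a declarative relationship mentioned above can be illustrated with a small sketch. The code below is not from Rubber Airplane; the relation and values are invented. A single stated relation a = b * c is solved for whichever one variable happens to be unknown, rather than always assigning to a left-hand side.

        # Toy constraint propagation: the relation a = b * c fills in whichever
        # single variable is unknown.  Names and numbers are illustrative only.

        def propagate_product(values, a, b, c):
            """Enforce values[a] == values[b] * values[c] by filling in one unknown."""
            known = {name for name in (a, b, c) if values.get(name) is not None}
            if known == {b, c}:
                values[a] = values[b] * values[c]
            elif known == {a, c} and values[c] != 0:
                values[b] = values[a] / values[c]
            elif known == {a, b} and values[b] != 0:
                values[c] = values[a] / values[b]
            return values

        # weight = density * volume, stated once but usable in any direction
        state = {"weight": None, "density": 2700.0, "volume": 0.004}
        print(propagate_product(state, "weight", "density", "volume"))   # fills weight

        state = {"weight": 10.8, "density": None, "volume": 0.004}
        print(propagate_product(state, "weight", "density", "volume"))   # fills density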

  11. A Proposal to Develop Interactive Classification Technology

    NASA Technical Reports Server (NTRS)

    deBessonet, Cary

    1998-01-01

    Research for the first year was oriented towards: 1) the design of an interactive classification tool (ICT); and 2) the development of an appropriate theory of inference for use in ICT technology. The general objective was to develop a theory of classification that could accommodate a diverse array of objects, including events and their constituent objects. Throughout this report, the term "object" is to be interpreted in a broad sense to cover any kind of object, including living beings, non-living physical things, events, even ideas and concepts. The idea was to produce a theory that could serve as the uniting fabric of a base technology capable of being implemented in a variety of automated systems. The decision was made to employ two technologies under development by the principal investigator, namely, SMS (Symbolic Manipulation System) and SL (Symbolic Language) [see deBessonet, 1991, for detailed descriptions of SMS and SL]. The plan was to enhance and modify these technologies for use in an ICT environment. As a means of giving focus and direction to the proposed research, the investigators decided to design an interactive, classificatory tool for use in building accessible knowledge bases for selected domains. Accordingly, the proposed research was divisible into tasks that included: 1) the design of technology for classifying domain objects and for building knowledge bases from the results automatically; 2) the development of a scheme of inference capable of drawing upon previously processed classificatory schemes and knowledge bases; and 3) the design of a query/search module for accessing the knowledge bases built by the inclusive system. The interactive tool for classifying domain objects was to be designed initially for textual corpora, with a view to having the technology eventually be used in robots to build sentential knowledge bases that would be supported by inference engines specially designed for the natural or man-made environments in which the robots would be called upon to operate.

  12. 40 CFR 1065.410 - Maintenance limits for stabilized test engines.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... engineering grade tools to identify bad engine components. Any equipment, instruments, or tools used for... no longer use it as an emission-data engine. Also, if your test engine has a major mechanical failure... your test engine has a major mechanical failure that requires you to take it apart, you may no longer...

  13. 40 CFR 1065.410 - Maintenance limits for stabilized test engines.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... engineering grade tools to identify bad engine components. Any equipment, instruments, or tools used for... no longer use it as an emission-data engine. Also, if your test engine has a major mechanical failure... your test engine has a major mechanical failure that requires you to take it apart, you may no longer...

  14. A demonstration of expert systems applications in transportation engineering : volume I, transportation engineers and expert systems.

    DOT National Transportation Integrated Search

    1987-01-01

    Expert systems, a branch of artificial-intelligence studies, is introduced with a view to its relevance in transportation engineering. Knowledge engineering, the process of building expert systems or transferring knowledge from human experts to compu...

  15. Real-time diagnostics for a reusable rocket engine

    NASA Technical Reports Server (NTRS)

    Guo, T. H.; Merrill, W.; Duyar, A.

    1992-01-01

    A hierarchical, decentralized diagnostic system is proposed for the Real-Time Diagnostic System component of the Intelligent Control System (ICS) for reusable rocket engines. The proposed diagnostic system has three layers of information processing: condition monitoring, fault mode detection, and expert system diagnostics. The condition monitoring layer is the first level of signal processing. Here, important features of the sensor data are extracted. These processed data are then used by the higher level fault mode detection layer to do preliminary diagnosis on potential faults at the component level. Because of the closely coupled nature of the rocket engine propulsion system components, it is expected that a given engine condition may trigger more than one fault mode detector. Expert knowledge is needed to resolve the conflicting reports from the various failure mode detectors. This is the function of the diagnostic expert layer. Here, the heuristic nature of this decision process makes it desirable to use an expert system approach. Implementation of the real-time diagnostic system described above requires a wide spectrum of information processing capability. Generally, in the condition monitoring layer, fast data processing is often needed for feature extraction and signal conditioning. This is usually followed by some detection logic to determine the selected faults on the component level. Three different techniques are used to attack different fault detection problems in the NASA LeRC ICS testbed simulation. The first technique employed is the neural network application for real-time sensor validation which includes failure detection, isolation, and accommodation. The second approach demonstrated is the model-based fault diagnosis system using on-line parameter identification. Besides these model based diagnostic schemes, there are still many failure modes which need to be diagnosed by the heuristic expert knowledge. The heuristic expert knowledge is implemented using a real-time expert system tool called G2 by Gensym Corp. Finally, the distributed diagnostic system requires another level of intelligence to oversee the fault mode reports generated by component fault detectors. The decision making at this level can best be done using a rule-based expert system. This level of expert knowledge is also implemented using G2.
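
    The three-layer flow described above can be sketched schematically. The fragment below is hypothetical and not the NASA LeRC or G2 implementation; the thresholds, fault names, and arbitration rule are invented. It only shows how feature extraction, per-mode detection, and a heuristic arbitration layer can be chained.

        # Schematic three-layer diagnostic sketch: condition monitoring extracts
        # features, detectors flag candidate fault modes, and a small rule layer
        # arbitrates conflicting reports.  All values are invented.

        import statistics

        def condition_monitoring(sensor_window):
            """Layer 1: reduce raw samples to simple features."""
            return {"mean": statistics.fmean(sensor_window),
                    "spread": max(sensor_window) - min(sensor_window)}

        def fault_mode_detectors(features):
            """Layer 2: each detector flags a candidate fault mode."""
            reports = []
            if features["mean"] > 105.0:
                reports.append("pump-overpressure")
            if features["spread"] > 20.0:
                reports.append("sensor-drift")
            return reports

        def diagnostic_expert(reports):
            """Layer 3: heuristic rules resolve conflicting detector reports."""
            if "sensor-drift" in reports and "pump-overpressure" in reports:
                return "suspect the sensor first; validate it before acting on pressure"
            return reports[0] if reports else "nominal"

        window = [98.0, 101.0, 125.0, 99.0, 103.0]
        print(diagnostic_expert(fault_mode_detectors(condition_monitoring(window))))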

  16. The smooth (tractor) operator: insights of knowledge engineering.

    PubMed

    Cullen, Ralph H; Smarr, Cory-Ann; Serrano-Baquero, Daniel; McBride, Sara E; Beer, Jenay M; Rogers, Wendy A

    2012-11-01

    The design of and training for complex systems requires in-depth understanding of task demands imposed on users. In this project, we used the knowledge engineering approach (Bowles et al., 2004) to assess the task of mowing in a citrus grove. Knowledge engineering is divided into four phases: (1) Establish goals. We defined specific goals based on the stakeholders involved. The main goal was to identify operator demands to support improvement of the system. (2) Create a working model of the system. We reviewed product literature, analyzed the system, and conducted expert interviews. (3) Extract knowledge. We interviewed tractor operators to understand their knowledge base. (4) Structure knowledge. We analyzed and organized operator knowledge to inform project goals. We categorized the information and developed diagrams to display the knowledge effectively. This project illustrates the benefits of knowledge engineering as a qualitative research method to inform technology design and training. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  17. Development of a computer-interpretable clinical guideline model for decision support in the differential diagnosis of hyponatremia.

    PubMed

    González-Ferrer, Arturo; Valcárcel, M Ángel; Cuesta, Martín; Cháfer, Joan; Runkle, Isabelle

    2017-07-01

    Hyponatremia is the most common type of electrolyte imbalance, occurring when serum sodium is below threshold levels, typically 135 mmol/L. Electrolyte balance has been identified as one of the most challenging subjects for medical students, but also as one of the most relevant areas to learn about according to physicians and researchers. We present a computer-interpretable guideline (CIG) model that will be used for medical training to learn how to improve the diagnosis of hyponatremia by applying an expert consensus document (ECD). We used the PROForma set of tools to develop the model, using an iterative process involving two knowledge engineers (a computer science Ph.D. and a preventive medicine specialist) and two expert endocrinologists. We also carried out an initial validation of the model and a qualitative post-analysis from the results of a retrospective study (N=65 patients), comparing the consensus diagnosis of two experts with the output of the tool. The model includes over two hundred "for", "against" and "neutral" arguments that are selectively triggered depending on the input value of more than forty patient-state variables. We share the methodology followed for the development process and the initial validation results, which achieved a high ratio of 61/65 agreements with the consensus diagnosis, with a kappa value of K=0.86 for overall agreement and K=0.80 for first-ranked agreement. Hospital care professionals involved in the project showed high expectations of using this tool for training, but the process to follow for a successful diagnosis and application is not trivial, as reported in this manuscript. Secondary benefits of using these tools are associated with improving research knowledge and existing clinical practice guidelines (CPGs) or ECDs. Beyond point-of-care clinical decision support, knowledge-based decision support systems are very attractive as a training tool, to help selected professionals to better understand difficult diseases that are underdiagnosed and/or incorrectly managed. Copyright © 2017 Elsevier B.V. All rights reserved.
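
    For readers unfamiliar with the agreement statistic quoted above, a short worked sketch of Cohen's kappa follows. The counts in the example are invented and are not the study's data; only the formula (observed agreement minus chance agreement, divided by one minus chance agreement) is the point.

        # Cohen's kappa from a square agreement table.  The 2x2 counts below are
        # illustrative only, not the data from the cited study.

        def cohens_kappa(table):
            """table[i][j]: cases placed in category i by the tool and j by the experts."""
            total = sum(sum(row) for row in table)
            observed = sum(table[i][i] for i in range(len(table))) / total
            expected = sum(
                (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
                for i in range(len(table))
            )
            return (observed - expected) / (1 - expected)

        example = [[30, 3],
                   [2, 30]]
        print(round(cohens_kappa(example), 2))   # about 0.85 for these made-up counts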

  18. A Mission in the Desert: Albuquerque District, 1935-1985

    DTIC Science & Technology

    1985-01-01

    Engineers came into New Mexico in 1935 to construct its first project near Tucumcari, the Engineers began to develop a knowledge of the political...agency, as a local unit of the federal government in cases of civil emergency, and as a source of engineering knowledge for Southwest engineering...the magnitude of this book could reach completion without the involvement of many people at every stage of development. The assistance and knowledge

  19. Improve Data Mining and Knowledge Discovery Through the Use of MatLab

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Martin, Dawn (Elliott); Beil, Robert

    2011-01-01

    Data mining is widely used to mine business, engineering, and scientific data. Data mining uses pattern-based queries, searches, or other analyses of one or more electronic databases/datasets in order to discover or locate a predictive pattern or anomaly indicative of system failure, criminal or terrorist activity, etc. There are various algorithms, techniques and methods used to mine data, including neural networks, genetic algorithms, decision trees, the nearest neighbor method, rule induction association analysis, slice and dice, segmentation, and clustering. These algorithms, techniques and methods used to detect patterns in a dataset have been used in the development of numerous open source and commercially available products and technology for data mining. Data mining is best realized when latent information in a large quantity of stored data is discovered. No one technique solves all data mining problems; the challenge is to select algorithms or methods appropriate to strengthen data/text mining and trending within given datasets. In recent years, throughout industry, academia and government agencies, thousands of data systems have been designed and tailored to serve specific engineering and business needs. Many of these systems use databases with relational algebra and structured query language to categorize and retrieve data. In these systems, data analyses are limited and require prior explicit knowledge of metadata and database relations, lacking exploratory data mining and discovery of latent information. This presentation introduces MatLab(R) (MATrix LABoratory), an engineering and scientific data analysis tool, to perform data mining. MatLab was originally intended to perform purely numerical calculations (a glorified calculator). Now, in addition to having hundreds of mathematical functions, it is a programming language with hundreds of built-in standard functions and numerous available toolboxes. MatLab's ease of data processing and visualization and its enormous range of built-in functionalities and toolboxes make it suitable for performing numerical computations and simulations as well as serving as a data mining tool. Engineers and scientists can take advantage of the readily available functions/toolboxes to gain wider insight in their respective data mining experiments.
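
    As one concrete instance of the techniques listed above, clustering can be sketched in a few lines. The example below is written in plain Python rather than MatLab and is purely illustrative; the data and cluster count are invented, and a MatLab user would typically reach for the built-in kmeans function instead.

        # Tiny 1-D k-means sketch, illustrating the kind of exploratory clustering
        # step referred to above.  Data and cluster count are invented.

        import random

        def kmeans_1d(points, k, iterations=20):
            """Assign points to the nearest centre, then recompute centres."""
            centres = random.sample(points, k)
            for _ in range(iterations):
                clusters = {c: [] for c in centres}
                for p in points:
                    nearest = min(centres, key=lambda c: abs(p - c))
                    clusters[nearest].append(p)
                centres = [sum(members) / len(members) if members else centre
                           for centre, members in clusters.items()]
            return sorted(centres)

        readings = [1.1, 0.9, 1.3, 5.2, 4.8, 5.0, 9.7, 10.1, 9.9]
        print(kmeans_1d(readings, 3))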

  20. Improve Data Mining and Knowledge Discovery through the use of MatLab

    NASA Technical Reports Server (NTRS)

    Shaykahian, Gholan Ali; Martin, Dawn Elliott; Beil, Robert

    2011-01-01

    Data mining is widely used to mine business, engineering, and scientific data. Data mining uses pattern-based queries, searches, or other analyses of one or more electronic databases/datasets in order to discover or locate a predictive pattern or anomaly indicative of system failure, criminal or terrorist activity, etc. There are various algorithms, techniques and methods used to mine data, including neural networks, genetic algorithms, decision trees, the nearest neighbor method, rule induction association analysis, slice and dice, segmentation, and clustering. These algorithms, techniques and methods used to detect patterns in a dataset have been used in the development of numerous open source and commercially available products and technology for data mining. Data mining is best realized when latent information in a large quantity of stored data is discovered. No one technique solves all data mining problems; the challenge is to select algorithms or methods appropriate to strengthen data/text mining and trending within given datasets. In recent years, throughout industry, academia and government agencies, thousands of data systems have been designed and tailored to serve specific engineering and business needs. Many of these systems use databases with relational algebra and structured query language to categorize and retrieve data. In these systems, data analyses are limited and require prior explicit knowledge of metadata and database relations, lacking exploratory data mining and discovery of latent information. This presentation introduces MatLab(TradeMark) (MATrix LABoratory), an engineering and scientific data analysis tool, to perform data mining. MatLab was originally intended to perform purely numerical calculations (a glorified calculator). Now, in addition to having hundreds of mathematical functions, it is a programming language with hundreds of built-in standard functions and numerous available toolboxes. MatLab's ease of data processing and visualization and its enormous range of built-in functionalities and toolboxes make it suitable for performing numerical computations and simulations as well as serving as a data mining tool. Engineers and scientists can take advantage of the readily available functions/toolboxes to gain wider insight in their respective data mining experiments.

  1. Product Lifecycle Management and the Quest for Sustainable Space Explorations

    NASA Technical Reports Server (NTRS)

    Caruso, Pamela W.; Dumbacher, Daniel L.

    2010-01-01

    Product Lifecycle Management (PLM) is an outcome of lean thinking to eliminate waste and increase productivity. PLM is inextricably tied to the systems engineering business philosophy, coupled with a methodology by which personnel, processes and practices, and information technology combine to form an architecture platform for product design, development, manufacturing, operations, and decommissioning. In this model, which is being implemented by the Engineering Directorate at the National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center, total lifecycle costs are important variables for critical decision-making. With the ultimate goal to deliver quality products that meet or exceed requirements on time and within budget, PLM is a powerful concept to shape everything from engineering trade studies and testing goals, to integrated vehicle operations and retirement scenarios. This paper will demonstrate how the Engineering Directorate is implementing PLM as part of an overall strategy to deliver safe, reliable, and affordable space exploration solutions. It has been 30 years since the United States fielded the Space Shuttle. The next generation space transportation system requires a paradigm shift such that digital tools and knowledge management, which are central elements of PLM, are used consistently to maximum effect. The outcome is a better use of scarce resources, along with more focus on stakeholder and customer requirements, as a new portfolio of enabling tools becomes second nature to the workforce. This paper will use the design and manufacturing processes, which have transitioned to digital-based activities, to show how PLM supports the comprehensive systems engineering and integration function. It also will go through a launch countdown scenario where an anomaly is detected to show how the virtual vehicle created from paperless processes will help solve technical challenges and improve the likelihood of launching on schedule, with less hands-on labor needed for processing and troubleshooting.

  2. Relating GTE and Knowledge-Based Courseware Engineering: Some Epistemological Issues.

    ERIC Educational Resources Information Center

    De Diana, Italo P. F.; Ladhani, Al-Noor

    1998-01-01

    Discusses GTE (Generic Tutoring Environment) and knowledge-based courseware engineering from an epistemological point of view and suggests some combination of the two approaches. Topics include intelligent tutoring; courseware authoring; application versus acquisition of knowledge; and domain knowledge. (LRW)

  3. Development of Management Metrics for Research and Technology

    NASA Technical Reports Server (NTRS)

    Sheskin, Theodore J.

    2003-01-01

    Professor Ted Sheskin from CSU will be tasked to research and investigate metrics that can be used to determine the technical progress for advanced development and research tasks. These metrics will be implemented in a software environment that hosts engineering design, analysis and management tools to be used to support power system and component research work at GRC. Professor Sheskin is an Industrial Engineer and has been involved in issues related to management of engineering tasks and will use his knowledge from this area to allow extrapolation into the research and technology management area. Over the course of the summer, Professor Sheskin will develop a bibliography of management papers covering current management methods that may be applicable to research management. At the completion of the summer work we expect to have him recommend a metric system to be reviewed prior to implementation in the software environment. This task has been discussed with Professor Sheskin and some review material has already been given to him.

  4. Make Your Workflows Smarter

    NASA Technical Reports Server (NTRS)

    Jones, Corey; Kapatos, Dennis; Skradski, Cory

    2012-01-01

    Do you have workflows with many manual tasks that slow down your business? Or, do you scale back workflows because there are simply too many manual tasks? Basic workflow robots can automate some common tasks, but not everything. This presentation will show how advanced robots called "expression robots" can be set up to perform everything from simple tasks such as moving, creating folders, renaming, changing or creating an attribute, and revising, to more complex tasks like creating a pdf, or even launching a session of Creo Parametric and performing a specific modeling task. Expression robots are able to utilize the Java API and Info*Engine to do almost anything you can imagine! Best of all, these tools are supported by PTC and will work with later releases of Windchill. Limited knowledge of Java, Info*Engine, and XML is required. The attendee will learn what tasks expression robots are capable of performing. The attendee will learn what is involved in setting up an expression robot. The attendee will gain a basic understanding of simple Info*Engine tasks.

  5. Developing Smartphone Apps for Education, Outreach, Science, and Engineering

    NASA Astrophysics Data System (ADS)

    Weatherwax, A. T.; Fitzsimmons, Z.; Czajkowski, J.; Breimer, E.; Hellman, S. B.; Hunter, S.; Dematteo, J.; Savery, T.; Melsert, K.; Sneeringer, J.

    2010-12-01

    The increased popularity of mobile phone apps provides scientists with a new avenue for sharing and distributing data and knowledge with colleagues, while also providing meaningful education and outreach products for consumption by the general public. Our initial development of iPhone and Android apps centered on the distribution of exciting auroral images taken at the South Pole for education and outreach purposes. These portable platforms, with limited resources when compared to computers, presented a unique set of design and implementation challenges that we will discuss in this presentation. For example, the design must account for limited memory, screen size, processing power, and battery life, as well as potentially high data transport costs. Some of these unique requirements created an environment that enabled undergraduate and high-school students to participate in the creation of these apps. Additionally, during development it became apparent that these apps could also serve as data analysis and engineering tools. Our presentation will further discuss our plans to use apps not only for Education and Public Outreach, but also for teaching, science, and engineering.

  6. Prototyping Tool for Web-Based Multiuser Online Role-Playing Game

    NASA Astrophysics Data System (ADS)

    Okamoto, Shusuke; Kamada, Masaru; Yonekura, Tatsuhiro

    This letter proposes a prototyping tool for Web-based Multiuser Online Role-Playing Games (MORPG). The design goal is to make this tool simple and powerful. The tool comprises a GUI editor, a translator and a runtime environment. The GUI editor is used to edit state-transition diagrams, each of which defines the behavior of the fictional characters. The state-transition diagrams are translated into C program code, which plays the role of a game engine in the RPG system. The runtime environment includes PHP, JavaScript with Ajax and HTML, so the prototype system can be played in a standard Web browser such as Firefox, Safari or IE. On a click or key press by a player, the Web browser sends it to the Web server to reflect its consequence on the screens that other players are looking at. Prospective users of this tool include programming novices and schoolchildren. Knowledge or skill in any specific programming language is not required to create state-transition diagrams. Their structure is not only suitable for defining character behavior but also intuitive enough for novices to understand. Therefore, users can easily create a Web-based MORPG system with the tool.
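
    As an illustrative aside, a behavior like the ones defined in those state-transition diagrams can be sketched as a simple transition table. The Python sketch below is hypothetical (the tool itself generates C code, and these states and events are invented), but it shows the kind of lookup a generated game engine performs.

```python
# Minimal, hypothetical sketch of a character behaviour defined as a
# state-transition table, in the spirit of the diagrams the tool edits.
# The actual tool translates such diagrams into C game-engine code;
# the states and events below are invented for illustration.

TRANSITIONS = {
    # (current_state, event) -> (next_state, action)
    ("idle",     "player_near"):   ("greeting", "say_hello"),
    ("greeting", "player_leaves"): ("idle",     "wave_goodbye"),
    ("greeting", "player_talks"):  ("trading",  "open_shop"),
    ("trading",  "player_leaves"): ("idle",     "close_shop"),
}

def step(state, event):
    """Return the next state and the action to perform, or stay put."""
    return TRANSITIONS.get((state, event), (state, None))

if __name__ == "__main__":
    state = "idle"
    for event in ["player_near", "player_talks", "player_leaves"]:
        state, action = step(state, event)
        print(event, "->", state, action)
```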

  7. DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1989-01-01

    This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.

  8. Knowledge engineering for temporal dependency networks as operations procedures. [in space communication

    NASA Technical Reports Server (NTRS)

    Fayyad, Kristina E.; Hill, Randall W., Jr.; Wyatt, E. J.

    1993-01-01

    This paper presents a case study of the knowledge engineering process employed to support the Link Monitor and Control Operator Assistant (LMCOA). The LMCOA is a prototype system which automates the configuration, calibration, test, and operation (referred to as precalibration) of the communications, data processing, metric data, antenna, and other equipment used to support space-ground communications with deep space spacecraft in NASA's Deep Space Network (DSN). The primary knowledge base in the LMCOA is the Temporal Dependency Network (TDN), a directed graph which provides a procedural representation of the precalibration operation. The TDN incorporates precedence, temporal, and state constraints and uses several supporting knowledge bases and data bases. The paper provides a brief background on the DSN, and describes the evolution of the TDN and supporting knowledge bases, the process used for knowledge engineering, and an analysis of the successes and problems of the knowledge engineering effort.
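
    For illustration, the core of a TDN can be thought of as a directed graph whose edges are precedence constraints; any topological order of that graph is a legal execution sequence. The Python sketch below uses invented step names and omits the temporal and state constraints that the real LMCOA knowledge base carries.

```python
# Minimal sketch of a Temporal Dependency Network (TDN) as a directed
# graph of precalibration steps with precedence constraints.  Step names
# are invented for illustration; the real TDN also attaches temporal and
# state constraints to each step.
from graphlib import TopologicalSorter

tdn = {
    "configure_antenna":  set(),
    "configure_receiver": set(),
    "calibrate_receiver": {"configure_receiver"},
    "run_test_signal":    {"configure_antenna", "calibrate_receiver"},
    "start_track":        {"run_test_signal"},
}

# A topological order gives one legal execution sequence that respects
# every precedence constraint.
print(list(TopologicalSorter(tdn).static_order()))
```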

  9. Design and analysis of lifting tool assemblies to lift different engine block

    NASA Astrophysics Data System (ADS)

    Sawant, Arpana; Deshmukh, Nilaj N.; Chauhan, Santosh; Dabhadkar, Mandar; Deore, Rupali

    2017-07-01

    Engine blocks need to be lifted from one place to another while they are being processed. Doing this manually requires considerable effort, and the engine block may be damaged if it is not handled properly. There is a need to design a proper lifting tool that can conveniently lift the engine block and place it at the desired position without any accident or damage to the engine block. In the present study, lifting tool assemblies are designed and analyzed in such a way that they can lift different categories of engine blocks. The lifting tool assembly consists of a lifting plate, lifting ring, cap screws and washers. A parametric model and assembly of the lifting tool are created in the 3D modelling software Creo 2.0, and the analysis is carried out in ANSYS Workbench 16.0. A test block of weight equivalent to that of an engine block is considered for the purpose of analysis. In the preliminary study, without washers, the stresses obtained on the lifting tool exceeded the safety margin. In the present design, washers of appropriate dimensions are used, which brings the stresses on the lifting tool within the safety margin. Analysis is carried out to verify that the tool design meets the safety margin required by ASME BTH-1.

  10. Metabolic Engineering of Oleaginous Yeasts for Production of Fuels and Chemicals

    PubMed Central

    Shi, Shuobo; Zhao, Huimin

    2017-01-01

    Oleaginous yeasts have been increasingly explored for production of chemicals and fuels via metabolic engineering. Particularly, there is a growing interest in using oleaginous yeasts for the synthesis of lipid-related products due to their high lipogenesis capability, robustness, and ability to utilize a variety of substrates. Most of the metabolic engineering studies in oleaginous yeasts have focused on Yarrowia, which already has plenty of genetic engineering tools. However, recent advances in systems biology and synthetic biology have provided new strategies and tools to engineer those oleaginous yeasts that have naturally high lipid accumulation but lack genetic tools, such as Rhodosporidium, Trichosporon, and Lipomyces. This review highlights recent accomplishments in metabolic engineering of oleaginous yeasts and recent advances in the development of genetic engineering tools in oleaginous yeasts within the last 3 years. PMID:29167664

  11. An integrated approach to engineering curricula improvement with multi-objective decision modeling and linear programming

    NASA Astrophysics Data System (ADS)

    Shea, John E.

    The structure of engineering curricula currently in place at most colleges and universities has existed since the early 1950's, and reflects an historical emphasis on a solid foundation in math, science, and engineering science. However, there is often not a close match between elements of the traditional engineering education, and the skill sets that graduates need to possess for success in the industrial environment. Considerable progress has been made to restructure engineering courses and curricula. What is lacking, however, are tools and methodologies that incorporate the many dimensions of college courses, and how they are structured to form a curriculum. If curriculum changes are to be made, the first objective must be to determine what knowledge and skills engineering graduates need to possess. To accomplish this, a set of engineering competencies was developed from existing literature, and used in the development of a comprehensive mail survey of alumni, employers, students and faculty. Respondents proposed some changes to the topics in the curriculum and recommended that work to improve the curriculum be focused on communication, problem solving and people skills. The process of designing a curriculum is similar to engineering design, with requirements that must be met, and objectives that must be optimized. From this similarity came the idea for developing a linear, additive, multi-objective model that identifies the objectives that must be considered when designing a curriculum, and contains the mathematical relationships necessary to quantify the value of a specific alternative. The model incorporates the three primary objectives of engineering topics, skills, and curriculum design principles and uses data from the survey. It was used to design new courses, to evaluate various curricula alternatives, and to conduct sensitivity analysis to better understand their differences. Using the multi-objective model to identify the highest scoring curriculum from a catalog of courses is difficult because of the many factors being considered. To assist this process, the multi-objective model and the curriculum requirements were incorporated in a linear program to select the "optimum" curriculum. The application of this tool was also beneficial in identifying the active constraints that limit curriculum development and content.
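
    As a rough illustration of the optimization step, the sketch below selects courses to maximize a linear, additive value score subject to a credit-hour limit. The course data are invented and the binary selection is relaxed to a linear program, so this is only a simplified stand-in for the model described above.

```python
# Minimal sketch, assuming invented course data: choose courses to
# maximise a linear additive "curriculum value" subject to a credit-hour
# budget, as an LP relaxation of the course-selection problem.
import numpy as np
from scipy.optimize import linprog

values  = np.array([8.0, 6.5, 7.2, 5.0, 9.1])   # model score per course
credits = np.array([3,   3,   4,   2,   4  ])   # credit hours per course
max_credits = 12

# linprog minimises, so negate the values; x_i in [0, 1] is the
# (relaxed) decision to include course i.
res = linprog(c=-values,
              A_ub=[credits], b_ub=[max_credits],
              bounds=[(0, 1)] * len(values),
              method="highs")
print("selected (relaxed):", np.round(res.x, 2), "value:", -res.fun)
```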

  12. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  13. Compound toxicity screening and structure-activity relationship modeling in Escherichia coli.

    PubMed

    Planson, Anne-Gaëlle; Carbonell, Pablo; Paillard, Elodie; Pollet, Nicolas; Faulon, Jean-Loup

    2012-03-01

    Synthetic biology and metabolic engineering are used to develop new strategies for producing valuable compounds, ranging from therapeutics to biofuels, in engineered microorganisms. When developing methods for high-titer production cells, toxicity is an important element to consider. Indeed, the production rate can be limited by toxic intermediates or by the accumulation of byproducts of the heterologous biosynthetic pathway of interest. Conversely, highly toxic molecules are desired when designing antimicrobials. Compound toxicity in bacteria plays a major role in metabolic engineering as well as in the development of new antibacterial agents. Here, we screened a diversified chemical library of 166 compounds for toxicity in Escherichia coli. The dataset was built using a clustering algorithm maximizing the chemical diversity in the library. The resulting assay data were used to develop a toxicity predictor that we used to assess the toxicity of metabolites throughout the metabolome. This new tool for predicting toxicity can thus be used for fine-tuning heterologous expression and can be integrated into a computational framework for metabolic pathway design. Many structure-activity relationship tools have been developed for toxicology studies in eukaryotes [Valerio (2009), Toxicol Appl Pharmacol, 241(3): 356-370]; however, to the best of our knowledge, we present here the first E. coli toxicity prediction web server based on QSAR models (EcoliTox server: http://www.issb.genopole.fr/∼faulon/EcoliTox.php). Copyright © 2011 Wiley Periodicals, Inc.
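
    The following sketch illustrates the general shape of such a structure-activity (QSAR) toxicity predictor, assuming compounds have already been encoded as descriptor vectors. The descriptors, labels, and model choice are placeholders, not the classifier actually used for the EcoliTox server.

```python
# Minimal sketch of a QSAR-style toxicity classifier, assuming each
# compound has already been encoded as a fixed-length descriptor vector.
# The descriptors and labels below are synthetic placeholders, not the
# paper's assay data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((166, 32))          # 166 compounds x 32 descriptors
y = rng.integers(0, 2, size=166)   # 1 = toxic to E. coli, 0 = non-toxic

model = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

# Once trained on real assay data, the model can score untested
# metabolites to flag potentially toxic pathway intermediates.
model.fit(X, y)
print("predicted toxicity:", model.predict(X[:3]))
```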

  14. A Concept for the Inclusion of Analytical and Computational Capability in Existing Systems for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Clinton; Cooper, Anita E.; Powers, W. T.

    2005-01-01

    For approximately two decades, efforts have been sponsored by NASA's Marshall Space Flight Center to make possible high-speed, automated classification and quantification of constituent materials in various harsh environments. MSFC, along with the Air Force/Arnold Engineering Development Center, has led the work, developing and implementing systems that employ principles of emission and absorption spectroscopy to monitor molecular and atomic particulates in gas plasma of rocket engine flow fields. One such system identifies species and quantifies mass loss rates in H2/O2 rocket plumes. Other gases have been examined and the physics of their detection under numerous conditions were made a part of the knowledge base for the MSFC/USAF team. Additionally, efforts are being advanced to hardware encode components of the data analysis tools in order to address real-time operational requirements for health monitoring and management. NASA has a significant investment in these systems, warranting a spiral approach that meshes current tools and experience with technological advancements. This paper addresses current systems - the Optical Plume Anomaly Detector (OPAD) and the Engine Diagnostic Filtering System (EDIFIS) - and discusses what is considered a natural progression: a concept for migrating them towards detection of high energy particles, including neutrons and gamma rays. The proposal outlines system development to date, basic concepts for future advancements, and recommendations for accomplishing them.

  15. Reverse engineering biological networks: applications in immune responses to bio-toxins.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, Anthony A.; Sinclair, Michael B.; Davidson, George S.

    Our aim is to determine the network of events, or the regulatory network, that defines an immune response to a bio-toxin. As a model system, we are studying the T cell regulatory network triggered through tyrosine kinase receptor activation, using a combination of pathway stimulation and time-series microarray experiments. Our approach is composed of five steps: (1) microarray experiments and data error analysis, (2) data clustering, (3) data smoothing and discretization, (4) network reverse engineering, and (5) network dynamics analysis and fingerprint identification. The technological outcome of this study is a suite of experimental protocols and computational tools that reverse engineer regulatory networks given gene expression data. The practical biological outcome of this work is an immune response fingerprint in terms of gene expression levels. Inferring regulatory networks from microarray data is a new field of investigation that is no more than five years old. To the best of our knowledge, this work is the first attempt that integrates experiments, error analyses, data clustering, inference, and network analysis to solve a practical problem. Our systematic approach of counting, enumeration, and sampling networks matching experimental data is new to the field of network reverse engineering. The resulting mathematical analyses and computational tools lead to new results on their own and should be useful to others who analyze and infer networks.

  16. Engineering Knowledge for Assistive Living

    NASA Astrophysics Data System (ADS)

    Chen, Liming; Nugent, Chris

    This paper introduces a knowledge based approach to assistive living in smart homes. It proposes a system architecture that makes use of knowledge in the lifecycle of assistive living. The paper describes ontology based knowledge engineering practices and discusses mechanisms for exploiting knowledge for activity recognition and assistance. It presents system implementation and experiments, and discusses initial results.

  17. Working Smarter Not Harder - Developing a Virtual Subsurface Data Framework for U.S. Energy R&D

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, D.; Bauer, J.; Dehlin, M.; Jones, T. J.; Rowan, C.

    2017-12-01

    The data revolution has resulted in a proliferation of resources that span beyond commercial and social networking domains. Research, scientific, and engineering data resources, including subsurface characterization, modeling, and analytical datasets, are increasingly available through online portals, warehouses, and systems. Data for subsurface systems is still challenging to access, discontinuous, and varies in resolution. However, with the proliferation of online data there are significant opportunities to advance access and knowledge of subsurface systems. The Energy Data eXchange (EDX) is an online platform designed to address research data needs by improving access to energy R&D products through advanced search capabilities. In addition, EDX hosts private, virtualized computational workspaces in support of multi-organizational R&D. These collaborative workspaces allow teams to share working data resources and connect to a growing number of analytical tools to support research efforts. One recent application, a team digital data notebook tool called DataBook, was introduced within EDX workspaces to allow teams to capture contextual and structured data resources. Starting with DOE's subsurface R&D community, the EDX team has been developing DataBook to support scientists and engineers working on subsurface energy research, allowing them to contribute and curate both structured and unstructured data and knowledge about subsurface systems. These resources span petrophysical, geologic, engineering, and geophysical data, as well as interpretations, models, and analyses associated with carbon storage, water, oil, gas, geothermal, induced seismicity, and other subsurface systems, to support the development of a virtual subsurface data framework. The integration of EDX and DataBook allows the two systems to leverage each other's best features, such as the ability to interact with other systems (EarthCube, OpenEI.net, NGDS, etc.) and custom machine learning algorithms that enhance the user experience, making access and connection to relevant subsurface data resources more efficient for research teams to use, analyze, and draw insights from. Ultimately, the public and private resources in EDX seek to make subsurface energy research more efficient, reduce redundancy, and drive innovation.

  18. 'Unknown' proteins and 'orphan' enzymes: the missing half of the engineering parts list--and how to find it.

    PubMed

    Hanson, Andrew D; Pribat, Anne; Waller, Jeffrey C; de Crécy-Lagard, Valérie

    2009-12-14

    Like other forms of engineering, metabolic engineering requires knowledge of the components (the 'parts list') of the target system. Lack of such knowledge impairs both rational engineering design and diagnosis of the reasons for failures; it also poses problems for the related field of metabolic reconstruction, which uses a cell's parts list to recreate its metabolic activities in silico. Despite spectacular progress in genome sequencing, the parts lists for most organisms that we seek to manipulate remain highly incomplete, due to the dual problem of 'unknown' proteins and 'orphan' enzymes. The former are all the proteins deduced from genome sequence that have no known function, and the latter are all the enzymes described in the literature (and often catalogued in the EC database) for which no corresponding gene has been reported. Unknown proteins constitute up to about half of the proteins in prokaryotic genomes, and much more than this in higher plants and animals. Orphan enzymes make up more than a third of the EC database. Attacking the 'missing parts list' problem is accordingly one of the great challenges for post-genomic biology, and a tremendous opportunity to discover new facets of life's machinery. Success will require a co-ordinated community-wide attack, sustained over years. In this attack, comparative genomics is probably the single most effective strategy, for it can reliably predict functions for unknown proteins and genes for orphan enzymes. Furthermore, it is cost-efficient and increasingly straightforward to deploy owing to a proliferation of databases and associated tools.

  19. Overview of NASA MSFC IEC Federated Engineering Collaboration Capability

    NASA Technical Reports Server (NTRS)

    Moushon, Brian; McDuffee, Patrick

    2005-01-01

    The MSFC IEC federated engineering framework is being developed to provide a single collaborative engineering framework across independent NASA centers. The federated approach allows NASA centers to maintain diversity and uniqueness while providing interoperability. These systems are integrated together in a federated framework without compromising individual center capabilities. MSFC IEC's federation framework will have a direct effect on how engineering data is managed across the Agency. The approach is a direct response to Columbia Accident Investigation Board (CAIB) finding F7.4-11, which states that the Space Shuttle Program has a wealth of data tucked away in multiple databases without a convenient way to integrate and use the data for management, engineering, or safety decisions. IEC's federated capability is further supported by OneNASA recommendation 6, which identifies the need to enhance cross-Agency collaboration by putting in place common engineering and collaborative tools and databases, processes, and knowledge-sharing structures. MSFC's IEC federated framework is loosely connected to other engineering applications and can provide users with the integration needed to achieve an Agency view of the entire product definition and development process, while allowing work to be distributed across NASA centers and contractors. The IEC DDMS federation framework eliminates the need to develop a single, enterprise-wide data model, a goal that is very difficult to achieve when a common data model must be shared between NASA centers and contractors.

  20. CLIPS: A tool for corn disease diagnostic system and an aid to neural network for automated knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Wu, Cathy; Taylor, Pam; Whitson, George; Smith, Cathy

    1990-01-01

    This paper describes the building of a corn disease diagnostic expert system using CLIPS, and the development of a neural expert system using the fact representation method of CLIPS for automated knowledge acquisition. The CLIPS corn expert system diagnoses 21 diseases from 52 symptoms and signs with certainty factors. CLIPS has several unique features. It allows the facts in rules to be broken down into object-attribute-value (OAV) triples, allows rule grouping, and fires rules based on pattern matching. These features, combined with the chained inference engine, result in a natural user query system and speedy execution. In order to develop a method for automated knowledge acquisition, an Artificial Neural Expert System (ANES) was developed by a direct mapping from the CLIPS system. The ANES corn expert system uses the same OAV triples as the CLIPS system for its facts. The LHS and RHS facts of the CLIPS rules are mapped into the input and output layers of the ANES, respectively, and the inference engine of the rules is embedded in the hidden layer. The fact representation by OAV triples gives a natural grouping of the rules. These features allow the ANES system to automate rule generation, and make it efficient to execute and easy to expand for a large and complex domain.
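
    The mapping idea can be illustrated with a few lines of Python: observed symptom OAV triples are one-hot encoded against a fixed vocabulary to form the network's input layer, with diseases forming the output layer. The triples below are invented placeholders, not entries from the corn knowledge base.

```python
# Minimal sketch of the OAV-triple encoding idea: symptom facts become
# a binary input vector and diseases a binary output vector, so the
# rules of an expert system can be re-expressed as a trainable network.
# The triples below are invented placeholders.
symptom_oav = [("leaf", "spot_colour", "gray"),
               ("leaf", "lesion_shape", "oval"),
               ("stalk", "rot", "present")]
diseases = ["gray_leaf_spot", "stalk_rot"]

def encode(observed, vocabulary):
    """One-hot encode the observed OAV triples against a fixed vocabulary."""
    return [1 if triple in observed else 0 for triple in vocabulary]

observed = {("leaf", "spot_colour", "gray"), ("leaf", "lesion_shape", "oval")}
print(encode(observed, symptom_oav))   # e.g. [1, 1, 0] feeds the input layer
```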

  1. Milestones in Software Engineering and Knowledge Engineering History: A Comparative Review

    PubMed Central

    del Águila, Isabel M.; Palma, José; Túnez, Samuel

    2014-01-01

    We present a review of the historical evolution of software engineering, intertwining it with the history of knowledge engineering because “those who cannot remember the past are condemned to repeat it.” This retrospective represents a further step toward understanding the current state of both types of engineering; history also has positive experiences, some of which we would like to remember and to repeat. The two types of engineering had parallel and divergent evolutions but followed a similar pattern. We also define a set of milestones that represent a convergence or divergence of the software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help the evolution of the other. PMID:24624046

  2. Milestones in software engineering and knowledge engineering history: a comparative review.

    PubMed

    del Águila, Isabel M; Palma, José; Túnez, Samuel

    2014-01-01

    We present a review of the historical evolution of software engineering, intertwining it with the history of knowledge engineering because "those who cannot remember the past are condemned to repeat it." This retrospective represents a further step toward understanding the current state of both types of engineering; history also has positive experiences, some of which we would like to remember and to repeat. The two types of engineering had parallel and divergent evolutions but followed a similar pattern. We also define a set of milestones that represent a convergence or divergence of the software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help the evolution of the other.

  3. Identifying and prioritizing the tools/techniques of knowledge management based on the Asian Productivity Organization Model (APO) to use in hospitals.

    PubMed

    Khajouei, Hamid; Khajouei, Reza

    2017-12-01

    Appropriate knowledge, correct information, and relevant data are vital in medical diagnosis and treatment systems. Knowledge Management (KM), through its tools/techniques, provides a pertinent framework for decision-making in healthcare systems. The objective of this study was to identify and prioritize the KM tools/techniques that apply to the hospital setting. This is a descriptive survey study. Data were collected using a researcher-made questionnaire that was developed based on experts' opinions to select the appropriate tools/techniques from the 26 tools/techniques of the Asian Productivity Organization (APO) model. Questions were categorized into five steps of KM (identifying, creating, storing, sharing, and applying the knowledge) according to this model. The study population consisted of middle and senior managers of hospitals and managing directors of the Vice-Chancellor for Curative Affairs at Kerman University of Medical Sciences in Kerman, Iran. The data were analyzed in SPSS v.19 using the one-sample t-test. Twelve out of the 26 tools/techniques of the APO model were identified as tools applicable in hospitals. "Knowledge café" and "APO knowledge management assessment tool", with respective means of 4.23 and 3.7, were the most and the least applicable tools in the knowledge identification step. "Mentor-mentee scheme" and "voice and Voice over Internet Protocol (VOIP)", with respective means of 4.20 and 3.52, were the most and the least applicable tools/techniques in the knowledge creation step. "Knowledge café" and "voice and VOIP", with respective means of 3.85 and 3.42, were the most and the least applicable tools/techniques in the knowledge storage step. "Peer assist" and "voice and VOIP", with respective means of 4.14 and 3.38, were the most and the least applicable tools/techniques in the knowledge sharing step. Finally, "knowledge worker competency plan" and "knowledge portal", with respective means of 4.38 and 3.85, were the most and the least applicable tools/techniques in the knowledge application step. The results showed that 12 out of the 26 tools in the APO model are appropriate for hospitals, of which 11 are significantly applicable and "storytelling" is marginally applicable. In this study, the preferred tools/techniques for implementation of each of the five KM steps in hospitals are introduced. Copyright © 2017 Elsevier B.V. All rights reserved.
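
    The statistical step reported above can be sketched as follows, assuming (as a placeholder) 5-point Likert ratings tested against the scale midpoint of 3 with a one-sample t-test; the ratings shown are synthetic.

```python
# Minimal sketch of the analysis style reported above: a one-sample
# t-test on Likert ratings of a KM tool.  The 5-point scale and its
# midpoint of 3 are assumptions, and the ratings are synthetic.
import numpy as np
from scipy.stats import ttest_1samp

ratings = np.array([4, 5, 4, 3, 5, 4, 4, 3, 5, 4])  # e.g. "knowledge cafe"
t, p = ttest_1samp(ratings, popmean=3)
print(f"mean={ratings.mean():.2f}, t={t:.2f}, p={p:.4f}")
# A significantly positive t suggests the tool is rated as applicable.
```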

  4. Mechanical knowledge does matter to tool use even when assessed with a non-production task: Evidence from left brain-damaged patients.

    PubMed

    Lesourd, Mathieu; Budriesi, Carla; Osiurak, François; Nichelli, Paolo F; Bartolo, Angela

    2017-12-20

    In the literature on apraxia of tool use, it is now accepted that using familiar tools requires semantic and mechanical knowledge. However, mechanical knowledge is nearly always assessed with production tasks, so one may assume that mechanical knowledge and familiar tool use are associated only because of their common motor mechanisms. This notion may be challenged by demonstrating that familiar tool use depends on an alternative tool selection task assessing mechanical knowledge, where alternative uses of tools are assumed according to their physical properties but where actual use of tools is not needed. We tested 21 left brain-damaged patients and 21 matched controls with familiar tool use tasks (pantomime and single tool use), semantic tasks and an alternative tool selection task. The alternative tool selection task accounted for a large amount of variance in the single tool use task and was the best predictor among all the semantic tasks. Concerning the pantomime of tool use task, group and individual results suggested that the integrity of the semantic system and preserved mechanical knowledge are neither necessary nor sufficient to produce pantomimes. These results corroborate the idea that mechanical knowledge is essential when we use tools, even when tasks assessing mechanical knowledge do not require the production of any motor action. Our results also confirm the value of pantomime of tool use, which can be considered as a complex activity involving several cognitive abilities (e.g., communicative skills) rather than the activation of gesture engrams. © 2017 The British Psychological Society.

  5. Failure analysis of a tool steel torque shaft

    NASA Technical Reports Server (NTRS)

    Reagan, J. R.

    1981-01-01

    A low design load drive shaft used to deliver power from an experimental exhaust heat recovery system to the crankshaft of an experimental diesel truck engine failed during highway testing. An independent testing laboratory analyzed the failure by routine metallography and attributed it to fatigue induced by a banded microstructure. Visual examination by NASA of the failed shaft, together with knowledge of the torsional load that it carried, pointed to a 100 percent ductile failure with no evidence of fatigue. Scanning electron microscopy confirmed this. Torsional test specimens were produced from pieces of the failed shaft, and torsional overload testing produced failures identical to that which had occurred in the truck engine. This pointed to a failure caused by a high overload; although the microstructure was defective, it was not the cause of the failure.

  6. Integrated System Health Management (ISHM) Implementation in Rocket Engine Testing

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Morris, Jon; Turowski, Mark; Franzl, Richard; Walker, Mark; Kapadia, Ravi; Venkatesh, Meera

    2010-01-01

    A pilot operational ISHM capability has been implemented for the E-2 Rocket Engine Test Stand (RETS) and a Chemical Steam Generator (CSG) test article at NASA Stennis Space Center. The implementation currently includes an ISHM computer and a large display in the control room. The paper will address the overall approach, tools, and requirements. It will also address the infrastructure and architecture. Specific anomaly detection algorithms will be discussed regarding leak detection and diagnostics, valve validation, and sensor validation. It will also describe development and use of a Health Assessment Database System (HADS) as a repository for measurements, health, configuration, and knowledge related to a system with ISHM capability. It will conclude with a discussion of user interfaces, and a description of the operation of the ISHM system prior, during, and after testing.

  7. Mathematical modelling in engineering: an alternative way to teach Linear Algebra

    NASA Astrophysics Data System (ADS)

    Domínguez-García, S.; García-Planas, M. I.; Taberna, J.

    2016-10-01

    Technological advances require that basic science courses for engineering, including Linear Algebra, emphasize the development of mathematical strengths associated with modelling and interpretation of results, which are not limited to calculus abilities. Based on this consideration, we have proposed project-based learning, a dynamic classroom approach in which students model real-world problems and in turn gain a deeper knowledge of the Linear Algebra subject. Considering that most students are digital natives, we use the e-portfolio as a tool of communication between students and teachers, besides being a good place to make the work visible. In this article, we present an overview of the design and implementation of project-based learning for a Linear Algebra course taught during 2014-2015 at the ETSEIB of the Universitat Politècnica de Catalunya (UPC).

  8. ANSYS UIDL-Based CAE Development of Axial Support System for Optical Mirror

    NASA Astrophysics Data System (ADS)

    Yang, De-Hua; Shao, Liang

    2008-09-01

    The whiffle-tree type axial support mechanism is widely adopted for relatively large optical mirrors. Based on the secondary development tools offered by the commonly used Finite Element Analysis (FEA) software ANSYS, the ANSYS Parametric Design Language (APDL) is used to create a parameter-driven FEA model of the mirror, and the ANSYS User Interface Design Language (UIDL) is used to generate a custom interactive menu. In this way, a relatively independent, dedicated Computer Aided Engineering (CAE) module is embedded in ANSYS for the calculation and optimization of the axial whiffle-tree support of optical mirrors. An example is also described to illustrate the intuitive and effective use of the dedicated module, which boosts work efficiency and reduces the engineering knowledge demanded of the user. The philosophy of a secondary-developed special module within commonly used software also suggests itself for product development in other industries.

  9. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals, each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design, testing, and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  10. Investigating the Extent That an Integrative Learning Module Broadens the Perception of First-Year Students about the Engineering Profession

    ERIC Educational Resources Information Center

    Singer, Kerri Patrick; Foutz, Tim; Navarro, Maria; Thompson, Sidney

    2015-01-01

    Engineers today need both engineering knowledge and social science knowledge to solve complex problems. However, most people have a traditional view of engineering as a field dominated by math and science foci, with little social consequence. This study examined and compared perceptions about engineering from Freshmen taking three different First…

  11. Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige

    2005-01-01

    We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.

  12. Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige

    2006-01-01

    We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.

  13. When one model is not enough: combining epistemic tools in systems biology.

    PubMed

    Green, Sara

    2013-06-01

    In recent years, the philosophical focus of the modeling literature has shifted from descriptions of general properties of models to an interest in different model functions. It has been argued that the diversity of models and their correspondingly different epistemic goals are important for developing intelligible scientific theories (Leonelli, 2007; Levins, 2006). However, more knowledge is needed on how a combination of different epistemic means can generate and stabilize new entities in science. This paper will draw on Rheinberger's practice-oriented account of knowledge production. The conceptual repertoire of Rheinberger's historical epistemology offers important insights for an analysis of the modelling practice. I illustrate this with a case study on network modeling in systems biology where engineering approaches are applied to the study of biological systems. I shall argue that the use of multiple representational means is an essential part of the dynamic of knowledge generation. It is because of, rather than in spite of, the diversity of constraints of different models that the interlocking use of different epistemic means creates a potential for knowledge production. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. cellVIEW: a Tool for Illustrative and Multi-Scale Rendering of Large Biomolecular Datasets

    PubMed Central

    Le Muzic, Mathieu; Autin, Ludovic; Parulek, Julius; Viola, Ivan

    2017-01-01

    In this article we introduce cellVIEW, a new system to interactively visualize large biomolecular datasets at the atomic level. Our tool is unique and has been specifically designed to match the ambitions of our domain experts to model and interactively visualize structures comprising several billion atoms. The cellVIEW system integrates acceleration techniques to allow for real-time graphics performance at a 60 Hz display rate on datasets representing large viruses and bacterial organisms. Inspired by the work of scientific illustrators, we propose a level-of-detail scheme whose purpose is two-fold: accelerating the rendering and reducing visual clutter. The main part of our datasets is made up of macromolecules, but it also comprises nucleic acid strands, which are stored as sets of control points. For that specific case, we extend our rendering method to support the dynamic generation of DNA strands directly on the GPU. It is noteworthy that our tool has been implemented directly inside a game engine. We chose to rely on a third-party engine to reduce the software development workload and to make bleeding-edge graphics techniques more accessible to end-users. To our knowledge, cellVIEW is the only suitable solution for interactive visualization of large biomolecular landscapes at the atomic level, and it is freely available to use and extend. PMID:29291131

  15. ELISA, a demonstrator environment for information systems architecture design

    NASA Technical Reports Server (NTRS)

    Panem, Chantal

    1994-01-01

    This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life cycle activity. In late 1992, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the provision of Commercial Off-The-Shelf tools. ELISA was delivered to CNES in January 1994 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). Further evolutions are under way.

  16. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572
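
    One of the areas covered, metabolic flux analysis, is commonly formulated as flux balance analysis: a linear program that maximizes an objective flux subject to steady-state mass balance. The sketch below uses a toy stoichiometric matrix; real reconstructions involve thousands of reactions.

```python
# Minimal sketch of flux balance analysis (FBA), one of the metabolic
# flux analysis techniques the review covers.  The toy stoichiometric
# matrix and bounds are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Rows = metabolites A, B; columns = reactions v1..v4
# (uptake of A, A -> B, B -> biomass, A -> byproduct)
S = np.array([[1, -1,  0, -1],
              [0,  1, -1,  0]])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]
c = np.array([0, 0, -1, 0])          # maximise the biomass flux v3

# Steady state: S . v = 0, within the given flux bounds.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "biomass:", -res.fun)
```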

  17. 3D-liquid chromatography as a complex mixture characterization tool for knowledge-based downstream process development.

    PubMed

    Hanke, Alexander T; Tsintavi, Eleni; Ramirez Vazquez, Maria Del Pilar; van der Wielen, Luuk A M; Verhaert, Peter D E M; Eppink, Michel H M; van de Sandt, Emile J A X; Ottens, Marcel

    2016-09-01

    Knowledge-based development of chromatographic separation processes requires efficient techniques to determine the physicochemical properties of the product and of the impurities to be removed. These characterization techniques are usually divided into approaches that determine molecular properties, such as charge, hydrophobicity and size, or molecular interactions with auxiliary materials, commonly in the form of adsorption isotherms. In this study we demonstrate the application of a three-dimensional liquid chromatography approach to a clarified cell homogenate containing a therapeutic enzyme. Each separation dimension determines a molecular property relevant to the chromatographic behavior of each component. Matching the peaks across the different separation dimensions and against a high-resolution reference chromatogram allows the determined parameters to be assigned to pseudo-components and the most promising technique for the removal of each impurity to be identified. More detailed process design using mechanistic models requires isotherm parameters. For this purpose, the second dimension consists of multiple linear gradient separations on columns in a high-throughput-screening-compatible format, which allow regression of isotherm parameters with an average standard error of 8%. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1283-1291, 2016. © 2016 American Institute of Chemical Engineers.
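
    Isotherm regression of this kind can be sketched, under simplifying assumptions, as a nonlinear least-squares fit. The example below fits a Langmuir isotherm to synthetic equilibrium data; the study itself regresses parameters from multiple linear-gradient runs, which is not reproduced here.

```python
# Minimal sketch, assuming a Langmuir-type adsorption isotherm fitted to
# synthetic equilibrium data (not the study's gradient-based regression).
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k):
    """Bound concentration q as a function of liquid concentration c."""
    return q_max * k * c / (1.0 + k * c)

c = np.linspace(0.1, 5.0, 10)                      # liquid concentration
q = langmuir(c, 2.0, 1.5) + 0.02 * np.random.default_rng(1).normal(size=c.size)

(q_max, k), cov = curve_fit(langmuir, c, q, p0=[1.0, 1.0])
rel_err = np.sqrt(np.diag(cov)) / np.array([q_max, k]) * 100
print(f"q_max={q_max:.2f}, k={k:.2f}, relative std errors (%): {rel_err.round(1)}")
```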

  18. BioArtificial polymers

    NASA Astrophysics Data System (ADS)

    Szałata, Kamila; Gumi, Tania

    2017-07-01

    Nowadays, polymer science has an impact on practically all areas of life. Countless benefits coming from the use of materials with high mechanical and chemical resistance, a variety of functionalities and the potential for modification drive the development of new application fields. Novel approaches that combine these synthetic substances with biomolecules lead to multifunctional hybrid conjugates which merge the bioactivity of the natural component with the outstanding properties of the artificial polymer. Over the decades, immense progress in the bioartificial composites domain has brought a high level of knowledge in terms of engineering natural-like systems, leading to diverse strategies of biomolecule immobilization. Together with the different available options, including covalent and noncovalent attachment, come various challenges, related mainly to maintaining the biological activity of the fixed molecules. Even though the number of applications that achieve commercial status is still not substantial, it is expanding continuously in disciplines like "smart materials," biosensors, delivery systems, nanoreactors and many others. The large number of remarkable developments reported in the literature presents the potential of bioartificial conjugates as fabrics with a highly controllable structure and multiple functionalities, serving as a powerful nanotechnological tool. This novel approach brings biologists, chemists and engineers closer together; by sharing their effort and complementing each other's knowledge, they can revolutionize the field of bioartificial polymer science.

  19. Selection platforms for directed evolution in synthetic biology

    PubMed Central

    Tizei, Pedro A.G.; Csibra, Eszter; Torres, Leticia; Pinheiro, Vitor B.

    2016-01-01

    Life on Earth is incredibly diverse. Yet, underneath that diversity, there are a number of constants and highly conserved processes: all life is based on DNA and RNA; the genetic code is universal; biology is limited to a small subset of potential chemistries. A vast amount of knowledge has been accrued through describing and characterizing enzymes, biological processes and organisms. Nevertheless, much remains to be understood about the natural world. One of the goals in Synthetic Biology is to recapitulate biological complexity from simple systems made from biological molecules–gaining a deeper understanding of life in the process. Directed evolution is a powerful tool in Synthetic Biology, able to bypass gaps in knowledge and capable of engineering even the most highly conserved biological processes. It encompasses a range of methodologies to create variation in a population and to select individual variants with the desired function–be it a ligand, enzyme, pathway or even whole organisms. Here, we present some of the basic frameworks that underpin all evolution platforms and review some of the recent contributions from directed evolution to synthetic biology, in particular methods that have been used to engineer the Central Dogma and the genetic code. PMID:27528765

  20. Selection platforms for directed evolution in synthetic biology.

    PubMed

    Tizei, Pedro A G; Csibra, Eszter; Torres, Leticia; Pinheiro, Vitor B

    2016-08-15

    Life on Earth is incredibly diverse. Yet, underneath that diversity, there are a number of constants and highly conserved processes: all life is based on DNA and RNA; the genetic code is universal; biology is limited to a small subset of potential chemistries. A vast amount of knowledge has been accrued through describing and characterizing enzymes, biological processes and organisms. Nevertheless, much remains to be understood about the natural world. One of the goals in Synthetic Biology is to recapitulate biological complexity from simple systems made from biological molecules-gaining a deeper understanding of life in the process. Directed evolution is a powerful tool in Synthetic Biology, able to bypass gaps in knowledge and capable of engineering even the most highly conserved biological processes. It encompasses a range of methodologies to create variation in a population and to select individual variants with the desired function-be it a ligand, enzyme, pathway or even whole organisms. Here, we present some of the basic frameworks that underpin all evolution platforms and review some of the recent contributions from directed evolution to synthetic biology, in particular methods that have been used to engineer the Central Dogma and the genetic code. © 2016 The Author(s).

  1. M&C Domain Map Maker: an environment complementing MDE with M&C knowledge and ensuring solution completeness

    NASA Astrophysics Data System (ADS)

    Patwari, Puneet; Choudhury, Subhrojyoti R.; Banerjee, Amar; Swaminathan, N.; Pandey, Shreya

    2016-07-01

    Model Driven Engineering (MDE) as a key driver to reduce development cost of M&C systems is beginning to find acceptance across scientific instruments such as Radio Telescopes and Nuclear Reactors. Such projects are adopting it to reduce time to integrate, test and simulate their individual controllers and increase reusability and traceability in the process. The creation and maintenance of models is still a significant challenge to realizing MDE benefits. Creating domain-specific modelling environments reduces the barriers, and we have been working along these lines, creating a domain-specific language and environment based on an M&C knowledge model. However, large projects involve several such domains, and there is still a need to interconnect the domain models, in order to ensure modelling completeness. This paper presents a knowledge-centric approach to doing that, by creating a generic system model that underlies the individual domain knowledge models. We present our vision for M&C Domain Map Maker, a set of processes and tools that enables explication of domain knowledge in terms of domain models with mutual consistency relationships to aid MDE.

  2. Knowledge-based personalized search engine for the Web-based Human Musculoskeletal System Resources (HMSR) in biomechanics.

    PubMed

    Dao, Tien Tuan; Hoang, Tuan Nha; Ta, Xuan Hien; Tho, Marie Christine Ho Ba

    2013-02-01

    Human musculoskeletal system resources are valuable for learning and medical purposes. Internet-based information from conventional search engines such as Google or Yahoo cannot respond to the need for useful, accurate, reliable and good-quality human musculoskeletal resources related to medical processes, pathological knowledge and practical expertise. In the present work, an advanced knowledge-based personalized search engine was developed. Our search engine was based on a client-server, multi-layer, multi-agent architecture and the principles of semantic web services to acquire accurate and reliable HMSR information dynamically through a semantic processing and visualization approach. A security-enhanced mechanism was applied to protect the medical information. A multi-agent crawler was implemented to develop a content-based database of HMSR information. A new semantic-based PageRank score, with related mathematical formulas, was also defined and implemented. As a result, semantic web service descriptions were presented in OWL, WSDL and OWL-S formats. Operational scenarios with related web-based interfaces for personal computers and mobile devices were presented and analyzed. A functional comparison between our knowledge-based search engine, a conventional search engine and a semantic search engine showed the originality and the robustness of our knowledge-based personalized search engine. In fact, our knowledge-based personalized search engine allows different users, such as orthopedic patients and experts, healthcare system managers or medical students, to access useful, accurate, reliable and good-quality HMSR information remotely for their learning and medical purposes. Copyright © 2012 Elsevier Inc. All rights reserved.
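
    The flavor of a semantics-weighted PageRank can be sketched as a standard power iteration whose teleportation term is biased by each page's semantic relevance to the query. The link graph, relevance scores, and weighting below are invented for illustration; the paper's actual formula is not given here.

```python
# Minimal sketch of a semantics-weighted PageRank, in the spirit of the
# "semantic-based PageRank score" mentioned above.  The link graph and
# relevance scores are invented; this is not the paper's formula.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # page -> outgoing links
relevance = np.array([0.9, 0.4, 0.8, 0.1])    # semantic match to the query
n, d = 4, 0.85

rank = np.ones(n) / n
for _ in range(50):
    new = np.zeros(n)
    for page, outs in links.items():
        share = rank[page] / len(outs)
        for target in outs:
            new[target] += share
    # damping term biased by semantic relevance instead of a uniform jump
    rank = (1 - d) * relevance / relevance.sum() + d * new
print(np.round(rank, 3))
```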

  3. Relationship of Prior Knowledge and Working Engineers' Learning Preferences: Implications for Designing Effective Instruction

    ERIC Educational Resources Information Center

    Baukal, Charles E.; Ausburn, Lynna J.

    2017-01-01

    Continuing engineering education (CEE) is important to ensure engineers maintain proficiency over the life of their careers. However, relatively few studies have examined designing effective training for working engineers. Research has indicated that both learner instructional preferences and prior knowledge can impact the learning process, but it…

  4. A Survey of Cooperative Engineering Education. Bulletin, 1949, No. 15

    ERIC Educational Resources Information Center

    Armsby, Henry H.

    1949-01-01

    Engineers are increasingly being called to occupy positions of leadership in which they need knowledge of society and of social processes sometimes to a greater degree than they need engineering skill. Many educators are deeply concerned over the question of how engineering students may acquire this knowledge in addition to the technical knowledge…

  5. A scalable architecture for incremental specification and maintenance of procedural and declarative clinical decision-support knowledge.

    PubMed

    Hatsek, Avner; Shahar, Yuval; Taieb-Maimon, Meirav; Shalom, Erez; Klimov, Denis; Lunenfeld, Eitan

    2010-01-01

    Clinical guidelines have been shown to improve the quality of medical care and to reduce its costs. However, most guidelines exist in a free-text representation and, without automation, are not sufficiently accessible to clinicians at the point of care. A prerequisite for automated guideline application is a machine-comprehensible representation of the guidelines. In this study, we designed and implemented a scalable architecture to support medical experts and knowledge engineers in specifying and maintaining the procedural and declarative aspects of clinical guideline knowledge, resulting in a machine-comprehensible representation. The new framework significantly extends our previous work on the Digital electronic Guidelines Library (DeGeL). The current study designed and implemented a graphical framework for the specification of declarative and procedural clinical knowledge, Gesher. We performed three different experiments to evaluate the functionality and usability of the major aspects of the new framework: specification of procedural clinical knowledge, specification of declarative clinical knowledge, and exploration of a given clinical guideline. The subjects included clinicians and knowledge engineers (overall, 27 participants). The evaluations indicated high levels of completeness and correctness of the guideline specification process by both the clinicians and the knowledge engineers, although the best results, in the case of declarative-knowledge specification, were achieved by teams comprising a clinician and a knowledge engineer. The usability scores were high as well, although the clinicians' assessment was significantly lower than that of the knowledge engineers.
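
    Gesher's internal representation is not described in the abstract. The distinction it targets, declarative knowledge (concepts and their definitions) versus procedural knowledge (ordered plan steps that reference those concepts), can nevertheless be sketched with a toy structure; the guideline content and the completeness check below are hypothetical.

```python
# Toy split of a clinical guideline into declarative and procedural parts.
# The content is invented; it only shows the kind of structure a
# specification tool has to keep mutually consistent.

declarative = {
    # concept name -> definition used by the procedural plan
    "high_fever": lambda temp_c: temp_c >= 39.0,
}

procedural = [
    # ordered plan steps; "condition" must name a declarative concept
    {"step": "measure temperature", "condition": None},
    {"step": "administer antipyretic", "condition": "high_fever"},
]

def undefined_concepts(declarative, procedural):
    # completeness check: every condition referenced by a step must be defined
    return [s["step"] for s in procedural
            if s["condition"] and s["condition"] not in declarative]

print(undefined_concepts(declarative, procedural))   # [] -> plan is grounded
```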

  6. NASA/DoD Aerospace Knowledge Diffusion Research Project. Paper 31: The information-seeking behavior of engineers

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.

    1993-01-01

    Engineers are an extraordinarily diverse group of professionals, but an attribute common to all engineers is their use of information. Engineering can be conceptualized as an information processing system that must deal with work-related uncertainty through patterns of technical communications. Throughout the process, data, information, and tacit knowledge are being acquired, produced, transferred, and utilized. While acknowledging that other models exist, we have chosen to view the information-seeking behavior of engineers within a conceptual framework of the engineer as an information processor. This article uses the chosen framework to discuss information-seeking behavior of engineers, reviewing selected literature and empirical studies from library and information science, management, communications, and sociology. The article concludes by proposing a research agenda designed to extend our current, limited knowledge of the way engineers process information.

  7. Biological knowledge bases using Wikis: combining the flexibility of Wikis with the structure of databases.

    PubMed

    Brohée, Sylvain; Barriot, Roland; Moreau, Yves

    2010-09-01

    In recent years, the number of knowledge bases developed using Wiki technology has exploded. Unfortunately, next to their numerous advantages, classical Wikis present a critical limitation: the invaluable knowledge they gather is represented as free text, which hinders its computational exploitation. This is in sharp contrast with the current practice for biological databases, where the data are made available in a structured way. Here, we present WikiOpener, an extension for the classical MediaWiki engine that augments Wiki pages by allowing on-the-fly querying and formatting of resources external to the Wiki. Those resources may provide data extracted from databases or DAS tracks, or even results returned by local or remote bioinformatics analysis tools. This also implies that structured data can be edited via dedicated forms. Hence, this generic resource combines the structure of biological databases with the flexibility of collaborative Wikis. The source code and its documentation are freely available on the MediaWiki website: http://www.mediawiki.org/wiki/Extension:WikiOpener.
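
    The extension itself is implemented for the MediaWiki engine, but the core mechanism, querying an external resource on the fly and rendering the result as structured wiki markup, can be sketched in Python. The data source and the table layout below are assumptions for illustration, not the actual extension API.

```python
# Conceptual sketch: pull structured records from an external resource and
# render them as a wiki table, the way a page-level extension hook might.
# fetch_records() is a stand-in for a database, DAS track, or analysis tool.

def fetch_records(gene):
    # hypothetical external lookup; a real extension would call a web service
    return [{"id": "P1", "organism": "E. coli"},
            {"id": "P2", "organism": "S. cerevisiae"}]

def render_wiki_table(records):
    lines = ['{| class="wikitable"', "! id !! organism"]
    for r in records:
        lines.append("|-")
        lines.append(f"| {r['id']} || {r['organism']}")
    lines.append("|}")
    return "\n".join(lines)

print(render_wiki_table(fetch_records("recA")))
```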

  8. A cross-disciplinary response to improve test activities: The corporate memory capitalization in Ariane4 test domain

    NASA Technical Reports Server (NTRS)

    Vo, Dinh Phuoc; Soler, Christian; Aussenac, N.; Macchion, D.

    1993-01-01

    The Assembly, Integration, Test, and Validation (AIT/AIV) of the Ariane4 Vehicle Equipment Bay has been carried out at the Matra Marconi Space (MMS) site in Toulouse for several years. In this activity, incident interpretation requires a great deal of knowledge from different domains. When complex faults occur, particularly those appearing during overall control tests, experts from various domains (EGSE, software, on-board equipment) have to join investigation sessions. An assistance tool for identifying faulty equipment would therefore improve the efficiency of diagnosis and the overall productivity of test activities. As a solution, the Aramiihs laboratory proposed considering a knowledge-based system intended to assist the tester in diagnosis. This knowledge-based system is, in fact, a short-term achievement of a long-term goal: the capitalization of corporate memory in the Ariane4 test domain. Aramiihs is a research unit where engineers from MMS and researchers from IRIT-CNRS cooperate on problems concerning new types of man-system interaction.

  9. High productivity machining of holes in Inconel 718 with SiAlON tools

    NASA Astrophysics Data System (ADS)

    Agirreurreta, Aitor Arruti; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde

    2016-10-01

    Inconel 718 is often employed in aerospace engines and power-generation turbines. Numerous studies have demonstrated enhanced productivity when turning with ceramic tools compared to carbide ones; however, there is considerably less information regarding milling. Moreover, nothing has been published about machining holes with this type of tool. Additional research on different machining techniques, such as circular ramping, is critical to extending the productivity improvements that ceramics can offer. In this work, a 3D model of the machining process and a number of experiments with SiAlON round inserts have been carried out in order to evaluate the effect of cutting speed and pitch on tool wear and chip generation. The results of this analysis show that three different types of chips are generated and that there are three potential wear zones. Top slice wear is identified as the most critical wear type, followed by notch wear as a secondary wear mechanism. Flank wear and adhesion are also found in most of the tests.

  10. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on a theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction with search engines, search engines as an information…

  11. Knowledge management in the engineering design environment

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    2006-01-01

    The Aerospace and Defense industry is experiencing an increasing loss of knowledge through workforce reductions associated with business consolidation and retirement of senior personnel. Significant effort is being placed on process definition as part of ISO certification and, more recently, CMMI certification. The process knowledge in these efforts represents the simplest of engineering knowledge and many organizations are trying to get senior engineers to write more significant guidelines, best practices and design manuals. A new generation of design software, known as Product Lifecycle Management systems, has many mechanisms for capturing and deploying a wider variety of engineering knowledge than simple process definitions. These hold the promise of significant improvements through reuse of prior designs, codification of practices in workflows, and placement of detailed how-tos at the point of application.

  12. Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Water Security

    NASA Technical Reports Server (NTRS)

    Hakimdavar, Raha; Wood, Danielle; Eylander, John; Peters-Lidard, Christa; Smith, Jane; Doorn, Brad; Green, David; Hummel, Corey; Moore, Thomas C.

    2018-01-01

    River basins for which transboundary coordination and governance is a factor are of concern to US national security, yet there is often a lack of sufficient data-driven information available at the needed time horizons to inform transboundary water decision-making for the intelligence, defense, and foreign policy communities. To address this need, a two-day workshop entitled Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Global Water Security was held in August 2017 in Maryland. The committee that organized and convened the workshop (the Organizing Committee) included representatives from the National Aeronautics and Space Administration (NASA), the US Army Corps of Engineers Engineer Research and Development Center (ERDC), and the US Air Force. The primary goal of the workshop was to advance knowledge on the current US Government and partners' technical information needs and gaps to support national security interests in relation to transboundary water. The workshop also aimed to identify avenues for greater communication and collaboration among the scientific, intelligence, defense, and foreign policy communities. The discussion around transboundary water was considered in the context of the greater global water challenges facing US national security.

  13. Design Patterns for Learning and Assessment: Facilitating the Introduction of a Complex Simulation-Based Learning Environment into a Community of Instructors

    NASA Astrophysics Data System (ADS)

    Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.

    2010-04-01

    Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.

  14. Linking engineering and medicine: fostering collaboration skills in interdisciplinary teams.

    PubMed

    Khoo, Michael C K

    2012-07-01

    Biomedical engineering embodies the spirit of combining disciplines. The engineer's pragmatic approach to--and appetite for--solving problems is matched by a bounty of technical challenges generated in medical domains. From nanoscale diagnostics to the redesign of systems of health-care delivery, engineers have been connecting advances in basic and applied science with applications that have helped to improve medical care and outcomes. Increasingly, however, integrating these areas of knowledge and application is less individualistic and more of a team sport. Success increasingly relies on a direct focus on practicing and developing collaboration skills in interdisciplinary teams. Such an approach does not fit easily into individual-focused, discipline-based programs. Biomedical engineering has done its fair share of silo busting, but new approaches are needed to inspire interdisciplinary teams to form around challenges in particular areas. Health care offers a wide variety of complex challenges across an array of delivery settings that can call for new interdisciplinary approaches. This was recognized by the deans of the University of Southern California's (USC's) Medical and Engineering Schools when they began the planning process, leading to the creation of the Health, Technology, and Engineering (HTE@USC or HTE for short) program. “Health care and technology are changing rapidly, and future physicians and engineers need intellectual tools to stay ahead of this change,” says Carmen A. Puliafito, dean of the Keck School of Medicine. His goal is to train national leaders in the quest for devices and processes to improve health care.

  15. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine, and can use that engine alongside a ray-tracing core (the SolTrace simulation engine) for more detailed simulations.

  16. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  17. Engineering: Defining and differentiating its unique culture

    NASA Astrophysics Data System (ADS)

    Pilotte, Mary K.

    The world of work for engineering professionals is changing. At a rapid pace, experienced engineers are exiting the workforce due to retirement of the Baby Boomer generation, while at the same time the problems facing engineers are increasingly complex and frequently global in nature. For firms to protect their knowledge assets, they must ensure that acquired understandings are shared among their engineering work groups. Engineering teaching and learning in the workplace (i.e., knowledge sharing), is a social activity that resides in a social context governed by the professional engineering culture. This quantitative study uses Hofstede's Organizational Cultural Values Model (Hofstede, Neuijen, Ohayv, & Sanders, 1990) to examine dimensions of engineering culture in the workplace, producing a central tendency profile of engineering's cultural practices. Further, it explores through hypotheses if demographic differentiators, including birth generation, gender, race, industry sector of employment, and engineering discipline, play roles in forming engineering cultural practices. Results both corroborate aspects of Hofstede's model and assert new understandings relative to factors influencing dimensions of engineering practice. Outcomes are discussed in terms of their potential impact on industrial knowledge sharing and formation of beneficial engineering cultures.

  18. Composing Data Parallel Code for a SPARQL Graph Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste

    Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility to perform in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces the memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
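
    The tool emits OpenMP-annotated C code; the underlying translation of a SPARQL basic graph pattern into graph-matching loops can be illustrated sequentially, in Python, as follows. The triple store and query are toy examples, not output of the actual tool.

```python
# A SPARQL basic graph pattern such as
#   SELECT ?person ?city WHERE { ?person <livesIn> ?city . ?city <locatedIn> <Italy> }
# amounts to matching a small graph pattern against RDF triples. A sequential
# nested-loop version of that matching (the tool parallelizes such loops):

triples = [
    ("alice", "livesIn", "rome"),
    ("bob", "livesIn", "paris"),
    ("rome", "locatedIn", "Italy"),
    ("paris", "locatedIn", "France"),
]

results = []
for s1, p1, o1 in triples:                 # ?person livesIn ?city
    if p1 != "livesIn":
        continue
    for s2, p2, o2 in triples:             # ?city locatedIn Italy
        if p2 == "locatedIn" and s2 == o1 and o2 == "Italy":
            results.append({"person": s1, "city": o1})

print(results)   # [{'person': 'alice', 'city': 'rome'}]
```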

  19. [Application of microelectronics CAD tools to synthetic biology].

    PubMed

    Madec, Morgan; Haiech, Jacques; Rosati, Élise; Rezgui, Abir; Gendrault, Yves; Lallement, Christophe

    2017-02-01

    Synthetic biology is an emerging science that aims to create new biological functions that do not exist in nature, based on the knowledge acquired in the life sciences over the last century. Since the beginning of this century, several projects in synthetic biology have emerged. The complexity of the artificial bio-functions developed so far is relatively low, so empirical design methods could be used for the design process. Nevertheless, with the increasing complexity of biological circuits, this is no longer the case, and a large number of computer-aided design tools have been developed in the past few years. These tools include languages for the behavioral description and mathematical modelling of biological systems, simulators at different levels of abstraction, libraries of biological devices, and circuit design automation algorithms. All of these tools already exist in other fields of engineering sciences, particularly in microelectronics. This is the approach that is put forward in this paper. © 2017 médecine/sciences – Inserm.

  20. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.

    PubMed

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-07-02

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of these data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and to generate a unified dataset with a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach which can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset and physical activity data collected using different sensors. To demonstrate the significance of the unified dataset, we adopted a well-known rough set theory based rule creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineer by 94.1% while creating unified datasets.
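
    The abstract does not spell out the priority rules. A minimal sketch of what resolving overlapping attributes across sources by user-assigned priority might look like is given below; the attribute names, values and priorities are invented.

```python
# Sketch: merge records about the same patient from several sources,
# resolving overlapping attributes by a user-assigned source priority.

sources = {
    "clinical_trial": {"priority": 1, "data": {"glucose": 7.8, "age": 54}},
    "sensor":         {"priority": 2, "data": {"glucose": 8.1, "steps": 4200}},
    "social_media":   {"priority": 3, "data": {"mood": "tired", "age": 55}},
}

def unify(sources):
    unified = {}
    # lower priority number wins when attributes overlap
    for name, src in sorted(sources.items(), key=lambda kv: kv[1]["priority"]):
        for attr, value in src["data"].items():
            unified.setdefault(attr, value)
    return unified

print(unify(sources))
# {'glucose': 7.8, 'age': 54, 'steps': 4200, 'mood': 'tired'}
```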

  1. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare

    PubMed Central

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-01-01

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of these data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and to generate a unified dataset with a “data modeler” tool. The proposed tool implements a user-centric, priority-based approach which can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset and physical activity data collected using different sensors. To demonstrate the significance of the unified dataset, we adopted a well-known rough set theory based rule creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineer by 94.1% while creating unified datasets. PMID:26147731

  2. What can management theories offer evidence-based practice? A comparative analysis of measurement tools for organisational context.

    PubMed

    French, Beverley; Thomas, Lois H; Baker, Paula; Burton, Christopher R; Pennington, Lindsay; Roddam, Hazel

    2009-05-19

    Given the current emphasis on networks as vehicles for innovation and change in health service delivery, the ability to conceptualize and measure organisational enablers for the social construction of knowledge merits attention. This study aimed to develop a composite tool to measure the organisational context for evidence-based practice (EBP) in healthcare. A structured search of the major healthcare and management databases for measurement tools from four domains: research utilisation (RU), research activity (RA), knowledge management (KM), and organisational learning (OL). Included studies were reports of the development or use of measurement tools that included organisational factors. Tools were appraised for face and content validity, plus development and testing methods. Measurement tool items were extracted, merged across the four domains, and categorised within a constructed framework describing the absorptive and receptive capacities of organisations. Thirty measurement tools were identified and appraised. Eighteen tools from the four domains were selected for item extraction and analysis. The constructed framework consists of seven categories relating to three core organisational attributes of vision, leadership, and a learning culture, and four stages of knowledge need, acquisition of new knowledge, knowledge sharing, and knowledge use. Measurement tools from RA or RU domains had more items relating to the categories of leadership, and acquisition of new knowledge; while tools from KM or learning organisation domains had more items relating to vision, learning culture, knowledge need, and knowledge sharing. There was equal emphasis on knowledge use in the different domains. If the translation of evidence into knowledge is viewed as socially mediated, tools to measure the organisational context of EBP in healthcare could be enhanced by consideration of related concepts from the organisational and management sciences. Comparison of measurement tools across domains suggests that there is scope within EBP for supplementing the current emphasis on human and technical resources to support information uptake and use by individuals. Consideration of measurement tools from the fields of KM and OL shows more content related to social mechanisms to facilitate knowledge recognition, translation, and transfer between individuals and groups.

  3. What can management theories offer evidence-based practice? A comparative analysis of measurement tools for organisational context

    PubMed Central

    French, Beverley; Thomas, Lois H; Baker, Paula; Burton, Christopher R; Pennington, Lindsay; Roddam, Hazel

    2009-01-01

    Background Given the current emphasis on networks as vehicles for innovation and change in health service delivery, the ability to conceptualise and measure organisational enablers for the social construction of knowledge merits attention. This study aimed to develop a composite tool to measure the organisational context for evidence-based practice (EBP) in healthcare. Methods A structured search of the major healthcare and management databases for measurement tools from four domains: research utilisation (RU), research activity (RA), knowledge management (KM), and organisational learning (OL). Included studies were reports of the development or use of measurement tools that included organisational factors. Tools were appraised for face and content validity, plus development and testing methods. Measurement tool items were extracted, merged across the four domains, and categorised within a constructed framework describing the absorptive and receptive capacities of organisations. Results Thirty measurement tools were identified and appraised. Eighteen tools from the four domains were selected for item extraction and analysis. The constructed framework consists of seven categories relating to three core organisational attributes of vision, leadership, and a learning culture, and four stages of knowledge need, acquisition of new knowledge, knowledge sharing, and knowledge use. Measurement tools from RA or RU domains had more items relating to the categories of leadership, and acquisition of new knowledge; while tools from KM or learning organisation domains had more items relating to vision, learning culture, knowledge need, and knowledge sharing. There was equal emphasis on knowledge use in the different domains. Conclusion If the translation of evidence into knowledge is viewed as socially mediated, tools to measure the organisational context of EBP in healthcare could be enhanced by consideration of related concepts from the organisational and management sciences. Comparison of measurement tools across domains suggests that there is scope within EBP for supplementing the current emphasis on human and technical resources to support information uptake and use by individuals. Consideration of measurement tools from the fields of KM and OL shows more content related to social mechanisms to facilitate knowledge recognition, translation, and transfer between individuals and groups. PMID:19454008

  4. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  5. A kinetic modeling of chondrocyte culture for manufacture of tissue-engineered cartilage.

    PubMed

    Kino-Oka, Masahiro; Maeda, Yoshikatsu; Yamamoto, Takeyuki; Sugawara, Katsura; Taya, Masahito

    2005-03-01

    For repairing articular cartilage defects, innovative techniques based on tissue engineering have been developed and are now entering the practical stage of clinical application through the grafting of in vitro cultured products. A variety of natural and artificial scaffold materials, which permit chondrocyte cells to aggregate, have been designed for their ability to promote cell growth and differentiation. From the viewpoint of the manufacturing process for tissue-engineered cartilage, the diverse nature of the raw materials (seeding cells) and end products (cultured cartilage) obliges us to design tailor-made processes with limited reproducibility, which is an obstacle to establishing a production doctrine based on bioengineering knowledge of growth kinetics and modeling, as well as designs of bioreactors and culture operations, for certification of high product quality. In this article, we review recent advances in the manufacturing of tissue-engineered cartilage. After outlining the manufacturing processes for tissue-engineered cartilage in the first section, the second and third sections describe, respectively, the three-dimensional culture of chondrocytes with Atelocollagen gel and a kinetic model as a tool for evaluating this culture process. In the final section, culture strategy is discussed in terms of the combined processes of monolayer growth (ex vivo chondrocyte cell expansion) and three-dimensional growth (construction of cultured cartilage in the gel).
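
    The kinetic model itself is not reproduced in the abstract. As an illustration of the kind of growth kinetics such an evaluation relies on, the sketch below integrates a simple logistic model with explicit Euler steps; the equation choice and parameter values are assumptions, not the authors' model.

```python
# Logistic growth as a stand-in for a cell-culture kinetic model:
# dX/dt = mu * X * (1 - X / Xmax), integrated with explicit Euler steps.

mu, xmax = 0.03, 1.0e6        # 1/h specific growth rate, cells/cm^2 capacity
x, dt = 1.0e4, 1.0            # initial density and time step (h)

trajectory = []
for hour in range(0, 240):
    trajectory.append((hour, x))
    x += dt * mu * x * (1 - x / xmax)

print(trajectory[-1])         # cell density near the end of a ten-day culture
```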

  6. Engineering C4 photosynthesis into C3 chassis in the synthetic biology age.

    PubMed

    Schuler, Mara L; Mantegazza, Otho; Weber, Andreas P M

    2016-07-01

    C4 photosynthetic plants outperform C3 plants in hot and arid climates. By concentrating carbon dioxide around Rubisco C4 plants drastically reduce photorespiration. The frequency with which plants evolved C4 photosynthesis independently challenges researchers to unravel the genetic mechanisms underlying this convergent evolutionary switch. The conversion of C3 crops, such as rice, towards C4 photosynthesis is a long-standing goal. Nevertheless, at the present time, in the age of synthetic biology, this still remains a monumental task, partially because the C4 carbon-concentrating biochemical cycle spans two cell types and thus requires specialized anatomy. Here we review the advances in understanding the molecular basis and the evolution of the C4 trait, advances in the last decades that were driven by systems biology methods. In this review we emphasise essential genetic engineering tools needed to translate our theoretical knowledge into engineering approaches. With our current molecular understanding of the biochemical C4 pathway, we propose a simplified rational engineering model exclusively built with known C4 metabolic components. Moreover, we discuss an alternative approach to the progressing international engineering attempts that would combine targeted mutagenesis and directed evolution. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.

  7. The structural approach to shared knowledge: an application to engineering design teams.

    PubMed

    Avnet, Mark S; Weigel, Annalisa L

    2013-06-01

    We propose a methodology for analyzing shared knowledge in engineering design teams. Whereas prior work has focused on shared knowledge in small teams at a specific point in time, the model presented here is both scalable and dynamic. By quantifying team members' common views of design drivers, we build a network of shared mental models to reveal the structure of shared knowledge at a snapshot in time. Based on a structural comparison of networks at different points in time, a metric of change in shared knowledge is computed. Analysis of survey data from 12 conceptual space mission design sessions reveals a correlation between change in shared knowledge and each of several system attributes, including system development time, system mass, and technological maturity. From these results, we conclude that an early period of learning and consensus building could be beneficial to the design of engineered systems. Although we do not examine team performance directly, we demonstrate that shared knowledge is related to the technical design and thus provide a foundation for improving design products by incorporating the knowledge and thoughts of the engineering design team into the process.
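
    The exact network construction and change metric are not given in the abstract. One plausible reading, linking team members whose ratings of design drivers agree and measuring change as the fraction of member pairs whose link status flips between two snapshots, is sketched below; the ratings and the agreement threshold are invented.

```python
# Sketch: build a shared-knowledge network by linking team members whose
# ratings of design drivers agree, then measure change between two snapshots
# as the fraction of member pairs whose link status flipped.

from itertools import combinations

def shared_network(ratings, threshold=1):
    edges = set()
    for a, b in combinations(ratings, 2):
        # members share a view if their ratings differ by at most `threshold`
        # on every design driver
        if all(abs(ratings[a][d] - ratings[b][d]) <= threshold for d in ratings[a]):
            edges.add(frozenset((a, b)))
    return edges

def change(ratings_t1, ratings_t2):
    e1, e2 = shared_network(ratings_t1), shared_network(ratings_t2)
    pairs = {frozenset(p) for p in combinations(ratings_t1, 2)}
    return len(e1 ^ e2) / len(pairs)      # fraction of pairs that changed status

t1 = {"ana": {"mass": 5, "cost": 2}, "ben": {"mass": 3, "cost": 2}, "eva": {"mass": 5, "cost": 5}}
t2 = {"ana": {"mass": 4, "cost": 3}, "ben": {"mass": 4, "cost": 2}, "eva": {"mass": 4, "cost": 4}}
print(change(t1, t2))   # 0.67: two of three pairs changed status
```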

  8. Enhancing Knowledge Sharing Management Using BIM Technology in Construction

    PubMed Central

    Ho, Shih-Ping; Tserng, Hui-Ping

    2013-01-01

    Construction knowledge can be communicated and reused among project managers and jobsite engineers to alleviate problems on a construction jobsite and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for the sharing of construction knowledge by using Building Information Modeling (BIM) technology. The main characteristics of BIM include 3D CAD-based presentation, storage of information in a digital format, and easy updating and transfer of information in the BIM environment. Using BIM technology, project managers and engineers can gain knowledge related to BIM and obtain feedback provided by jobsite engineers for future reference. This study addresses the application of knowledge sharing management using BIM technology and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to demonstrate the effectiveness of sharing knowledge in the BIM environment. The results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing BIM technology. PMID:24723790

  9. Enhancing knowledge sharing management using BIM technology in construction.

    PubMed

    Ho, Shih-Ping; Tserng, Hui-Ping; Jan, Shu-Hui

    2013-01-01

    Construction knowledge can be communicated and reused among project managers and jobsite engineers to alleviate problems on a construction jobsite and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for the sharing of construction knowledge by using Building Information Modeling (BIM) technology. The main characteristics of BIM include 3D CAD-based presentation, storage of information in a digital format, and easy updating and transfer of information in the BIM environment. Using BIM technology, project managers and engineers can gain knowledge related to BIM and obtain feedback provided by jobsite engineers for future reference. This study addresses the application of knowledge sharing management using BIM technology and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to demonstrate the effectiveness of sharing knowledge in the BIM environment. The results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing BIM technology.

  10. [Penile injury caused by a Moulinette. Result of autoerotic self-mutilation].

    PubMed

    Lehsnau, M

    2007-07-01

    Autoerotic manipulations of external male genitals resulting in mutilation with different degrees of severity are rare. We report the clinical case of a 12-year-old boy who injured his glans, left corpus cavernosum and corpus spongiosum with opened urethra as a consequence of autoerotic genital self-mutilation. According to our knowledge of the current literature this is the first description of autoerotic genital self-mutilation with a Moulinette. A Moulinette is a kitchen tool with an electric engine and an extremely fast rotary double knife, which is used to reduce food into small pieces, especially vegetables and fruits.

  11. LeaRN: A Collaborative Learning-Research Network for a WLCG Tier-3 Centre

    NASA Astrophysics Data System (ADS)

    Pérez Calle, Elio

    2011-12-01

    The Department of Modern Physics of the University of Science and Technology of China is hosting a Tier-3 centre for the ATLAS experiment. An interdisciplinary team of researchers, engineers and students is devoted to the task of receiving, storing and analysing the scientific data produced by the LHC. In order to achieve the highest performance and to develop a knowledge base shared by all members of the team, the research activities and their coordination are supported by an array of computing systems. These systems have been designed to foster communication, collaboration and coordination among the members of the team, both face-to-face and remotely, and both in synchronous and asynchronous ways. The result is a collaborative learning-research network whose main objectives are awareness (to gain shared knowledge about others' activities and therefore obtain synergies), articulation (to allow a project to be divided, work units to be assigned and then reintegrated) and adaptation (to adapt information technologies to the needs of the group). The main technologies involved are Communication Tools such as web publishing, revision control and wikis, Conferencing Tools such as forums, instant messaging and video conferencing, and Coordination Tools such as time management, project management and social networks. The software toolkit has been deployed by the members of the team and is based on free and open source software.

  12. Tool vibration detection with eddy current sensors in machining process and computation of stability lobes using fuzzy classifiers

    NASA Astrophysics Data System (ADS)

    Devillez, Arnaud; Dudzinski, Daniel

    2007-01-01

    Today, knowledge of a process is very important for engineers to find the optimal combination of control parameters warranting productivity, quality and functioning without defects and failures. In our laboratory, we carry out research in the field of high speed machining with modelling, simulation and experimental approaches. The aim of our investigation is to develop software allowing the optimisation of cutting conditions, to limit the number of predictive tests, and process monitoring, to prevent any trouble during machining operations. This software is based on models and experimental data sets which constitute the knowledge of the process. In this paper, we deal with the problem of vibrations occurring during a machining operation. These vibrations may cause failures and defects in the process, such as workpiece surface alteration and rapid tool wear. To measure the tool micro-movements on line, we equipped a lathe with specific instrumentation using eddy current sensors. The obtained signals were correlated with surface finish, and a signal processing algorithm was used to determine whether a test was stable or unstable. Then, a fuzzy classification method was proposed to classify the tests in a space defined by the width of cut and the cutting speed. Finally, it was shown that the fuzzy classification takes the measurement uncertainty into account to compute the stability limit, or stability lobes, of the process.
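
    The paper's classifier and signal features are not detailed in the abstract. A minimal sketch of grading cutting tests with a fuzzy stability membership over a vibration amplitude feature is given below; the thresholds and test values are invented, whereas the original work derives them from eddy current displacement signals.

```python
# Sketch: grade a cutting test as stable/unstable with a simple fuzzy
# membership over the vibration RMS amplitude (thresholds are invented).

def membership_stable(rms_um):
    # fully stable below 2 um, fully unstable above 8 um, linear in between
    if rms_um <= 2.0:
        return 1.0
    if rms_um >= 8.0:
        return 0.0
    return (8.0 - rms_um) / 6.0

tests = [
    # (width of cut mm, cutting speed m/min, vibration RMS um)
    (1.0, 300, 1.5),
    (2.5, 300, 5.0),
    (4.0, 300, 9.2),
]

for width, speed, rms in tests:
    mu_s = membership_stable(rms)
    label = "stable" if mu_s >= 0.5 else "unstable"
    print(f"ap={width} mm, vc={speed} m/min -> mu_stable={mu_s:.2f} ({label})")
```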

  13. From Knowing-About to Knowing-To: Development of Engineering Pedagogical Content Knowledge by Elementary Teachers through Perceived Learning and Implementing Difficulties

    ERIC Educational Resources Information Center

    Sun, Yan; Strobel, Johannes

    2014-01-01

    The present study sought to reveal how elementary teachers develop their engineering pedagogical content knowledge (PCK) after leaving professional development programs to practice engineering teaching in real classroom settings. Participants of this study were the elementary teachers who received one-week training of engineering education…

  14. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive and markets are still being designed to leverage their full potential and mitigate their limitation (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output) thereby providing the grid advance warning to schedule ancillary generation more accurately, or curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by the grid operators we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue required atmospheric science and engineering research, while the second required detailed knowledge of energy markets, and power engineering. Motivated by this background we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools that have seen extensive use in the atmospheric sciences supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.

  15. Engine Data Interpretation System (EDIS), phase 2

    NASA Technical Reports Server (NTRS)

    Cost, Thomas L.; Hofmann, Martin O.

    1991-01-01

    A prototype of an expert system was developed which applies qualitative constraint-based reasoning to the task of post-test analysis of data resulting from a rocket engine firing. Data anomalies are detected and corresponding faults are diagnosed. Engine behavior is reconstructed using measured data and knowledge about engine behavior. Knowledge about common faults guides but does not restrict the search for the best explanation in terms of hypothesized faults. The system contains domain knowledge about the behavior of common rocket engine components and was configured for use with the Space Shuttle Main Engine (SSME). A graphical user interface allows an expert user to intimately interact with the system during diagnosis. The system was applied to data taken during actual SSME tests where data anomalies were observed.
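
    The abstract describes qualitative constraint-based reasoning over measured engine data without giving the knowledge base itself. The flavour of such a check can be sketched as follows; the sensor names, values and constraints are all invented for illustration and are not SSME diagnostic knowledge.

```python
# Sketch of a qualitative constraint check on post-test engine data:
# each constraint encodes expected qualitative behaviour; violations become
# candidate anomalies to be explained by hypothesized faults.

data = {"fuel_flow": 67.0, "chamber_pressure": 180.0, "turbine_temp": 1450.0}

constraints = [
    # (description, predicate over the measured data)
    ("chamber pressure rises with fuel flow",
     lambda d: d["chamber_pressure"] >= 2.5 * d["fuel_flow"]),
    ("turbine temperature within redline",
     lambda d: d["turbine_temp"] <= 1400.0),
]

anomalies = [desc for desc, ok in constraints if not ok(data)]
print("anomalies:", anomalies)
# -> ['turbine temperature within redline'] given the invented values
```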

  16. A Development Architecture for Serious Games Using BCI (Brain Computer Interface) Sensors

    PubMed Central

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-01-01

    Games that use brainwaves via brain–computer interface (BCI) devices to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories. PMID:23202227

  17. A development architecture for serious games using BCI (brain computer interface) sensors.

    PubMed

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-11-12

    Games that use brainwaves via brain-computer interface (BCI) devices to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories.

  18. Promoting Students' Problem Solving Skills and Knowledge of STEM Concepts in a Data-Rich Learning Environment: Using Online Data as a Tool for Teaching about Renewable Energy Technologies

    NASA Astrophysics Data System (ADS)

    Thurmond, Brandi

    This study sought to compare a data-rich learning (DRL) environment that utilized online data as a tool for teaching about renewable energy technologies (RET) to a lecture-based learning environment to determine the impact of the learning environment on students' knowledge of Science, Technology, Engineering, and Math (STEM) concepts related to renewable energy technologies and students' problem solving skills. Two purposefully selected Advanced Placement (AP) Environmental Science teachers were included in the study. Each teacher taught one class about RET in a lecture-based environment (control) and another class in a DRL environment (treatment), for a total of four classes of students (n=128). This study utilized a quasi-experimental, pretest/posttest, control-group design. The initial hypothesis that the treatment group would have a significant gain in knowledge of STEM concepts related to RET and be better able to solve problems when compared to the control group was not supported by the data. Although students in the DRL environment had a significant gain in knowledge after instruction, posttest score comparisons of the control and treatment groups revealed no significant differences between the groups. Further, no significant differences were noted in students' problem solving abilities as measured by scores on a problem-based activity and self-reported abilities on a reflective questionnaire. This suggests that the DRL environment is at least as effective as the lecture-based learning environment in teaching AP Environmental Science students about RET and fostering the development of problem solving skills. As this was a small scale study, further research is needed to provide information about effectiveness of DRL environments in promoting students' knowledge of STEM concepts and problem-solving skills.

  19. Knitting for heart valve tissue engineering

    PubMed Central

    Ayad, Nadia; Wojciechowska, Dorota; Zielińska, Dorota; Struszczyk, Marcin H.; Latif, Najma; Yacoub, Magdi

    Knitting is a versatile technology which offers a large portfolio of products and solutions of interest in heart valve (HV) tissue engineering (TE). One of the main advantages of knitting is its ability to construct complex shapes and structures by precisely assembling the yarns in the desired position. With this in mind, knitting could be employed to construct a HV scaffold that closely resembles the authentic valve. This has the potential to reproduce the anisotropic structure that is characteristic of the heart valve with the yarns, in particular the 3-layered architecture of the leaflets. These yarns can provide oriented growth of cells lengthwise and consequently enable the deposition of extracellular matrix (ECM) proteins in an oriented manner. This technique, therefore, has a potential to provide a functional knitted scaffold, but to achieve that textile engineers need to gain a basic understanding of structural and mechanical aspects of the heart valve and in addition, tissue engineers must acquire the knowledge of tools and capacities that are essential in knitting technology. The aim of this review is to provide a platform to consolidate these two fields as well as to enable an efficient communication and cooperation among these two research areas. PMID:29043276

  20. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to the developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks, pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in the future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The Virtual product creation environment will significantly enhance the productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  1. Interpreting geologic maps for engineering purposes: Hollidaysburg quadrangle, Pennsylvania

    USGS Publications Warehouse

    ,

    1953-01-01

    This set of maps has been prepared to show the kinds of information, useful to engineers, that can be derived from ordinary geologic maps. A few additional bits of information, drawn from other sources, are mentioned below. Some of the uses of such maps are well known; they are indispensable tools in the modern search for oil or ore deposits; they are the first essential step in unraveling the story of the earth we live on. Less well known, perhaps, is the fact that topographic and geologic maps contain many of the basic data needed for planning any engineering construction job, big or little. Any structure built by man must fit into the topographic and geologic environment shown on such maps. Moreover, most if not all construction jobs must be based on knowledge of the soils and waters, which also are intimately related to this same environment. The topographic map shows the shape of the land: the hills and valleys, the streams and swamps, and the man-made features such as roads, railroads, and towns. The geologic map shows the kinds and shapes of the rock bodies that form the land surface and that lie beneath it. These are the facts around which the engineer must build.

  2. A sociocultural analysis of Latino high school students' funds of knowledge and implications for culturally responsive engineering education

    NASA Astrophysics Data System (ADS)

    Mejia, Joel Alejandro

    Previous studies have suggested that, when funds of knowledge are incorporated into science and mathematics curricula, students are more engaged and often develop richer understandings of scientific concepts. While there has been a growing body of research addressing how teachers may integrate students' linguistic, social, and cultural practices with science and mathematics instruction, very little research has been conducted on how the same can be accomplished with Latino and Latina students in engineering. The purpose of this study was to address this gap in the literature by investigating how fourteen Latino and Latina high school adolescents used their funds of knowledge to address engineering design challenges. This project was intended to enhance the educational experience of underrepresented minorities whose social and cultural practices have been traditionally undervalued in schools. This ethnographic study investigated the funds of knowledge of fourteen Latino and Latina high school adolescents and how they used these funds of knowledge in engineering design. Participant observation, bi-monthly group discussion, retrospective and concurrent protocols, and monthly one-on-one interviews were conducted during the study. A constant comparative analysis suggested that Latino and Latina adolescents, although profoundly underrepresented in engineering, bring a wealth of knowledge and experiences that are relevant to engineering design thinking and practice.

  3. Metabolic engineering with systems biology tools to optimize production of prokaryotic secondary metabolites.

    PubMed

    Kim, Hyun Uk; Charusanti, Pep; Lee, Sang Yup; Weber, Tilmann

    2016-08-27

    Covering: 2012 to 2016. Metabolic engineering using systems biology tools is increasingly applied to overproduce secondary metabolites for their potential industrial production. In this Highlight, recent relevant metabolic engineering studies are analyzed with emphasis on host selection and engineering approaches for the optimal production of various prokaryotic secondary metabolites: native versus heterologous hosts (e.g., Escherichia coli) and rational versus random approaches. This comparative analysis is followed by discussions on systems biology tools deployed in optimizing the production of secondary metabolites. The potential contributions of additional systems biology tools are also discussed in the context of current challenges encountered during optimization of secondary metabolite production.

  4. Software engineering and Ada (Trademark) training: An implementation model for NASA

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  5. Improving Antibody-Based Cancer Therapeutics Through Glycan Engineering.

    PubMed

    Yu, Xiaojie; Marshall, Michael J E; Cragg, Mark S; Crispin, Max

    2017-06-01

    Antibody-based therapeutics has emerged as a major tool in cancer treatment. Guided by the superb specificity of the antibody variable domain, it allows the precise targeting of tumour markers. Recently, eliciting cellular effector functions, mediated by the Fc domain, has gained traction as a means by which to generate more potent antibody therapeutics. Extensive mutagenesis studies of the Fc protein backbone have enabled the generation of Fc variants that more optimally engage the Fcγ receptors known to mediate cellular effector functions such as antibody-dependent cellular cytotoxicity (ADCC) and cellular phagocytosis. In addition to the protein backbone, the homodimeric Fc domain contains two opposing N-linked glycans, which represent a further point of potential immunomodulation, independent of the Fc protein backbone. For example, a lack of the core fucose usually attached to the IgG Fc glycan leads to enhanced ADCC activity, whereas a high level of terminal sialylation is associated with reduced inflammation. Significant growth in knowledge of Fc glycosylation over the last decade, combined with advancement in genetic engineering, has empowered glyco-engineering to fine-tune antibody therapeutics. This has culminated in the approval of two glyco-engineered antibodies for cancer therapy: the anti-CCR4 mogamulizumab approved in 2012 and the anti-CD20 obinutuzumab in 2013. We discuss here the technological platforms for antibody glyco-engineering and review the current clinical landscape of glyco-engineered antibodies.

  6. Action and semantic tool knowledge - Effective connectivity in the underlying neural networks.

    PubMed

    Kleineberg, Nina N; Dovern, Anna; Binder, Ellen; Grefkes, Christian; Eickhoff, Simon B; Fink, Gereon R; Weiss, Peter H

    2018-04-26

    Evidence from neuropsychological and imaging studies indicates that action and semantic knowledge about tools draw upon distinct neural substrates, but little is known about the underlying interregional effective connectivity. With fMRI and dynamic causal modeling (DCM), we investigated effective connectivity in the left hemisphere (LH) while subjects performed (i) a function knowledge task and (ii) a value knowledge task, both addressing semantic tool knowledge, and (iii) a manipulation (action) knowledge task. Overall, the results indicate crosstalk between action nodes and semantic nodes. Interestingly, effective connectivity was weakened between semantic nodes and action nodes during the manipulation task. Furthermore, pronounced modulations of effective connectivity within the fronto-parietal action system of the LH (comprising lateral occipito-temporal cortex, intraparietal sulcus, supramarginal gyrus, inferior frontal gyrus) were observed in a bidirectional manner during the processing of action knowledge. In contrast, the function and value knowledge tasks resulted in a significant strengthening of the effective connectivity between visual cortex and fusiform gyrus. Importantly, this modulation was present in both semantic tasks, indicating that processing different aspects of semantic knowledge about tools evokes similar effective connectivity patterns. The data revealed that interregional effective connectivity during the processing of tool knowledge occurred in a bidirectional manner, with a weakening of connectivity between areas engaged in action and semantic knowledge about tools during the processing of action knowledge. Moreover, different semantic tool knowledge tasks elicited similar effective connectivity patterns. © 2018 Wiley Periodicals, Inc.

  7. A Software Engineering Environment for the Navy.

    DTIC Science & Technology

    1982-03-31

    [Fragmentary OCR of the report's front matter; recoverable headings include "Part II: Description of a Software Engineering Environment", "Data Base", "Flow of Management: Activity to Methodology to Tool", "Pipelining for Activity-Specific Tools", and "Methodologies and Tools: Correctness Analysis".]

  8. Institutional Memory Preservation at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Coffey, J.; Moreman, Douglas; Dyer, J.; Hemminger, J. A.

    1999-01-01

    In this era of downsizing and deficit reduction, the preservation of institutional memory is a widespread concern for U.S. companies and governmental agencies. The National Aeronautics and Space Administration faces the pending retirement of many of the agency's long-term, senior engineers. NASA has a marvelous long-term history of success, but the agency faces a recurring problem caused by the loss of these engineers' unique knowledge and perspectives on NASA's role in aeronautics and space exploration. The current work describes a knowledge elicitation effort aimed at demonstrating the feasibility of preserving the more personal, heuristic knowledge accumulated over the years by NASA engineers, as contrasted with the "textbook" knowledge of launch vehicles. Work on this project was performed at NASA Glenn Research Center and elsewhere, and focused on launch vehicle systems integration. The initial effort was directed toward an historic view of the Centaur upper stage, which is powered by two RL-10 engines. Various experts were consulted, employing a variety of knowledge elicitation techniques, regarding the Centaur and RL-10. Their knowledge is represented in searchable Web-based multimedia presentations. This paper discusses the various approaches to knowledge elicitation and knowledge representation employed, and assesses successes and challenges in trying to perform large-scale knowledge preservation of institutional memory. It is anticipated that strategies for knowledge elicitation and representation that have been developed in this grant will be utilized to elicit knowledge in a variety of domains, including the complex heuristics that underlie use of simulation software packages such as that being explored in the Expert System Architecture for Rocket Engine Numerical Simulators.

  9. A New Wnt1-CRE TomatoRosa Embryonic Stem Cell Line: A Tool for Studying Neural Crest Cell Integration Capacity.

    PubMed

    Acuna-Mendoza, Soledad; Martin, Sabrina; Kuchler-Bopp, Sabine; Ribes, Sandy; Thalgott, Jérémy; Chaussain, Catherine; Creuzet, Sophie; Lesot, Hervé; Lebrin, Franck; Poliard, Anne

    2017-12-01

    Neural crest (NC) cells are a migratory, multipotent population giving rise to numerous lineages in the embryo. Their plasticity renders attractive their use in tissue engineering-based therapies, but further knowledge on their in vivo behavior is required before clinical transfer may be envisioned. We here describe the isolation and characterization of a new mouse embryonic stem (ES) line derived from Wnt1-CRE-R26 Rosa TomatoTdv blastocyst and show that it displays the characteristics of typical ES cells. Further, these cells can be efficiently directed toward an NC stem cell-like phenotype as attested by concomitant expression of NC marker genes and Tomato fluorescence. As native NC progenitors, they are capable of differentiating toward typical derivative phenotypes and interacting with embryonic tissues to participate in the formation of neo-structures. Their specific fluorescence allows purification and tracking in vivo. This cellular tool should facilitate a better understanding of the mechanisms driving NC fate specification and help identify the key interactions developed within a tissue after in vivo implantation. Altogether, this novel model may provide important knowledge to optimize NC stem cell graft conditions, which are required for efficient tissue repair.

  10. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
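    The pipeline sketched in this abstract (rule-based classification of specification items into generic components, followed by a nonlinear sizing function) can be illustrated with a short sketch. The rules, component weights, and power-law sizing function below are invented for illustration and are not the values used by the NASA/COSMIC-based system.

```python
# Illustrative sketch of a rule-based sizing pipeline (assumed rules and weights,
# not the actual NASA/COSMIC system). Specification items are mapped to generic
# components, which feed a nonlinear sizing function.

# Hypothetical mapping rules: specification category -> (component, count weight)
CLASSIFICATION_RULES = {
    "data_entry_screen": ("user_interface", 1.0),
    "report": ("output_formatting", 0.8),
    "sensor_interface": ("device_io", 1.5),
    "database_table": ("data_management", 1.2),
}

def classify(spec_items):
    """Convert functional specification items into generic component counts."""
    components = {}
    for item in spec_items:
        component, weight = CLASSIFICATION_RULES[item]
        components[component] = components.get(component, 0.0) + weight
    return components

def nonlinear_size(components, base_loc=120.0, exponent=1.15):
    """Hypothetical power-law sizing function: lines of code grow faster than
    linearly with total component 'mass' to reflect integration effort."""
    total_mass = sum(components.values())
    return base_loc * total_mass ** exponent

specs = ["data_entry_screen", "report", "report", "database_table", "sensor_interface"]
estimate = nonlinear_size(classify(specs))
print(f"Estimated size: {estimate:.0f} lines of source code")
```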

  11. Using hybrid expert system approaches for engineering applications

    NASA Technical Reports Server (NTRS)

    Allen, R. H.; Boarnet, M. G.; Culbert, C. J.; Savely, R. T.

    1987-01-01

    In this paper, the use of hybrid expert system shells and hybrid (i.e., algorithmic and heuristic) approaches for solving engineering problems is reported. Aspects of various engineering problem domains are reviewed for a number of examples with specific applications made to recently developed prototype expert systems. Based on this prototyping experience, critical evaluations of and comparisons between commercially available tools, and some research tools, in the United States and Australia, and their underlying problem-solving paradigms are made. Characteristics of the implementation tool and the engineering domain are compared and practical software engineering issues are discussed with respect to hybrid tools and approaches. Finally, guidelines are offered with the hope that expert system development will be less time consuming, more effective, and more cost-effective than it has been in the past.

  12. Decision Matrices: Tools to Enhance Middle School Engineering Instruction

    ERIC Educational Resources Information Center

    Gonczi, Amanda L.; Bergman, Brenda G.; Huntoon, Jackie; Allen, Robin; McIntyre, Barb; Turner, Sheri; Davis, Jen; Handler, Rob

    2017-01-01

    Decision matrices are valuable engineering tools. They allow engineers to objectively examine solution options. Decision matrices can be incorporated in K-12 classrooms to support authentic engineering instruction. In this article we provide examples of how decision matrices have been incorporated into 6th and 7th grade classrooms as part of an…
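    The arithmetic behind a decision matrix is simply a weighted score of each candidate solution against the chosen criteria. A minimal sketch, with made-up criteria, weights, and scores:

```python
# Minimal weighted decision matrix: score each design option against weighted
# criteria and rank the options. Criteria, weights, and scores are illustrative.
criteria_weights = {"cost": 0.4, "durability": 0.35, "ease_of_build": 0.25}

# Scores on a 1-5 scale for each candidate design.
options = {
    "cardboard_bridge": {"cost": 5, "durability": 2, "ease_of_build": 5},
    "balsa_truss":      {"cost": 3, "durability": 4, "ease_of_build": 3},
    "pvc_frame":        {"cost": 2, "durability": 5, "ease_of_build": 2},
}

def weighted_score(scores):
    """Sum of (criterion weight * option score) over all criteria."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranked = sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.2f}")
```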

  13. Integrated Tools for Future Distributed Engine Control Technologies

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  14. Engineering Problem-Solving Knowledge: The Impact of Context

    ERIC Educational Resources Information Center

    Wolff, Karin

    2017-01-01

    Employer complaints of engineering graduates' inability to "apply knowledge" suggest a need to interrogate the complex theory-practice relationship in twenty-first-century real-world contexts. Focussing specifically on the application of mathematics, physics and logic-based disciplinary knowledge, the research examines engineering…

  15. Design Knowledge Management System (DKMS) Beta Test Report

    DTIC Science & Technology

    1992-11-01

    design process. These problems, which include knowledge representation, constraint propagation, model design, and information integration, are...effective delivery of life-cycle engineering knowledge assistance and information to the design/engineering activities. It does not matter whether these...platform. 4. Reuse - existing data, information, and knowledge can be reused. 5. Remote Execution -- automatically handles remote execution without

  16. Middle-School Teachers' Understanding and Teaching of the Engineering Design Process: A Look at Subject Matter and Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Hynes, Morgan M.

    2012-01-01

    This paper reports on research investigating six middle school teachers without engineering degrees as they taught an engineering unit on the engineering design process. Videotaped classroom sessions and teacher interviews were analyzed to understand the subject matter and pedagogical content knowledge the teachers used and developed as they…

  17. Genomics Approaches For Improving Salinity Stress Tolerance in Crop Plants.

    PubMed

    Nongpiur, Ramsong Chantre; Singla-Pareek, Sneh Lata; Pareek, Ashwani

    2016-08-01

    Salinity is one of the major factors which reduces crop production worldwide. Plant responses to salinity are highly complex and involve a plethora of genes. Due to its multigenicity, it has been difficult to attain a complete understanding of how plants respond to salinity. Genomics has progressed tremendously over the past decade and has played a crucial role towards providing necessary knowledge for crop improvement. Through genomics, we have been able to identify and characterize the genes involved in salinity stress response, map out signaling pathways and ultimately utilize this information for improving the salinity tolerance of existing crops. The use of new tools, such as gene pyramiding, in genetic engineering and marker assisted breeding has tremendously enhanced our ability to generate stress tolerant crops. Genome editing technologies such as Zinc finger nucleases, TALENs and CRISPR/Cas9 also provide newer and faster avenues for plant biologists to generate precisely engineered crops.

  18. The Leadership Lab for Women: Advancing and Retaining Women in STEM through Professional Development.

    PubMed

    Van Oosten, Ellen B; Buse, Kathleen; Bilimoria, Diana

    2017-01-01

    Innovative professional development approaches are needed to address the ongoing lack of women leaders in science, technology, engineering, and math (STEM) careers. Developed from the research on women who persist in engineering and computing professions and essential elements of women's leadership development, the Leadership Lab for Women in STEM Program was launched in 2014. The Leadership Lab was created as a research-based leadership development program, offering 360-degree feedback, coaching, and practical strategies aimed at increasing the advancement and retention of women in the STEM professions. The goal is to provide women with knowledge, tools and a supportive learning environment to help them navigate, achieve, flourish, and catalyze organizational change in male-dominated and technology-driven organizations. This article describes the importance of creating unique development experiences for women in STEM fields, the genesis of the Leadership Lab, the design and content of the program, and the outcomes for the participants.

  19. The utilisation of engineered invert traps in the management of near bed solids in sewer networks.

    PubMed

    Ashley, R M; Tait, S J; Stovin, V R; Burrows, R; Framer, A; Buxton, A P; Blackwood, D J; Saul, A J; Blanksby, J R

    2003-01-01

    Large existing sewers are considerable assets which wastewater utilities will require to operate for the foreseeable future to maintain health and the quality of life in cities. Despite their existence for more than a century, there is surprisingly little guidance available to manage these systems to minimise problems associated with in-sewer solids. A joint study has been undertaken in the UK to refine and utilise new knowledge gained from field data, laboratory results and Computational Fluid Dynamics (CFD) simulations to devise cost-beneficial engineering tools for the application of small invert traps to localise the deposition of sediments in sewers at accessible points for collection. New guidance has been produced for trap siting and this has been linked to a risk-cost-effectiveness assessment procedure to enable system operators to approach in-sewer sediment management pro-actively rather than reactively, as currently happens.

  20. CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.F. Beesley

    The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.

  1. 3D printing from diagnostic images: a radiologist's primer with an emphasis on musculoskeletal imaging-putting the 3D printing of pathology into the hands of every physician.

    PubMed

    Friedman, Tamir; Michalski, Mark; Goodman, T Rob; Brown, J Elliott

    2016-03-01

    Three-dimensional (3D) printing has recently erupted into the medical arena due to decreased costs and increased availability of printers and software tools. Due to lack of detailed information in the medical literature on the methods for 3D printing, we have reviewed the medical and engineering literature on the various methods for 3D printing and compiled them into a practical "how to" format, thereby enabling the novice to start 3D printing with very limited funds. We describe (1) background knowledge, (2) imaging parameters, (3) software, (4) hardware, (5) post-processing, and (6) financial aspects required to cost-effectively reproduce a patient's disease ex vivo so that the patient, engineer and surgeon may hold the anatomy and associated pathology in their hands.

  2. [Distance learning using internet in the field of bioengineering].

    PubMed

    Ciobanu, O

    2003-01-01

    The Leonardo da Vinci training programme supports innovative transnational initiatives for promoting the knowledge, aptitudes and skills necessary for successful integration into working life. Biomedical engineering is an emerging interdisciplinary field that contributes to understanding, defining and solving problems in biomedical technology within industrial and health service contexts. The paper presents a Leonardo da Vinci pilot project called Web-based learning and training in the field of biomedical and design engineering (WEBD), which started in 2001. The WEBD project proposes to use advanced learning technologies to provide education via the World Wide Web, using interactive 3D graphics and virtual reality tools. The WEBD distance training permits users to experience and interact with a life-like model or environment, in safety and at convenient times, while providing a degree of control over the simulation that is usually not possible in the real-life situation.

  3. Advanced engineering environment collaboration project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamph, Jane Ann; Pomplun, Alan R.; Kiba, Grant W.

    2008-12-01

    The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer aided design tool) and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.

  4. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  5. The approach to engineering tasks composition on knowledge portals

    NASA Astrophysics Data System (ADS)

    Novogrudska, Rina; Globa, Larysa; Schill, Alexsander; Romaniuk, Ryszard; Wójcik, Waldemar; Karnakova, Gaini; Kalizhanova, Aliya

    2017-08-01

    The paper presents an approach to the composition of engineering tasks on engineering knowledge portals. The specific features of engineering tasks are highlighted, and their analysis forms the basis for the integration of partial engineering tasks. A formal algebraic system for engineering task composition is proposed, allowing context-independent formal structures to be defined for describing the elements of engineering tasks. A method of engineering task composition is developed that integrates partial calculation tasks into general calculation tasks on engineering portals, executed on user request. The real-world scenario «Calculation of the strength for the power components of magnetic systems» is presented, demonstrating the applicability and efficiency of the proposed approach.
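    One way to picture the composition of partial calculation tasks into a general task is to chain tasks whose declared outputs satisfy later tasks' declared inputs. The sketch below is a hypothetical illustration of that idea; the task names, formulas, and data structure are invented and are not the paper's algebraic formalism.

```python
# Hypothetical composition of partial calculation tasks on a knowledge portal.
# Each task declares the inputs it needs and the outputs it produces; a general
# task is assembled by running partial tasks whose inputs are already available.

def cross_section_area(width, height):
    return {"area": width * height}

def bending_stress(moment, area):
    # Deliberately simplified placeholder formula, not a real strength calculation.
    return {"stress": moment / area}

PARTIAL_TASKS = [
    {"name": "cross_section_area", "inputs": ["width", "height"], "run": cross_section_area},
    {"name": "bending_stress", "inputs": ["moment", "area"], "run": bending_stress},
]

def compose_and_run(user_request):
    """Repeatedly run any partial task whose inputs are satisfied by the
    accumulated context until no further task can fire."""
    context = dict(user_request)
    pending = list(PARTIAL_TASKS)
    progress = True
    while pending and progress:
        progress = False
        for task in list(pending):
            if all(k in context for k in task["inputs"]):
                args = {k: context[k] for k in task["inputs"]}
                context.update(task["run"](**args))
                pending.remove(task)
                progress = True
    return context

print(compose_and_run({"width": 0.05, "height": 0.2, "moment": 1200.0}))
```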

  6. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and supporting decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (FADEC, full-authority digital engine control), and the aircraft engine itself. The DECsim tool can allow simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed. Such factors include control system performance, reliability, weight, and bandwidth utilization.
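    One of the factors the abstract lists, bandwidth utilization, lends itself to a back-of-the-envelope check. The sketch below tallies periodic traffic from hypothetical smart nodes against an assumed bus capacity; node names, message sizes, rates, and capacity are all invented and unrelated to the DECsim tool.

```python
# Toy bandwidth-utilization estimate for a distributed engine control network.
# Node names, message sizes, and rates are invented for illustration only.

def bus_utilization(nodes, bus_capacity_bps):
    """Sum each node's periodic traffic (bits/s) and compare against bus capacity."""
    load_bps = sum(n["bytes_per_msg"] * 8 * n["rate_hz"] for n in nodes)
    return load_bps / bus_capacity_bps

smart_nodes = [
    {"name": "turbine_temp_sensor",  "bytes_per_msg": 16, "rate_hz": 100},
    {"name": "fuel_valve_actuator",  "bytes_per_msg": 24, "rate_hz": 50},
    {"name": "shaft_speed_sensor",   "bytes_per_msg": 16, "rate_hz": 200},
]

utilization = bus_utilization(smart_nodes, bus_capacity_bps=1_000_000)
print(f"Estimated bus utilization: {utilization:.1%}")
```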

  7. Flight Avionics Sequencing Telemetry (FAST) DIV Latching Display

    NASA Technical Reports Server (NTRS)

    Moore, Charlotte

    2010-01-01

    The NASA Engineering (NE) Directorate at Kennedy Space Center provides engineering services to major programs such as: Space Shuttle, International Space Station, and the Launch Services Program (LSP). The Avionics Division within NE provides avionics and flight control systems engineering support to LSP. The Launch Services Program is responsible for procuring safe and reliable services for transporting critical, one of a kind, NASA payloads into orbit. As a result, engineers must monitor critical flight events during countdown and launch to assess anomalous behavior or any unexpected occurrence. The goal of this project is to take a tailored Systems Engineering approach to design, develop, and test Iris telemetry displays. The Flight Avionics Sequencing Telemetry Delta-IV (FAST-D4) displays will provide NASA with an improved flight event monitoring tool to evaluate launch vehicle health and performance during system-level ground testing and flight. Flight events monitored will include data from the Redundant Inertial Flight Control Assembly (RIFCA) flight computer and launch vehicle command feedback data. When a flight event occurs, the flight event is illuminated on the display. This will enable NASA Engineers to monitor critical flight events on the day of launch. Completion of this project requires rudimentary knowledge of launch vehicle Guidance, Navigation, and Control (GN&C) systems, telemetry, and console operation. Work locations for the project include the engineering office, NASA telemetry laboratory, and Delta launch sites.

  8. Product Lifecycle Management and the Quest for Sustainable Space Exploration Solutions

    NASA Technical Reports Server (NTRS)

    Caruso, Pamela W.; Dumbacher, Daniel L.; Grieves, Michael

    2011-01-01

    Product Lifecycle Management (PLM) is an outcome of lean thinking to eliminate waste and increase productivity. PLM is inextricably tied to the systems engineering business philosophy, coupled with a methodology by which personnel, processes and practices, and information technology combine to form an architecture platform for product design, development, manufacturing, operations, and decommissioning. In this model, which is being implemented by the Marshall Space Flight Center (MSFC) Engineering Directorate, total lifecycle costs are important variables for critical decision-making. With the ultimate goal to deliver quality products that meet or exceed requirements on time and within budget, PLM is a powerful concept to shape everything from engineering trade studies and testing goals, to integrated vehicle operations and retirement scenarios. This briefing will demonstrate how the MSFC Engineering Directorate is implementing PLM as part of an overall strategy to deliver safe, reliable, and affordable space exploration solutions and how that strategy aligns with the Agency and Center systems engineering policies and processes. Sustainable space exploration solutions demand that all lifecycle phases be optimized, and engineering the next generation space transportation system requires a paradigm shift such that digital tools and knowledge management, which are central elements of PLM, are used consistently to maximum effect. Adopting PLM, which has been used by the aerospace and automotive industry for many years, for spacecraft applications provides a foundation for strong, disciplined systems engineering and accountable return on investment. PLM enables better solutions using fewer resources by making lifecycle considerations in an integrative decision-making process.

  9. Tool use in left brain damage and Alzheimer's disease: What about function and manipulation knowledge?

    PubMed

    Jarry, Christophe; Osiurak, François; Besnard, Jérémy; Baumard, Josselin; Lesourd, Mathieu; Croisile, Bernard; Etcharry-Bouyx, Frédérique; Chauviré, Valérie; Le Gall, Didier

    2016-03-01

    Tool use disorders are usually associated with difficulties in retrieving function and manipulation knowledge. Here, we investigate tool use (Real Tool Use, RTU), function (Functional Association, FA) and manipulation knowledge (Gesture Recognition, GR) in 17 left-brain-damaged (LBD) patients and 14 patients with Alzheimer's disease (AD). The LBD group exhibited the predicted deficit on RTU but not on FA and GR, while the AD patients showed deficits on GR and FA with preserved tool use skills. These findings question the role played by function and manipulation knowledge in actual tool use. © 2016 The British Psychological Society.

  10. Knowledge Translation Tools are Emerging to Move Neck Pain Research into Practice.

    PubMed

    Macdermid, Joy C; Miller, Jordan; Gross, Anita R

    2013-01-01

    Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain: a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain.

  11. Predicting cell viability within tissue scaffolds under equiaxial strain: multi-scale finite element model of collagen-cardiomyocytes constructs.

    PubMed

    Elsaadany, Mostafa; Yan, Karen Chang; Yildirim-Ayan, Eda

    2017-06-01

    Successful tissue engineering and regenerative therapy necessitate having extensive knowledge about mechanical milieu in engineered tissues and the resident cells. In this study, we have merged two powerful analysis tools, namely finite element analysis and stochastic analysis, to understand the mechanical strain within the tissue scaffold and residing cells and to predict the cell viability upon applying mechanical strains. A continuum-based multi-length scale finite element model (FEM) was created to simulate the physiologically relevant equiaxial strain exposure on cell-embedded tissue scaffold and to calculate strain transferred to the tissue scaffold (macro-scale) and residing cells (micro-scale) upon various equiaxial strains. The data from FEM were used to predict cell viability under various equiaxial strain magnitudes using stochastic damage criterion analysis. The model validation was conducted through mechanically straining the cardiomyocyte-encapsulated collagen constructs using a custom-built mechanical loading platform (EQUicycler). FEM quantified the strain gradients over the radial and longitudinal direction of the scaffolds and the cells residing in different areas of interest. With the use of the experimental viability data, stochastic damage criterion, and the average cellular strains obtained from multi-length scale models, cellular viability was predicted and successfully validated. This methodology can provide a great tool to characterize the mechanical stimulation of bioreactors used in tissue engineering applications in providing quantification of mechanical strain and predicting cellular viability variations due to applied mechanical strain.
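    The viability-prediction step described here couples per-cell strains (from the finite element model) with a stochastic damage criterion. A minimal sketch of that second step, assuming a normally distributed damage threshold whose parameters are invented rather than taken from the study:

```python
# Illustrative stochastic damage criterion: given cellular strains (e.g. exported
# from a finite element model), estimate viability by sampling a random damage
# threshold per trial. The threshold distribution parameters are assumptions,
# not values from the cited study.
import random

def predicted_viability(cell_strains, mean_threshold=0.12, sd_threshold=0.03, trials=10_000):
    """Fraction of (cell, trial) samples in which the cell's strain stays below
    a randomly drawn damage threshold."""
    survived = 0
    total = 0
    for strain in cell_strains:
        for _ in range(trials):
            threshold = random.gauss(mean_threshold, sd_threshold)
            survived += strain < threshold
            total += 1
    return survived / total

strains_from_fem = [0.05, 0.08, 0.10, 0.14]  # placeholder per-cell equiaxial strains
print(f"Predicted viability: {predicted_viability(strains_from_fem):.1%}")
```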

  12. Considerations for a design and operations knowledge support system for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Erickson, Jon D.; Crouse, Kenneth H.; Wechsler, Donald B.; Flaherty, Douglas R.

    1989-01-01

    Engineering and operations of modern engineered systems depend critically upon detailed design and operations knowledge that is accurate and authoritative. A design and operations knowledge support system (DOKSS) is a modern computer-based information system providing knowledge about the creation, evolution, and growth of an engineered system. The purpose of a DOKSS is to provide convenient and effective access to this multifaceted information. The complexity of Space Station Freedom's (SSF's) systems, elements, interfaces, and organizations makes convenient access to design knowledge especially important, when compared to simpler systems. The life cycle length, being 30 or more years, adds a new dimension to space operations, maintenance, and evolution. Provided here is a review and discussion of design knowledge support systems to be delivered and operated as a critical part of the engineered system. A concept of a DOKSS for Space Station Freedom (SSF) is presented. This is followed by a detailed discussion of a DOKSS for the Lyndon B. Johnson Space Center and Work Package-2 portions of SSF.

  13. Conserving intertidal habitats: What is the potential of ecological engineering to mitigate impacts of coastal structures?

    NASA Astrophysics Data System (ADS)

    Perkins, Matthew J.; Ng, Terence P. T.; Dudgeon, David; Bonebrake, Timothy C.; Leung, Kenneth M. Y.

    2015-12-01

    Globally, coastlines are under pressure as coastal human population growth and urbanization continues, while climatic change leads to stormier seas and rising tides. These trends create a strong and sustained demand for land reclamation and infrastructure protection in coastal areas, requiring engineered coastal defence structures such as sea walls. Here, we review the nature of ecological impacts of coastal structures on intertidal ecosystems, seek to understand the extent to which ecological engineering can mitigate these impacts, and evaluate the effectiveness of mitigation as a tool to contribute to conservation of intertidal habitats. By so doing, we identify critical knowledge gaps to inform future research. Coastal structures alter important physical, chemical and biological processes of intertidal habitats, and strongly impact community structure, inter-habitat linkages and ecosystem services while also driving habitat loss. Such impacts occur diffusely across localised sites but scale to significant regional and global levels. Recent advances in ecological engineering have focused on developing habitat complexity on coastal structures to increase biodiversity. 'Soft' engineering options maximise habitat complexity through inclusion of natural materials, species and processes, while simultaneously delivering engineering objectives such as coastal protection. Soft options additionally sustain multiple services, providing greater economic benefits for society, and resilience to climatic change. Currently however, a lack of inclusion and economic undervaluation of intertidal ecosystem services may undermine best practice in coastline management. Importantly, reviewed evidence shows mitigation and even restoration do not support intertidal communities or processes equivalent to pre-disturbance conditions. Crucially, an absence of comprehensive empirical baseline biodiversity data, or data comprising additional ecological parameters such as ecosystem functions and services, prohibits quantification of absolute and relative magnitudes of ecological impacts due to coastal structures or effectiveness of mitigation interventions. This knowledge deficit restricts evaluation of the potential of ecological engineering to contribute to conservation policies for intertidal habitats. To improve mitigation design and effectiveness, a greater focus on in-situ research is needed, requiring stronger and timely collaboration between government agencies, construction partners and research scientists.

  14. Algorithm Optimally Orders Forward-Chaining Inference Rules

    NASA Technical Reports Server (NTRS)

    James, Mark

    2008-01-01

    People typically develop knowledge bases in a somewhat ad hoc manner by incrementally adding rules with no specific organization. This often results in very inefficient execution of those rules, since they are so often order sensitive. This is relevant to tasks like the Deep Space Network in that it allows the knowledge base to be developed incrementally and then automatically ordered for efficiency. Although data flow analysis was first developed for use in compilers for producing optimal code sequences, its usefulness is now recognized in many software systems, including knowledge-based systems. However, this approach for exhaustively computing data-flow information cannot directly be applied to inference systems because of the ubiquitous execution of the rules. An algorithm is presented that efficiently performs a complete producer/consumer analysis for each antecedent and consequent clause in a knowledge base to optimally order the rules to minimize inference cycles. An algorithm was developed that optimally orders a knowledge base composed of forward-chaining inference rules such that independent inference cycle executions are minimized, thus resulting in significantly faster execution. This algorithm was integrated into the JPL tool Spacecraft Health Inference Engine (SHINE) for verification, and it resulted in a significant reduction in inference cycles for what was previously considered an ordered knowledge base. For a knowledge base that is completely unordered, the improvement is much greater.
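    The core idea, ordering rules so that producers of a fact fire before its consumers, amounts to a topological sort over producer/consumer dependencies. A minimal sketch using a toy rule representation (not SHINE's actual knowledge-base format):

```python
# Minimal producer/consumer ordering of forward-chaining rules: a rule that
# produces a fact is ordered before rules that consume it, via topological sort.
# The rule format below is a toy stand-in for illustration only.
from graphlib import TopologicalSorter

rules = {
    "r_alarm":    {"consumes": {"overheat"},   "produces": {"alarm"}},
    "r_overheat": {"consumes": {"temp_high"},  "produces": {"overheat"}},
    "r_temp":     {"consumes": {"raw_sensor"}, "produces": {"temp_high"}},
}

# Map each fact to the rules that produce it.
producers = {}
for name, rule in rules.items():
    for fact in rule["produces"]:
        producers.setdefault(fact, set()).add(name)

# Rule B depends on rule A if A produces a fact that B consumes.
deps = {name: set() for name in rules}
for name, rule in rules.items():
    for fact in rule["consumes"]:
        deps[name] |= producers.get(fact, set())

print(list(TopologicalSorter(deps).static_order()))
# -> ['r_temp', 'r_overheat', 'r_alarm']: producers fire before their consumers.
```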

  15. Product Lifecycle Management and the Quest for Sustainable Space Exploration Solutions

    NASA Technical Reports Server (NTRS)

    Caruso, Pamela W.; Dumbacher, Daniel L.

    2010-01-01

    Product Lifecycle Management (PLM) is an outcome of lean thinking to eliminate waste and increase productivity. PLM is inextricably tied to the systems engineering business philosophy, coupled with a methodology by which personnel, processes and practices, and information technology combine to form an architecture platform for product design, development, manufacturing, operations, and decommissioning. In this model, which is being implemented by the Engineering Directorate at the National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center, total lifecycle costs are important variables for critical decisionmaking. With the ultimate goal to deliver quality products that meet or exceed requirements on time and within budget, PLM is a powerful tool to shape everything from engineering trade studies and testing goals, to integrated vehicle operations and retirement scenarios. This paper will demonstrate how the Engineering Directorate is implementing PLM as part of an overall strategy to deliver safe, reliable, and affordable space exploration solutions. It has been 30 years since the United States fielded the Space Shuttle. The next generation space transportation system requires a paradigm shift such that digital tools and knowledge management, which are central elements of PLM, are used consistently to maximum effect. The outcome is a better use of scarce resources, along with more focus on stakeholder and customer requirements, as a new portfolio of enabling tools becomes second nature to the workforce. This paper will use the design and manufacturing processes, which have transitioned to digital-based activities, to show how PLM supports the comprehensive systems engineering and integration function. It also will go through a launch countdown scenario where an anomaly is detected to show how the virtual vehicle created from paperless processes will help solve technical challenges and improve the likelihood of launching on schedule, with less hands-on labor needed for processing and troubleshooting. Sustainable space exploration solutions demand that all lifecycle phases be optimized. Adopting PLM, which has been used by the automotive industry for many years, for aerospace applications provides a foundation for strong, disciplined systems engineering and accountable return on investment by making lifecycle considerations variables in an iterative decision-making process. This paper combines the perspectives of the founding father of PLM, along with the experience of Engineering leaders who are implementing these processes and practices real-time. As the nation moves from an industrial-based society to one where information is a valued commodity, future NASA programs and projects will benefit from the experience being gained today for the exploration missions of tomorrow.

  16. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  17. Learning from the Mars Rover Mission: Scientific Discovery, Learning and Memory

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2005-01-01

    Purpose: Knowledge management for space exploration is part of a multi-generational effort. Each mission builds on knowledge from prior missions, and learning is the first step in knowledge production. This paper uses the Mars Exploration Rover mission as a site to explore this process. Approach: Observational study and analysis of the work of the MER science and engineering team during rover operations, to investigate how learning occurs, how it is recorded, and how these representations might be made available for subsequent missions. Findings: Learning occurred in many areas: planning science strategy; using instruments within the constraints of the Martian environment, the Deep Space Network, and the mission requirements; using software tools effectively; and running two teams on Mars time for three months. This learning is preserved in many ways. Primarily it resides in individuals' memories. It is also encoded in stories, procedures, programming sequences, published reports, and lessons learned databases. Research implications: Shows the earliest stages of knowledge creation in a scientific mission, and demonstrates that knowledge management must begin with an understanding of knowledge creation. Practical implications: Shows that studying learning and knowledge creation suggests proactive ways to capture and use knowledge across multiple missions and generations. Value: This paper provides a unique analysis of the learning process of a scientific space mission, relevant for knowledge management researchers and designers, as well as demonstrating in detail how new learning occurs in a learning organization.

  18. Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL

    NASA Technical Reports Server (NTRS)

    Jenkins, J. Steven; Rouquette, Nicolas F.

    2012-01-01

    The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.
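    To make the complementarity concrete, the sketch below states a small fragment of a systems model as OWL axioms using rdflib, the kind of content a SysML block diagram would show graphically and a description-logic reasoner could then check. The class and property names are invented examples, not the authors' tooling.

```python
# Tiny OWL fragment built with rdflib, illustrating how structural facts that a
# SysML block diagram would show graphically can also be stated as axioms for a
# description-logic reasoner. Class and property names are invented examples.
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/mission#")
g = Graph()
g.bind("ex", EX)

# Classes: a Spacecraft is a kind of System; a PropulsionModule is a Subsystem.
for cls in (EX.System, EX.Subsystem, EX.Spacecraft, EX.PropulsionModule):
    g.add((cls, RDF.type, OWL.Class))
g.add((EX.Spacecraft, RDFS.subClassOf, EX.System))
g.add((EX.PropulsionModule, RDFS.subClassOf, EX.Subsystem))

# Object property mirroring a SysML composition ('part of') relationship.
g.add((EX.hasPart, RDF.type, OWL.ObjectProperty))
g.add((EX.hasPart, RDFS.domain, EX.System))
g.add((EX.hasPart, RDFS.range, EX.Subsystem))

# An individual spacecraft with one part; a reasoner could check consistency
# or answer conjunctive queries over models stated this way.
g.add((EX.orbiter1, RDF.type, EX.Spacecraft))
g.add((EX.prop1, RDF.type, EX.PropulsionModule))
g.add((EX.orbiter1, EX.hasPart, EX.prop1))

print(g.serialize(format="turtle"))
```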

  19. The Engineering of Engineering Education: Curriculum Development from a Designer's Point of View

    ERIC Educational Resources Information Center

    Rompelman, Otto; De Graaff, Erik

    2006-01-01

    Engineers have a set of powerful tools at their disposal for designing robust and reliable technical systems. In educational design these tools are seldom applied. This paper explores the application of concepts from the systems approach in an educational context. The paradigms of design methodology and systems engineering appear to be suitable…

  20. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand and perform, and it is therefore hard to make efficient, effective, and informed decisions. Management involves a multi-faceted operation that requires robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry and local and federal government agencies. IRSV is being designed to accommodate essential needs in the following aspects: 1) better understanding and enforcement of the complex inspection process, bridging the gap between evidence gathering and decision making through an ontological knowledge engineering system; 2) aggregation, representation and fusion of complex multi-layered heterogeneous data (e.g. infrared imaging, aerial photos and ground-mounted LIDAR) with domain application knowledge to support a machine-understandable recommendation system; 3) robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) integration of these needs through a flexible Service-oriented Architecture (SOA) framework to compose and provide services on demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring, both periodically (annually, monthly, even daily if needed) and after extreme events.
