Note: This page contains sample records for the topic "model project knowledge" from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results. Last update: August 15, 2014.
Purpose: The purpose of this paper is to analyze the impact of learning in a project-driven organization and demonstrate analytically how the learning, which takes place during the execution of successive projects, and the forgetting that takes place during the dormant time between the project executions, can impact performance and productivity in…
Knowledge is one of the key factors in the success of development projects. The implementation and impact of a project depend to a large extent on individual knowledge of project staff, access to local and global knowledge resources and recognition and integration of indigenous knowledge. We may look at the implementation of a development project as an effort to apply
Located at Georgetown University, the Visible Knowledge Project (VKP) "aims to improve the quality of college and university teaching by focusing on both student learning and faculty developments in technology-enhanced environments." By drawing on the strengths of their 12 partner schools (which include large research universities and community colleges), the various faculty from each institution involved with VKP document the impact of their various pedagogical and technological innovations on student learning and present them in a variety of formats. Many of these engaging projects and tools are available on the website, and may be searched by institution or discipline title. Quite a few will be of interest to instructors, as they feature such topics as Dante and the Journey to Freedom and Multiple Media for Cultural Analysis. Along with these helpful resources, visitors can learn more about the project, read the quarterly newsletter, and learn about individual participants who have taken these ideas to heart throughout the duration of the VKP.
The Spencer S. Eccles Health Sciences Library at the University of Utah is responsible for the Knowledge Weavers project, which aimed to "produce innovative multimedia resources which included tutorials, interactive cases, animations and other multimedia methods of delivery to support health sciences education." There are more than two dozen resources in subjects that include neurology, nurse midwifery, cardiology, and environmental medicine. Visitors can use the Image Banks/Collections to look at "The ECG Learning Center", which includes an interactive ECG tutorial, ECG terminology, and diagnostic criteria. The "Voluntary Control of the Facial Muscles" animation, under the "Animations" heading, allows visitors to see how the movement of the voluntary facial muscles is impeded by such things as infections and tumors. "EnviroDx" is under the heading "Interactive Cases" and operates as a virtual clinic for a doctor trying to diagnose a patient whose illness is likely caused by exposure to environmental factors.
This paper focuses on challenges in mentoring software development projects in high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…
This article examines a knowledge-based optimal scheduling system designed to monitor the progress towards project objectives and minimize delays in scheduled completion dates. This computerized system was developed to reduce management by exception techniques (crisis management) and to help concentrate efforts on project objectives. The system's operation and use are discussed.
Rahbar, F.F. (Iowa State Univ., Ames (United States)); Yates, J.K. (Univ. of Colorado, Boulder (United States)); Spencer, G.R. (Benham Group, Tulsa, OK (United States))
SUMMARY: This paper proposes a knowledge management framework for project definition of capital facility projects. The conceptual framework emphasizes project-based learning and the creation of group knowledge in early-phase project planning and design activity. The use of multi-disciplinary expertise in this phase of project development acknowledges the multiple decision frames by which stakeholders approach project solutions. This research views project definition as a collaborative decision-making process, and highlights the need for…
Representation of activity knowledge is important to any application which must reason about activities such as new product management, factory scheduling, robot control, vehicle control, software engineering, and air traffic control. This paper provides an integration of the underlying theories needed for modeling activities. Using the domain of large computer design projects as an example, the semantics of activity modeling is described. While the past research in knowledge representation has discovered most of the underlying concepts, our attempt is toward their integration. This includes the epistemological concepts for erecting the required knowledge structure; the concepts of activity, state, goal, and manifestation for the adequate description of the plan and the progress; and the concepts of time and causality to infer the progression among the activities. We also address the issues which arise due to the integration of aggregation, time, and causality among activities and states. PMID:21869291
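The concepts this record names (activity, state, precondition/effect, time, and causality) can be made concrete with a small sketch. All class and field names below are illustrative assumptions, not the paper's actual formalism:

```python
from dataclasses import dataclass

# Illustrative encoding of activity-modeling concepts: activities carry
# preconditions and effects (states) plus abstract time points, and a
# causality check infers progression between activities.
@dataclass(frozen=True)
class State:
    name: str

@dataclass
class Activity:
    name: str
    preconditions: list   # states that must hold before the activity starts
    effects: list         # states the activity brings about
    start: int = 0        # abstract time points
    end: int = 0

def causally_precedes(a: Activity, b: Activity) -> bool:
    """a causally precedes b if some effect of a is a precondition of b."""
    return any(e == p for e in a.effects for p in b.preconditions)

design = Activity("design", [], [State("spec_ready")], start=0, end=5)
build = Activity("build", [State("spec_ready")], [State("prototype")], start=5, end=9)
print(causally_precedes(design, build))  # True
```

A fuller model along the paper's lines would also represent goals and manifestations and reason over the time intervals; this sketch shows only the causal-ordering core.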
…models of problem solving. The mining approach, as exemplified by Mycin, a system which diagnoses pulmonary infections, considers expert knowledge and rule-based representation as essentially equivalent: knowledge acquisition is an interactive transfer of if-then associations. This uniform approach to representation was criticised by Clancey, who showed that, at least in the case of the Mycin knowledge base, it fails to capture important conceptual distinctions
This slide presentation reviews the Knowledge Management (KM) project of the Propulsion Systems Department at Marshall Space Flight Center. KM is needed to support knowledge capture, preservation and to support an information sharing culture. The presentation includes the strategic plan for the KM initiative, the system requirements, the technology description, the User Interface and custom features, and a search demonstration.
Purpose--Training multimedia projects often face identical knowledge-transfer obstacles that partly originate in the multidisciplinarity of the project team. The purpose of this paper is to describe these difficulties and the tools used to overcome them. In particular, the aim is to show how elements of cognitive psychology theory (concept maps,…
This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.
Parametric, or model-based, CBR is being used in IBM projects for knowledge management. This paper provides an overview of the use of CBR in an internal project called TEPM (Technology Enabled Project Model). TEPM uses CBR in a collaborative web-based environment to provide reuse support for practitioners doing project work. The paper outlines the knowledge model of an…
Purpose – Efficient project execution is a key business objective in many domains and particularly so for capital projects in the construction industry. Each construction project is unique in terms of how specialist professionals manage knowledge. Construction projects generate a large body of knowledge for sharing and reuse within the construction organization and across projects. In addition, projects provide opportunities
Francisco Loforte Ribeiro; Vanessa Leitão Tomásio Ferreira
The paper investigates the role of knowledge management in enabling project success, innovation, completion times, operational efficiency and the generation of new knowledge in development projects. Four projects in Uganda, Nigeria, and Cote d'Ivoire were used as case studies. The objective was to explore the nature of knowledge management practices in these projects in order to see how they could
Social work clinicians across health care settings are uniquely positioned to disseminate valuable practice experience, thereby contributing to knowledge development within their field of practice and across disciplines. Unfortunately, practitioners tend to shy away from writing and research, and are often reluctant to publicly disseminate their expertise through peer-reviewed journals and conference presentations. To better support health social workers in
This database project focuses on learning through knowledge integration; i.e., sharing and applying specialized (database) knowledge within a group, and combining it with other business knowledge to create new knowledge. Specifically, the Tiny Tots, Inc. project described below requires students to design, build, and instantiate a database system…
This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.
Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.
The Research Project Knowledge Base (RPKB) is currently being designed and will be implemented in a manner that is fully compatible and interoperable with enterprise architecture tools developed to support NASA's Applied Sciences Program. Through user needs assessments and collaboration with Stennis Space Center, Goddard Space Flight Center, and NASA DEVELOP staff, insights into the information needs for the RPKB were gathered from across NASA's scientific communities of practice. To enable efficient, consistent, standard, structured, and managed data entry and research results compilation, a prototype RPKB has been designed and fully integrated with the existing NASA Earth Science Systems Components database. The RPKB will compile research project and keyword information of relevance to the six major science focus areas, 12 national applications, and the Global Change Master Directory (GCMD). It will include information about projects awarded from NASA research solicitations, project investigator information, research publications, NASA data products employed, and model or decision support tools used or developed, as well as new data product information. The RPKB will be developed in a multi-tier architecture comprising a SQL Server relational database back end, middleware, and front-end client interfaces for data entry. The purpose of this project is to intelligently harvest the results of research sponsored by the NASA Applied Sciences Program and related research programs. We present various approaches for a wide spectrum of knowledge discovery of research results, publications, projects, etc. from the NASA Systems Components database and global information systems, and show how this is implemented in a SQL Server database. The application of knowledge discovery is useful for intelligent query answering and multiple-layered database construction.
Using advanced EA tools such as the Earth Science Architecture Tool (ESAT), the RPKB will enable NASA and partner agencies to efficiently identify significant results and will enable principal investigators to formulate experiment directions for new proposals.
Dabiru, L.; O'Hara, C. G.; Shaw, D.; Katragadda, S.; Anderson, D.; Kim, S.; Shrestha, B.; Aanstoos, J.; Frisbie, T.; Policelli, F.; Keblawi, N.
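The multi-tier RPKB design described above rests on a conventional relational back end. A minimal sketch of the kind of schema involved, using Python's built-in SQLite in place of SQL Server; all table names, column names, and sample rows are invented for illustration:

```python
import sqlite3

# Hypothetical, much-simplified RPKB-style schema: projects linked to
# publications and keywords, supporting keyword-driven discovery queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE project (
    project_id INTEGER PRIMARY KEY,
    title      TEXT NOT NULL,
    focus_area TEXT                -- e.g., one of the six science focus areas
);
CREATE TABLE publication (
    pub_id     INTEGER PRIMARY KEY,
    project_id INTEGER REFERENCES project(project_id),
    citation   TEXT
);
CREATE TABLE keyword (
    project_id INTEGER REFERENCES project(project_id),
    term       TEXT                -- e.g., a GCMD keyword
);
""")
conn.execute("INSERT INTO project VALUES (1, 'Flood mapping', 'Water Cycle')")
conn.execute("INSERT INTO keyword VALUES (1, 'FLOODS')")

# Discovery query: which projects are tagged with a given keyword?
rows = conn.execute("""
    SELECT p.title FROM project p
    JOIN keyword k ON k.project_id = p.project_id
    WHERE k.term = 'FLOODS'
""").fetchall()
print(rows)  # [('Flood mapping',)]
```

The real system would add tables for investigators, data products, and decision-support tools, plus the middleware and client tiers the record describes.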
Analysis of management discourses, especially project-based learning and knowledge management, indicates that such terms as human capital, working knowledge, and knowledge assets construe managerial workers as cogito-economic subjects. Although workplace learning should develop economically related capabilities, such discourses imply that these…
Knowledge management (KM) is becoming recognized as a valuable tool for establishing and maintaining competitive advantage. Decision superiority, the ultimate goal of KM, is only possible through the effective and efficient use of knowledge. But, to effectively and efficiently create and use KM, it is important to carefully select KM projects. This research assesses the usefulness of a KM project
Kevin G. Budai; CAPT USAF; Alan R. Heminger; Summer Bartczak
This paper examines how and why project teams may be conceived as highly creative and generative knowledge creation spaces\\u000a – as opposed to traditionally and primarily being conceived as only temporal task focused entities where knowledge is simply\\u000a exchanged. In this qualitative analysis which draws on situated learning theory, project teams offer significant and yet generally\\u000a underexploited personal knowledge growth
In the process of enterprise ITSM operation, the management of IT knowledge is of crucial importance. Based on the knowledge management process and the SKMS of the service transition process in ITIL v3, this paper proposes a reference model for enterprise ITSM knowledge management, which provides a detailed design for function layers such as data collection, information integration, knowledge processing, and knowledge presentation,
The embedding of new management knowledge in project-based organization is made particularly problematic due to the attenuated links that exist between organization- wide change initiatives and project management practice. To explore the complex processes involved in change in project-based organization, this paper draws upon a case study of change within the UK construction industry. Analysing the case study through the
The role of the NASA/DOD Aerospace Knowledge Diffusion Research Project in helping to maintain U.S. competitiveness is addressed. The phases of the project are examined in terms of the focus, emphasis, subjects, methods, and desired outcomes. The importance of the project to aerospace R&D is emphasized.
Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.
Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others.
The default "wiki" document lets users edit within the browser so others can easily collaborate on the same document, even allowing the author to select those who may edit and approve the document. To maintain knowledge integrity, all documents are moderated before they are visible to the public. Modeling Guru, running on Clearspace by Jive Software, has been an active resource to the NASA modeling and HEC communities for more than a year and currently has more than 100 active users. SIVO will soon install live instant messaging support, as well as a user-customizable homepage with social-networking features. In addition, SIVO plans to implement a large dataset/file storage capability so that users can quickly and easily exchange datasets and files with one another. Continued active community participation combined with periodic software updates and improved features will ensure that Modeling Guru remains a vibrant, effective, easy-to-use tool for the NASA scientific community.
Describes an intelligent Web-based construction project management system called VHBuild.com which integrates project management, knowledge management, and artificial intelligence technologies. Highlights include an information flow model; time-cost optimization based on genetic algorithms; rule-based drawing interpretation; and a case-based…
Li, Heng; Tang, Sandy; Man, K. F.; Love, Peter E. D.
Based on a taxonomy of knowledge management processes, provides a synopsis of technologies and vendors that support knowledge management. Proposes a model for classifying the various types of technologies related to knowledge management that are most often used in institutional research. (EV)
We present a maturity model that reveals the levels an organization traverses in the move from independent and distinct to integrated analysis and development of information systems and knowledge management systems. The Knowledge Management Systems Integration Maturity Model emerged from a 5-year action research project in the Israeli Navy documenting the development of 15 systems as the organization went through
This study is a result from an increasing need to develop processes and improve knowledge management and working methods in business process development projects in a fast growing case company. This study has been done in co-operation of the case company ...
This pamphlet surveys the research and literature dealing with the identification and use of organized knowledge in educational systems. Focusing on curriculum development and planning, the document discusses curriculum programs and projects in four fields: (1) physical and biological sciences, (2) mathematics, (3) social studies, and (4) English.…
This paper presents a "standard" method that is being developed by ARESlab of Rome's La Sapienza University for the documentation and representation of archaeological artifacts and structures through automatic photogrammetry software. The image-based 3D modeling (IBM) technique was applied in two projects: in Sarno and in Rome. The first is a small city in the Campania region along Via Popilia, known as the ancient way from Capua to Rhegion. The interest in this city is based on the recovery of over 2100 tombs from the local necropolis, which contained more than 100,000 artifacts collected in the "Museo Nazionale Archeologico della Valle del Sarno". In Rome the project regards the archaeological area of the Insula Volusiana, placed in the Forum Boarium close to the Sant'Omobono sacred area. During the studies, photographs were taken with Canon EOS 5D Mark II and Canon EOS 600D cameras. 3D models and meshes were created in Photoscan software. The TOF-CW Z+F IMAGER® 5006h laser scanner was used for dense data collection in the archaeological area of Rome and to make a metric comparison between range-based and image-based techniques. In these projects, IBM as a low-cost technique proved to deliver high accuracy if planned correctly, and it also showed how it helps to obtain a relief of complex strata and architectures compared to traditional manual documentation methods (e.g. two-dimensional drawings). The multidimensional recording can be used for future studies of the archaeological heritage, especially given the "destructive" character of an excavation. The presented methodology is suitable for 3D registration, and its accuracy also improved the scientific value.
We created three models to represent a comprehensive knowledge model: the Stages of Knowledge Management Model (Forrester), the Expanded Life-Cycle Information Management Model, and the Organizational Knowledge Management Model. In building a series of models, we started w...
Project LINK (A Live and Interactive Network of Knowledge), a collaboration among Eureka Scientific, Inc., the Exploratorium, and NASA/Ames Research Center will demonstrate video-conferencing capabilities from the Kuiper Airborne Observatory (KAO) to the San Francisco Exploratorium in the context of science education outreach to K-12 teachers and students. The project is intended to pilot-test strategies for facilitating the live interface between scientists aboard the KAO and K-12 teachers and students through the resources and technical expertise available at science museums and private industry. The interface will be based on Internet/CuSeeMe videoconferencing capabilities which will allow teachers and students at the Exploratorium to collaborate in a live and interactive manner with teachers and scientists aboard the KAO. The teacher teams chosen for the on-board experiments represent rural and urban school districts in California. The teachers will interface with colleagues as part of the NASA-funded Project FOSTER (Flight Opportunities for Science Teacher EnRichment). Our project will serve to demonstrate live interface capabilities in preparation for the "Live from the Stratosphere" Project. Teachers from Project LINK will participate on two flights aboard the KAO during the Summer of 1995. Lesson plans, classroom activities, project description and lessons learned will be disseminated through the World Wide Web. Project LINK is made possible by a grant from NASA to Eureka Scientific, Inc.
Introduction In this article, we present a methodological design for qualitative investigation of knowledge translation (KT) between participants in a participatory research project. In spite of a vast expansion of conceptual models and frameworks for conducting KT between research and practice, few models emphasise how KTs come about. Better understanding of the actions and activities involved in a KT process is important for promoting diffusion of knowledge and improving patient care. The purpose of this article is to describe a methodological design for investigating how KTs come about in participatory research. Methods and analysis The article presents an ethnographic study which investigates meetings between participants in a participatory research project. The participants are researchers and primary healthcare clinicians. Data are collected through observation, interviews and document studies. The material is analysed using the analytical concepts of knowledge objects, knowledge forms and knowledge positions. These concepts represent an analytical framework enabling us to observe knowledge and how it is translated between participants. The main expected outcome of our study is to develop a typology of KT practices relevant to participatory research. Ethics and dissemination The project has been evaluated and approved by the Norwegian Social Science Data Services. Informed consent was obtained for all participants. The findings from this study will be disseminated through peer-reviewed publications and national and international conference presentations.
This paper discusses the evaluation of an informal science education project, The Birdhouse Network (TBN) of the Cornell Laboratory of Ornithology. The Elaboration Likelihood Model and the theory of Experiential Education were used as frameworks to analyse the impact of TBN on participants' attitudes toward science and the environment, on their knowledge of bird biology, and on their understanding of
Content extraction from medical texts is achievable today by linguistic applications, in so far as sufficient domain knowledge is available. Such knowledge represents a model of the domain and is hard to collect with sufficient depth and good coverage, despite numerous attempts. To leverage this task is a priority in order to benefit from the awaited linguistic tools. The light model is designed with this goal in mind. Syntactic and lexical information are generally available with large lexicons. A domain model should add the necessary semantic information. The authors have designed a light knowledge model for the collection of semantic information on the basis of the recognized syntactical and lexical attributes. It has been tailored for the acquisition of enough semantic information to retrieve terms of a controlled vocabulary from free texts, for example, to retrieve MeSH terms from patient records. PMID:11833480
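The term-retrieval step this record describes, matching normalized text spans against a controlled vocabulary, can be sketched minimally. The vocabulary entries and the clinical note below are invented stand-ins for MeSH terms and patient-record text:

```python
import re

# Toy sketch of controlled-vocabulary term retrieval: token spans of the
# normalized text are matched against a small in-memory vocabulary.
VOCAB = {"myocardial infarction", "hypertension", "diabetes mellitus"}

def extract_terms(text, vocab=VOCAB, max_len=3):
    """Return every vocabulary term appearing as a token span in `text`."""
    tokens = re.findall(r"[a-z]+", text.lower())   # lexical normalization
    found = set()
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + max_len, len(tokens)) + 1):
            phrase = " ".join(tokens[i:j])
            if phrase in vocab:
                found.add(phrase)
    return found

note = "Patient with hypertension, suspected myocardial infarction."
print(sorted(extract_terms(note)))  # ['hypertension', 'myocardial infarction']
```

A real system would add the syntactic and semantic attributes the light model supplies (synonyms, inflection, term hierarchy); this sketch shows only the span-matching core.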
Abstract Applying the method of inductive theory building, we have developed a case study based on the Linux kernel development process to build a model of Open Source knowledge creation. The Linux model touches upon a broad set of issues revealing the nature of our connected society, because the Linux project was among the first attempts that make a deliberate effort to use
The SAGE Project is a multi-institution effort to enable encoding and dissemination of interoperable, computable clinical practice guidelines. We have developed a standards-based guideline-knowledge representation model that specifies computable guideline content. We incorporate a…
Robert M. Abarbanel; Nick Beard; James R. Campbell; Julie I. Glasgow; Stanley M. Huff; James G. Mansfield; Eric Mays; James McClay; David N. Mohr; Mark A. Musen; Craig G. Parker; Prabhu Ram; Roberto A. Rocha; Sidna M. Scheitel; Samson W. Tu; Tony Weida; Qin Ye
As human society entered the era of the knowledge economy, lasting in-depth studies have been done on knowledge learning, knowledge management, and other fields. How to identify, acquire, develop, decompose, store, and transmit knowledge has become the focus of research. The creation of knowledge networks provides a better solution to this problem. A knowledge network is composed of knowledge points. Each knowledge
The aim of this case study is to investigate the knowledge integration process of college students completing a project in a web project course that guided students doing their projects online. We elicited five knowledge integration processes from the discourse of the college students and the advisor, who discussed the group project online. The research group of four was chosen purposively
The Lee College (Baytown, Texas) Rural Health Occupations Model Project was designed to provide health occupations education tailored to disadvantaged, disabled, and/or limited-English-proficient high school students and adults and thereby alleviate the shortage of nurses and health care technicians in two rural Texas counties. A tech prep program…
Six Sigma is a powerful approach that moves toward continuous and sustainable improvement by increasing customer satisfaction and decreasing activity time and the number of failures. On the other hand, Knowledge Management (KM) is a modern approach that deals with the greatest capital of an organization, i.e. knowledge. Although the creation of more organizational knowledge for performance improvement, organization deployment, competitiveness increase, and acquiring
This study examined connections between science literacy and writing. Science e-mails were written as content-oriented professional development materials for K-8 teachers. E-mail drafts underwent multiple revisions. The study data included drafts, final e-mails, and feedback from the supervising scientist and the e-mails' teacher audience. The analyses, informed by Bereiter & Scardamalia's knowledge-transforming process (1987), Schindler's audience theories (2001), and Johnson and Aragon's on-line instruction framework (2003), sought connections among three components: the writer's struggle between content and discourse, audience, and format. The e-mail drafts indicated a large percentage of text changes involving two or more of the components, primarily concerning discourse. Redundancies surfaced among the components, indicating Bereiter and Scardamalia's knowledge-transforming process sufficiently explained the e-mail project; additional format and audience models were unnecessary. Recommendations for extending the knowledge-transforming process specifically for science are included.
Abstract Genome-scale metabolic model reconstruction is a complicated process beginning with (semi-)automatic inference of the reactions participating in the organism's metabolism, followed by many iterations of network analysis and improvement. Despite advances in automatic model inference and analysis tools, reconstruction may still miss some reactions or add erroneous ones. Consequently, a human expert's analysis of the model will continue to play an important role in all the iterations of the reconstruction process. This analysis is hampered by the size of the genome-scale models (typically thousands of reactions), which makes it hard for a human to understand them. To aid human experts in curating and analyzing metabolic models, we have developed a method for knowledge-based generalization that provides a higher-level view of a metabolic model, masking its inessential details while presenting its essential structure. The method groups biochemical species in the model into semantically equivalent classes based on the ChEBI ontology, identifies reactions that become equivalent with respect to the generalized species, and factors those reactions into generalized reactions. Generalization allows curators to quickly identify divergences from the expected structure of the model, such as alternative paths or missing reactions, that are the priority targets for further curation. We have applied our method to genome-scale yeast metabolic models and shown that it improves understanding by helping to identify both specificities and potential errors. PMID:24766276
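The generalization step described above, grouping biochemical species into ontology classes and factoring reactions that become equivalent, can be sketched on a toy example. The ontology map below is invented for illustration; a real implementation would resolve classes against ChEBI:

```python
# Toy knowledge-based generalization: species map to ontology classes, and
# reactions that become identical after the mapping collapse into one
# generalized reaction. The species names and classes are illustrative.
ONTOLOGY = {
    "glucose": "hexose", "fructose": "hexose",
    "glucose-6P": "hexose phosphate", "fructose-6P": "hexose phosphate",
    "ATP": "ATP", "ADP": "ADP",
}

def generalize(reaction):
    """Lift a (substrates, products) reaction to its ontology classes."""
    subs, prods = reaction
    return (frozenset(ONTOLOGY[s] for s in subs),
            frozenset(ONTOLOGY[p] for p in prods))

reactions = [
    ({"glucose", "ATP"}, {"glucose-6P", "ADP"}),    # hexokinase on glucose
    ({"fructose", "ATP"}, {"fructose-6P", "ADP"}),  # hexokinase on fructose
]
generalized = {generalize(r) for r in reactions}
print(len(reactions), "->", len(generalized))  # 2 -> 1
```

In a genome-scale model, this kind of factoring is what lets a curator see one "hexose phosphorylation" pattern instead of many near-duplicate reactions, making missing or divergent paths stand out.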
In most current applications of belief networks, domain knowledge is represented by a single belief network that applies to all problem instances in the domain. In more complex domains, problem-specific models must be constructed from a knowledge base encoding probabilistic relationships in the domain. Most work in knowledge-based model construction takes the rule as the basic unit of knowledge.
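Rule-based model construction of this kind can be sketched in a few lines. The rule base, probabilities, and variable names below are invented for illustration; a real system would attach full conditional distributions rather than single numbers.

```python
# Hypothetical sketch of knowledge-based model construction: probabilistic
# "rules" live in a knowledge base, and only the rules relevant to a given
# query and evidence are assembled into a problem-specific network.
def build_model(rules, query, evidence):
    """Backward-chain from the query to collect the relevant subnetwork."""
    relevant, frontier = set(), {query} | set(evidence)
    while frontier:
        var = frontier.pop()
        relevant.add(var)
        for parents, child, _prob in rules:
            if child == var:
                frontier |= set(parents) - relevant
    return [(p, c, pr) for p, c, pr in rules if c in relevant]

# Toy rule base: (parents, child, P(child | parents all true))
RULES = [
    (("rain",), "wet_grass", 0.9),
    (("sprinkler",), "wet_grass", 0.8),
    (("wet_grass",), "slippery", 0.7),
    (("earthquake",), "alarm", 0.6),   # unrelated; should be left out
]

model = build_model(RULES, query="slippery", evidence=["rain"])
print(model)   # the earthquake/alarm rule is excluded as irrelevant
```

The point of the sketch is the construction step itself: the network handed to the inference engine contains only the rules reachable from the current problem instance, not the whole knowledge base.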
The report covers Phase I of the Burn Injury Education Demonstration Project, a four-phased project designed to explore the feasibility of using educational intervention strategies to increase knowledge and appropriate behaviors and attitudes to reduce th...
Project CAPABLE (Classroom Action Program: Aim: Basic Learning Effectiveness) is a classroom approach which integrates the basic learning skills with content. The goal of the project is to use basic learning skills to enhance the learning of content and at the same time use the content to teach basic learning skills. This manual illustrates how…
AMMA-MIP is a Model Intercomparison Project developed within the framework of the African Monsoon Multidisciplinary Analyses project (AMMA). It is a relatively light intercomparison and evaluation exercise of both global and regional atmospheric models, focused on the study of the seasonal and intraseasonal variations of the climate and rainfall over the Sahel. Taking advantage of the relative zonal
Studied the meaning of symmetry of knowledge-based roles for knowledge construction and sharing in social interaction during the report writing phase of an experimental science learning project with four ninth grade students in Finland. Identified four patterns of interaction that differed in terms of their symmetry of knowledge-based roles and…
This Project Background Information and State of Knowledge document provides supporting materials that help readers understand the following elements: the Hanford Site environmental setting; the waste disposal history of the Hanford Site; the regulatory framework and management strategies in effect at the time that the Integration Project was established; and the technical state of knowledge regarding key technical areas within
Purpose: This conceptual paper aims to explain how "project management centres of excellence (CoEs)", a particular class of knowledge network, can be viewed as providing great potential for assisting project management (PM) teams to make wise decisions. Design/methodology/approach: The paper presents a range of knowledge network types and…
Repeated failures have plagued many development programs that were sociologically ill-informed or ill-conceived. More recently, however, a combination of factors is leading to increased use of knowledge derived from sociology and anthropology in developme...
Recently, interest in the notion of collaborative product innovation (CPI) from academia and industry has increased significantly. Comprehensive requirements analysis along with a cogent framework, however, does not address the problem of how to support product innovation with distributed knowledge resources. A knowledge decision support model is proposed, including collaborative knowledge database setting, knowledge mining and innovative design
Yu Yang; Xuedong Liang; Jie Yang; Zijun Zhou; Jing Wang
Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.
It is evident from the literature that university-industry (U-I) relationships and their subsequent knowledge transfers are topics of high political, economic, managerial and academic interest. Indeed, technological knowledge is seen as a major source of long-term economic growth and its transfer to the firm is critical since it acts as a significant innovation factor. In order to access
The CARETAKER project, a 30-month project that has just kicked off, aims at studying, developing and assessing multimedia knowledge-based content analysis, knowledge extraction components, and metadata management sub-systems in the context of automated situation awareness, diagnosis and decision support. More precisely, CARETAKER will focus on the extraction of structured knowledge from large multimedia collections recorded over networks of
C. Carincotte; X. Desurmont; B. Ravera; F. Bremond; J. Orwell; S. A. Velastin; J. M. Odobez; B. Corbucci; J. Palo; J. Cernocky
Creativity and knowledge management are both important competences that university students need to strive to develop. This study therefore developed and evaluated an instructional program for improving university students' creativity based on a blended knowledge-management (KM) model that integrates e-learning and three core processes of KM:…
Project-based learning engages students in problem solving through artefact design. However, previous studies of online project-based learning have focused primarily on the dynamics of online collaboration; students' knowledge construction throughout this process has not been examined thoroughly. This case study analyzed the relationship between students' levels of knowledge construction during asynchronous online discussions with respect to engagement in project-based learning.
Joyce Hwee Ling Koh; Susan C. Herring; Khe Foon Hew
Increasing reliance on and investment in information technology and electronic networking systems presupposes that computing and information technology will play a major role in the diffusion of aerospace knowledge. Little is known, however, about actual information technology needs, uses, and problems within the aerospace knowledge diffusion process. The authors state that the potential contributions of information technology to increased productivity and competitiveness will be diminished unless empirically derived knowledge regarding the information-seeking behavior of the members of the social system - those who are producing, transferring, and using scientific and technical information - is incorporated into a new technology policy framework. Research into the use of information technology and electronic networks by U.S. aerospace engineers and scientists, collected as part of a research project designed to study aerospace knowledge diffusion, is presented in support of this assertion.
Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.
Modern scientific enterprises are inherently knowledge-intensive. In general, scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data in order to create inputs for large-scale computational simulations. The results of these simulations must then be analyzed, leading to refinements of inputs and models and further simulations. Further, these results must be managed and archived to provide justifications for publications and regulatory decisions that are based on these models. In this paper we describe our Velo framework that is designed as a reusable, domain independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates, and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. In this paper we describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate a realization of Velo, we describe the Geologic Sequestration Software Suite (GS3) that has been developed to support geologic sequestration modeling. This provides a concrete example of the inherent extensibility and utility of our approach.
Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Madison, Michael C.; Schuchardt, Karen L.
Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.
Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan
This paper describes an approach to using agent technology to extend the automated discovery mechanism of the Knowledge Encapsulation Framework (KEF). KEF is a suite of tools to enable the linking of knowledge inputs (relevant, domain-specific evidence) to modeling and simulation projects, as well as other domains that require an effective collaborative workspace for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a semantic wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.
Haack, Jereme N.; Cowell, Andrew J.; Marshall, Eric J.; Fligg, Alan K.; Gregory, Michelle L.; McGrath, Liam R.
Purpose: The purpose of this paper is to present a "soft methodology" model in knowledge management that addresses the problem of accessing and managing one particular type of knowledge: personal (implicit/tacit) knowledge. Design/methodology/approach: The model is based on the theories and methodologies of grounded theory, adult learning,…
Objectives: The effectiveness of computerized clinical decision support systems (CDSS) depends on the quality of the knowledge they refer to. In this article, we are interested in the acquisition, modeling and representation of the knowledge embedded in the
Explains how to construct a three-dimensional model for stereographic projection. It will be suitable for presenting the symmetry of crystal systems, and will help physics students understand the nature of crystallography. (GA)
This document constitutes Andersen Consulting's Final Report on its project, the Knowledge-Based Software Assistant/Advanced Development Model (KBSA/ADM). Instantiated under contract F3O602-93-C-0015 by Rome Laboratory in 1992, the KBSA/ADM project signif...
Clancey (1992) proposed the model-construction framework as a way to explain the reasoning of knowledge-based systems (KBSs), based on his realization that all KBSs construct implicit or explicit situation-specific models (SSMs). An SSM is a rational argument that explains the solution produced for a specific problem situation pertaining to a target application task (e.g., SSMs constructed
The maintenance of knowledge-rich clinical decision-support systems is challenging, in particular in the complex setting of a large academic medical center. Distributing the maintenance tasks to the source of expertise can address scalability, accuracy and currency issues. It also helps to foster a more global sense of ownership among the system users. The knowledge maintenance model must provide processes and tools to deal with a wide range of stakeholders (resident and attending physicians, consulting specialists, other care providers, case managers, ancillary departments), with knowledge embedded in legacy departmental systems, and with the continuous evolution of the content and form of the knowledge base. We describe and illustrate the "knowledge library" model in use at Vanderbilt University Medical Center for the distributed maintenance of the integrated knowledge base that drives the WizOrder clinical decision-support, physician order entry, and notes capture system.
Research is being conducted in the Indexing Aid Project, Automated Classification and Retrieval Program, Lister Hill National Center for Biomedical Communications, on the development of interactive knowledge-based systems for computer-assisted indexing of...
This paper reports progress and practical experience in security-requirements engineering using the security center Knowledge Web (KWeb) as a case study. It describes the project, architecture, and the approach of the Spiral System Implementation Methodo...
Knowledge of mathematical equivalence, the principle that 2 sides of an equation represent the same value, is a foundational concept in algebra, and this knowledge develops throughout elementary and middle school. Using a construct-modeling approach, we developed an assessment of equivalence knowledge. Second through sixth graders (N = 175)…
Rittle-Johnson, Bethany; Matthews, Percival G.; Taylor, Roger S.; McEldoon, Katherine L.
Another language for expressing 'knowing that' is given together with axioms and rules of inference and a Kripke type semantics. The formalism is extended to time-dependent knowledge. Completeness and decidability theorems are given. The problem of the wi...
The knowledge-based supervision system described is intended to detect cutter damage in milling machines, using x-axis and y-axis displacement signals. The model hierarchically integrates real-time signal processing algorithms in a knowledge-based processing environment where rules and objects coexist. A deeply coupled, numeric/symbolic model is developed. It incorporates physical models and empirical knowledge. It is implemented in a multiprocessor architecture
This paper discusses data mining--an end-to-end (ETE) data analysis tool that is used by researchers in higher education. It also relates data mining and other software programs to a brand new concept called "Knowledge Management." The paper culminates in the Tier Knowledge Management Model (TKMM), which seeks to provide a stable structure with…
Purpose – This paper aims to explore the pertinent issues of knowledge management in tourism using the example of tourism organizations in Austria. Design/methodology/approach – The paper undertakes a review of the relevant literature before applying Grant's model of knowledge management to Austrian tourism organizations. Data are gathered by means of a standardized online questionnaire. Findings – The results of
The development of knowledge-based systems involves the management of a diversity of knowledge sources, computing resources and system users, often geographically distributed. The knowledge acquisition, modeling and representation communities have developed a wide range of tools relevant to the development and management of large-scale knowledge-based systems, but the majority of these tools run on individual workstations and use specialist data
Software engineering (SE) and knowledge engineering (KE) develop software systems using different construction process models. Because of the growing complexity of the problems to be solved by computers, the conventional systems (CS) and knowledge-based systems (KBS) software process is at present passing through a period of integration. In this paper, we propose a software process model applicable to both CS and KBS. The model
Silvia Teresita Acuña; Marta López; Natalia Juristo Juzgado; Ana María Moreno
This paper proposes a web-based testing model for project management. The model is based on ontology for encoding project management knowledge, so it is able to facilitate resource extraction in the web-based testware environment. It also allows generation of parameterized tests, according to the targeted difficulty level. The authors present the theoretical approaches that led to the model: semantic nets and concept space graphs have an important role in model designing. The development of the ontology model is made with SemanticWorks software. The test ontology has applicability in project management certification, especially in those systems with different levels, as the IPMA four-level certification system.
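The core mechanism — an ontology of concepts annotated with difficulty levels, from which tests are generated for a targeted level — can be sketched briefly. The concept names, difficulty scale, and function names below are hypothetical illustrations, not the paper's actual ontology (which was built with SemanticWorks).

```python
# Minimal sketch of difficulty-parameterized test generation from an
# ontology-like concept store. All names and levels are made up.
import random

# Hypothetical project-management concepts tagged with a difficulty level,
# loosely echoing a multi-level certification scheme.
ONTOLOGY = {
    "scope definition": 1,
    "Gantt scheduling": 1,
    "critical path analysis": 2,
    "earned value management": 2,
    "risk response planning": 3,
    "portfolio governance": 3,
}

def generate_test(level, n_items, seed=0):
    """Pick n_items concepts at or below the targeted difficulty level."""
    pool = [concept for concept, d in ONTOLOGY.items() if d <= level]
    rng = random.Random(seed)          # seeded for reproducible test forms
    return rng.sample(pool, min(n_items, len(pool)))

print(generate_test(level=2, n_items=3))
```

A real implementation would draw question templates from the ontology's relations rather than bare concept names, but the parameterization by level works the same way.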
We created three models to represent a comprehensive knowledge model: · Stages of Knowledge Management Model (Forrester) · Expanded Life-Cycle Information Management Model · Organizational Knowledge Management Model. In building a series of models, we started with an attempt to create a graphical model that illustrates the ideas outlined in the Forrester article (Leadership Strategies, Vol. 3, No. 2, November/December 1997). We then expanded and detailed a life-cycle model. Neither of these effectively reflected how to manage the complexities involved in weaving local, enterprise, and global information into an easily navigated resource for end users. We finally began to synthesize these ideas into an Organizational Knowledge Management Model. This model acknowledges the relevance of life-cycle management for different granularities of information collections and places it in the context of the integrating infrastructure needed to assist end users.
Project-based learning engages students in problem solving through artefact design. However, previous studies of online project-based learning have focused primarily on the dynamics of online collaboration; students' knowledge construction throughout this process has not been examined thoroughly. This case study analyzed the relationship between…
Koh, Joyce Hwee Ling; Herring, Susan C.; Hew, Khe Foon
Explores the utility of mental models as learning outcomes in using complex and situated learning environments. Describes two studies: one aimed at eliciting mental models in the heads of novice refrigeration technicians, and the other an ethnographic study eliciting knowledge and models within the community of experienced refrigeration…
Purpose: The paper seeks to develop a business model that shows the impact of operational knowledge assets on intellectual capital (IC) components and business performance and use the model to show how knowledge assets can be prioritized in driving resource allocation decisions. Design/methodology/approach: Quantitative data were collected from 84…
A knowledge representation model for the nuclear power field is proposed. The model is a generalized production rule function inspired by a neural network approach that enables the representation of physical systems of nuclear power plants. The article discusses some techniques currently used for knowledge representation of physical systems and argues that the proposed approach overcomes some aspects of the
In theories and studies of persuasion, people's personal knowledge about persuasion agents' goals and tactics, and about how to skillfully cope with these, has been ignored. We present a model of how people develop and use persuasion knowledge to cope with persuasion attempts. We discuss what the model implies about how consumers use marketers' advertising and selling attempts to refine
Describes the Microcomputer Infusion Project (MIP), which was developed at Arizona State University to provide faculty with the necessary hardware, software, and training to become models of computer use in both lesson development and presentation for preservice teacher education students. Topics discussed include word processing; database…
For model postsecondary demonstration projects serving individuals with disabilities, a portfolio of project activities may serve as a method for program evaluation, program replication, and program planning. Using a portfolio for collecting, describing, and documenting a project's successes, efforts, and failures enables project staff to take…
Advances in food transformation have dramatically increased the diversity of products on the market and, consequently, exposed consumers to a complex spectrum of bioactive nutrients whose potential risks and benefits have mostly not been confidently demonstrated. Therefore, tools are needed to efficiently screen products for selected physiological properties before they enter the market. NutriChip is an interdisciplinary modular project funded by the Swiss programme Nano-Tera, which groups scientists from several areas of research with the aim of developing analytical strategies that will enable functional screening of foods. The project focuses on postprandial inflammatory stress, which potentially contributes to the development of chronic inflammatory diseases. The first module of the NutriChip project is composed of three in vitro biochemical steps that mimic the digestion process, intestinal absorption, and subsequent modulation of immune cells by the bioavailable nutrients. The second module is a miniaturised form of the first module (gut-on-a-chip) that integrates a microfluidic-based cell co-culture system and super-resolution imaging technologies to provide a physiologically relevant fluid flow environment and allows sensitive real-time analysis of the products screened in vitro. The third module aims at validating the in vitro screening model by assessing the nutritional properties of selected food products in humans. Because of the immunomodulatory properties of milk as well as its amenability to technological transformation, dairy products have been selected as model foods. The NutriChip project reflects the opening of food and nutrition sciences to state-of-the-art technologies, a key step in the translation of transdisciplinary knowledge into nutritional advice. PMID:22943857
The aim of this work in progress is to implement a generative and validated model of Theory of Emotional Mind in a knowledge-based system. Model requirements are elucidated via human subject responses to autism therapy exercises.
This project will provide descriptive and analytical data regarding the flow of STI at the individual, organizational, national, and international levels. It will examine both the channels used to communicate information and the social system of the aerospace knowledge diffusion process. Results of the project should provide useful information to R and D managers, information managers, and others concerned with improving access to and use of STI. Objectives include: (1) understanding the aerospace knowledge diffusion process at the individual, organizational, and national levels, placing particular emphasis on the diffusion of Federally funded aerospace STI; (2) understanding the international aerospace knowledge diffusion process at the individual and organizational levels, placing particular emphasis on the systems used to diffuse the results of Federally funded aerospace STI; (3) understanding the roles NASA/DoD technical report and aerospace librarians play in the transfer and use of knowledge derived from Federally funded aerospace R and D; (4) achieving recognition and acceptance within NASA, DoD and throughout the aerospace community that STI is a valuable strategic resource for innovation, problem solving, and productivity; and (5) providing results that can be used to optimize the effectiveness and efficiency of the Federal STI aerospace transfer system and exchange mechanism.
Purpose: With the advent of information and communication technologies (ICT), some organisations have endeavoured to develop and maintain systems commonly known as project histories. This paper aims to provide a framework to the construction organisations in order to improve the learning from projects through the development and use of project…
The project has both immediate and long term purposes. In the first instance it provides a practical and pragmatic basis for understanding how the results of NASA/DoD research diffuse into the aerospace R and D process. Over the long term it provides an empirical basis for understanding the aerospace knowledge diffusion process itself, and its implications at the individual, organizational, national, and international levels. The project is studying the major barriers to effective knowledge diffusion. This project will provide descriptive and analytical data regarding the flow of scientific and technical information (STI). It will examine both channels used to communicate information and the social system of the aerospace knowledge diffusion process.
The CommonKADS methodology is a collection of structured methods for building knowledge-based systems. A key component of CommonKADS is the library of generic inference models which can be applied to tasks of specified types. These generic models can either be used as frameworks for knowledge acquisition, or to verify the completeness of models developed by analysis
The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.
The authors propose a model for a knowledge-based assistant (expert system) to aid in allocating a blackboard-oriented system to a multiprocessor or distributed platform. It incorporates variants of several important techniques of artificial intelligence to provide recommendations for the allocation of a set of knowledge sources to a set of processors. To provide intelligent recommendations, an expert system acquires the
In this study, we used Rasch model analyses to examine (1) the unidimensionality of the alphabet knowledge construct and (2) the relative difficulty of different alphabet knowledge tasks (uppercase letter recognition, names, and sounds, and lowercase letter names) within a sample of preschoolers (n=335). Rasch analysis showed that the four…
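The dichotomous Rasch model underlying such analyses has a simple closed form: the probability of a correct response depends only on the difference between person ability and item difficulty on a shared logit scale. A brief sketch, with difficulty values invented for illustration (the study's actual estimates are not reproduced here):

```python
# Sketch of the dichotomous Rasch model: P(correct) is a logistic function
# of ability (theta) minus item difficulty (b), both in logits.
import math

def rasch_p(theta, b):
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical difficulties, ordered from easiest to hardest task.
difficulties = {
    "uppercase letter recognition": -1.5,
    "uppercase letter names": -0.5,
    "lowercase letter names": 0.5,
    "letter sounds": 1.5,
}

for task, b in difficulties.items():
    print(f"{task}: P(correct | theta=0) = {rasch_p(0.0, b):.2f}")
```

Placing all four tasks on one scale like this is what lets a Rasch analysis test unidimensionality and rank the tasks by relative difficulty within a single construct.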
Drouin, Michelle; Horner, Sherri L.; Sondergeld, Toni A.
Many of the current policy debates in Europe focus on what kind of "knowledge economy" or "knowledge society" would be best in the future if it is to combine both economic competitiveness and social cohesion. Should European economies move increasingly towards the so-called Anglo-Saxon model of flexible labour markets and high employment…
This paper describes the three-stage approach to KBS design recommended by the CommonKADS Design Model (choosing an overall approach to design, choosing ideal knowledge representation and programming techniques, and deciding how to implement the recommended techniques in the chosen software), as well as outlining possible sources of guidance for making good selections of knowledge representations and inference techniques. It then
The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab.
The CommonKADS methodology is a collection of structured methods for building knowledge-based systems. A key component of CommonKADS is the library of generic inference models which can be applied to tasks of specified types. These generic models can either be used as frameworks for knowledge acquisition, or to verify the completeness of models developed by analysis of the domain. However, the generic models for
Explores conceptual, epistemological, methodological, and practical 'gaps' that seem to reproduce themselves in successive instantiations of this quest. Aims to help build the critical perspective that comes with problematizing the very project of knowledge-behavior modeling by identifying the positivistic residues still present in the enterprise…
The purpose of the study is to investigate the effects of an applied environmental education project carried out using the green-class model on students' environmental knowledge and its retention. 101 7th grade students attending Nazim Akcan Primary School in the Altindag Province of Ankara participated in the study. The study was carried out in…
The Lunar Mapping and Modeling Project (LMMP) has been created to manage the development of a suite of lunar mapping and modeling products that support the Constellation Program (CxP) and other lunar exploration activities, including the planning, design, development, test and operations associated with lunar sortie missions, crewed and robotic operations on the surface, and the establishment of a lunar outpost. The project draws on expertise from several NASA and non-NASA organizations (MSFC, ARC, GSFC, JPL, CRREL and USGS). LMMP will utilize data predominately from the Lunar Reconnaissance Orbiter, but also historical and international lunar mission data (e.g. Apollo, Lunar Orbiter, Kaguya, Chandrayaan-1), as available and appropriate, to meet Constellation's data needs. LMMP will provide access to this data through a single, common, intuitive and easy to use NASA portal that transparently accesses appropriately sanctioned portions of the widely dispersed and distributed collections of lunar data, products and tools. LMMP will provide such products as DEMs, hazard assessment maps, lighting maps and models, gravity models, and resource maps. We are working closely with the LRO team to prevent duplication of efforts and ensure the highest quality data products. While Constellation is our primary customer, LMMP is striving to be as useful as possible to the lunar science community, the lunar education and public outreach (E/PO) community, and anyone else interested in accessing or utilizing lunar data.
Noble, Sarah K.; French, Raymond; Nall, Mark; Muery, Kimberly
Project management education programmes are often proposed in higher education to give students competences in project planning (Gantt's chart), project organizing, human and technical resource management, quality control and also social competences (collaboration, communication), emotional ones (empathy, consideration of the other, humour, ethics), and organizational ones (leadership, political vision, and so on). This training is often given
This paper presents a generic ontology-based user modeling architecture (OntobUM), applied in the context of a Knowledge Management System (KMS). Due to their powerful knowledge representation formalism and associated inference mechanisms, ontology-based systems are emerging as a natural choice for the next generation of KMSs operating in organizational, interorganizational as well as community contexts. User models, often
Liana Razmerita; Albert A. Angehrn; Alexander Maedche
Learning science requires students to make inferences and draw conclusions about concepts that are abstract, invisible or otherwise difficult to imagine. Scientific visualization is one way to make science and scientific thinking more visible to students. This dissertation investigates how visualization can be utilized for science education by studying how students integrate information from visualizations into their thinking. For this study, I developed a series of computer visualizations depicting thermodynamic phenomena. Thermodynamics is a topic that is both fundamental for several branches of science and difficult for many students to master (Linn & Songer, 1991). The design of the visualizations was learner centered. Pilot studies suggested that a dot-density representation of temperature would present a visual analogy of temperature as a measure of heat energy density. Energy density is a powerful model that can help students explain everyday heating and cooling phenomena. Dot-density computer visualizations were introduced into a public middle school science class studying thermodynamics (N = 178). Half of the students used the visualizations, while the other half served as a control. Interviews, classwork and tests were collected from the students in order to determine how the visualizations affected students' learning. Although there were not significant differences in the posttests for the groups, the classwork during the semester showed that the visualizations did affect how students envisioned heat and temperature. Students could often apply the energy density model in their reasoning during visualization activities, but when the visualizations were unavailable, many students applied less useful models. The interviews illustrated several difficulties that students had in learning from the visualizations. Some students interpreted the visualizations to support their existing conceptions of heat. 
Other students needed to have a visualization present to cue the energy-density model during problem solving. On the posttest, some students drew images with dots in them, but they lacked the model that underlies the representation. Students who avoided these problems and integrated the visualizations into their thinking were highly successful on the posttest. These results suggest that for visualizations to be effective learning tools, students need to understand the visualizations and also explore the underlying scientific model. Students in this study who connected the visualizations to other ideas about thermodynamics developed a robust understanding of the science. These findings inform our understanding of the science learning process. Students appear to draw from a repertoire of models in their reasoning. Visualizations are a powerful way to introduce models to students, but work best with opportunities for students to integrate the models into their thinking.
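The dot-density idea described above lends itself to a compact sketch. The function below is a hypothetical stand-in, not the dissertation's actual software: it draws a number of random dots per grid cell proportional to the cell's temperature, making temperature visible as heat-energy density.

```python
import random

def dot_density_cells(temps_celsius, dots_per_degree=2, seed=0):
    """Map each cell temperature to a list of random dot positions.

    The number of dots in a cell is proportional to its temperature,
    visualizing temperature as a measure of heat-energy density.
    """
    rng = random.Random(seed)
    cells = []
    for t in temps_celsius:
        n = max(0, round(t * dots_per_degree))
        # Dot positions are uniform in the unit square of the cell.
        cells.append([(rng.random(), rng.random()) for _ in range(n)])
    return cells

# A warm cell (40 C) gets four times the dots of a cool cell (10 C).
cells = dot_density_cells([10, 40])
```

Rendering the dot lists with any plotting library then yields the dot-density display; only the counts matter for the analogy.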
Based on social network and social exchange theories, this paper explored the relations among perceived team values, paternalistic leadership, team affective commitment (TAC) and knowledge integration behaviors (KIBs) in 31 medical project teams. The empirical study found that team values and paternalistic leadership did not have synergistic effects on TAC and KIBs; meanwhile, although team values and paternalistic leadership have direct effects
Background Since Swanson proposed the Undiscovered Public Knowledge (UPK) model, there have been many approaches to uncover UPK by mining the biomedical literature. These earlier works, however, required substantial manual intervention to reduce the number of possible connections and were mainly applied to disease-effect relations. With the advancement of biomedical science, it has become imperative to extract and combine information from multiple disjoint studies and articles to infer new hypotheses and expand knowledge. Methods We propose MKEM, a Multi-level Knowledge Emergence Model, to discover implicit relationships using Natural Language Processing techniques such as Link Grammar and ontologies such as the Unified Medical Language System (UMLS) MetaMap. The contribution of MKEM is as follows: First, we propose a flexible knowledge emergence model to extract implicit relationships across different levels, such as the molecular level for genes and proteins and the phenomic level for diseases and treatments. Second, we employ MetaMap for tagging biological concepts. Third, we provide an empirical and systematic approach to discover novel relationships. Results We applied our system to 5000 abstracts downloaded from the PubMed database. We performed a manual performance evaluation, as a gold standard is not yet available. Our system achieved good precision and recall, and we generated 24 hypotheses. Conclusions Our experiments show that MKEM is a powerful tool to discover hidden relationships residing in extracted entities that were represented by our Substance-Effect-Process-Disease-Body Part (SEPDB) model.
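The knowledge-emergence step can be illustrated with a minimal ABC sketch. This is not MKEM itself, which relies on Link Grammar parsing and MetaMap tagging; it only shows the Swanson-style transitive inference over already-extracted relation pairs, with hypothetical inputs.

```python
def abc_hypotheses(triples):
    """Swanson-style open discovery: given extracted (A, B) and (B, C)
    relation pairs, propose implicit A-C links that never co-occur
    directly, reported as (A, bridging term B, C) hypotheses."""
    direct = set(triples)
    by_subject = {}
    for a, b in triples:
        by_subject.setdefault(a, set()).add(b)
    hypotheses = set()
    for a, bs in by_subject.items():
        for b in bs:
            for c in by_subject.get(b, set()):
                if c != a and (a, c) not in direct:
                    hypotheses.add((a, b, c))
    return hypotheses

# Hypothetical extracted pairs, echoing Swanson's fish oil example.
triples = [("fish oil", "blood viscosity"),
           ("blood viscosity", "raynaud"),
           ("fish oil", "omega-3")]
hyps = abc_hypotheses(triples)
```

In MKEM the pairs would additionally carry level tags (molecular, phenomic) from the SEPDB model; the inference pattern is the same.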
Purpose – The paper aims to present the design rationale, the structure and the use of a web-based information systems framework for collaborative business process modelling. Design/methodology/approach – By viewing process modelling as a “problematic situation” that entails a considerable amount of social and knowledge activity in order to be resolved, a novel process modelling construct has been developed and
The authors present 2 experiments that establish the presence of knowledge partitioning in perceptual categorization. Many participants learned to rely on a context cue, which did not predict category membership but identified partial boundaries, to gate independent partial categorization strategies. When participants partitioned their knowledge, a strategy used in 1 context was unaffected by knowledge demonstrably present in other contexts. An exemplar model, attentional learning covering map, was shown to be unable to accommodate knowledge partitioning. Instead, a mixture-of-experts model, attention to rules and instances in a unified model (ATRIUM), could handle the results. The success of ATRIUM resulted from its assumption that people memorize not only exemplars but also the way in which they are to be classified. PMID:15355135
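A minimal sketch of the gating idea behind ATRIUM follows. This is not the full model, which learns rule and exemplar modules with attentional weighting; it only shows how a context cue that does not itself predict category membership can route a stimulus to one of two partial boundary strategies, so knowledge used in one context stays unused in the other. All names and boundary values are hypothetical.

```python
def gated_classify(stimulus, context, experts, gate):
    """Mixture-of-experts sketch: the context cue selects which
    partial strategy (expert) classifies the stimulus; this is the
    signature of knowledge partitioning."""
    return experts[gate(context)](stimulus)

# Toy setup: each expert implements one partial category boundary.
experts = {
    "left":  lambda x: "A" if x < 0.3 else "B",
    "right": lambda x: "A" if x > 0.7 else "B",
}
gate = lambda ctx: "left" if ctx == 0 else "right"

a = gated_classify(0.1, 0, experts, gate)  # left-boundary rule fires
b = gated_classify(0.1, 1, experts, gate)  # right rule ignores left knowledge
```

The same stimulus receives different labels in the two contexts, even though the "left" knowledge is demonstrably present in the system, which is the behavioral pattern the experiments report.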
The objectives of the research program, Space Market Model Development Project (Phase 1), were: (1) to study the need for business information in the commercial development of space; and (2) to propose a design for an information system to meet the identified needs. Three simultaneous research strategies were used in proceeding toward this goal: (1) to describe the space business information which currently exists; (2) to survey government and business representatives on the information they would like to have; and (3) to investigate the feasibility of generating new economical information about the space industry.
Federal attempts to stimulate technological innovation have been unsuccessful because of the application of an inappropriate policy framework that lacks conceptual and empirical knowledge of the process of technological innovation and fails to acknowledge the relationship between knowledge production, transfer, and use as equally important components of the process of knowledge diffusion. This article argues that the potential contributions of high-speed computing and networking systems will be diminished unless empirically derived knowledge about the information-seeking behavior of members of the social system is incorporated into a new policy framework. Findings from the NASA/DoD Aerospace Knowledge Diffusion Research Project are presented in support of this assertion.
Pinelli, Thomas E.; Barclay, Rebecca O.; Bishop, Ann P.; Kennedy, John M.
In this paper we analyse the problem of traffic sign recognition at the knowledge level. Due to the complexity of the task, our approach decomposes it into simpler subtasks until the primitive level is reached. The task has been modeled at the knowledge level as a hierarchical classification task. This has allowed us to discover a simple and robust Problem Solving
This article describes and references the relevant literature related to knowledge-based simulation. There are essentially eight areas of literature that would likely contain relevant articles: the management science/operations research literature, the simulation (and modeling) literature, the production/operations management literature, the knowledge engineering and artificial intelligence literature, the systems science literature, the industrial engineering literature, the mechanical engineering literature, and the information science literature.
Many of the current policy debates in Europe focus on what kind of ‘knowledge economy’ or ‘knowledge society’ would be best in the future if it is to combine both economic competitiveness and social cohesion. Should European economies move increasingly towards the so-called Anglo-Saxon model of flexible labour markets and high employment rates—with the increasing income inequalities that attend them—or
One of the challenges for any knowledge engineering methodology is to find appropriate ways of modeling knowledge in a schematic way. The CommonKADS methodology is a de-facto technique for knowledge modeling. It specifies the knowledge and reasoning requirements of the prospective system. CommonKADS's knowledge modeling consists of three parts, each capturing a related group of knowledge structures: Domain knowledge, Inference knowledge
The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as an internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
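The frame-generation step described above can be sketched roughly as follows. Component names, types, and the frame layout are hypothetical stand-ins for the CAD label and connectivity data that AKG actually consumes.

```python
def generate_frames(components, connections):
    """Sketch of automated knowledge generation (AKG): turn CAD
    label/connectivity data into a frame-based model description,
    one frame per component with bidirectional connection slots."""
    frames = {name: {"type": ctype, "connected_to": []}
              for name, ctype in components.items()}
    for a, b in connections:
        frames[a]["connected_to"].append(b)
        frames[b]["connected_to"].append(a)
    return frames

# Hypothetical two-component system drawn in a CAD tool.
components = {"V1": "valve", "P1": "pump"}
frames = generate_frames(components, [("V1", "P1")])
```

A real AKG pipeline would also apply constraint checks and emit the frames in the formats the target model-based reasoning tools expect; the dictionary here stands in for that frame structure.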
Describes a class project that included a literature search, observation of the Hale-Bopp comet, planning and building a model solar system, and presentation of the model in class. Finds that bilingual students in the class made significant progress in their learning of concepts and the acquisition of English as a result of completing the project.…
This report summarizes the results of thermal analysis performed to provide a technical basis in support of Project W-320 to retrieve by sluicing the sludge in Tank 241-C-106 and to transfer it into Tank 241-AY-102. Prior thermal evaluations in support of Project W-320 safety analysis assumed the availability of 2000 to 3000 cfm, as provided by Tank Farm Operations, for tank floor cooling channels from the secondary ventilation system. As this flow availability has no technical basis, a detailed Tank 241-AY-102 secondary ventilation and floor cooling channel flow model was developed and analysis was performed. The results of the analysis show that only about 150 cfm of flow reaches the floor cooling channels. A Tank 241-AY-102 thermal evaluation was performed to determine the necessary cooling flow for the floor cooling channels using the W-030 primary ventilation system for different quantities of Tank 241-C-106 sludge transferred into Tank 241-AY-102. These sludge transfers meet different options for the project along with the minimum required modification of the ventilation system. The results of the analysis for the amount of sludge transfer using the current system are also presented. The effects of sludge fluffing factor, heat generation rate and its distribution between supernatant and sludge in Tank 241-AY-102 on the amount of sludge transfer from Tank 241-C-106 were evaluated and the results are discussed. Transient thermal analysis was also performed to estimate the time to reach steady state. For a 2-foot sludge transfer, about three months will be required to reach steady state. Therefore, for the purpose of process control, a detailed transient thermal analysis using the GOTH computer code will be required to determine the transient response of the sludge in Tank 241-AY-102. Process control considerations are also discussed to eliminate the potential for a steam bump during retrieval and storage in Tanks 241-C-106 and 241-AY-102, respectively.
Purpose: This paper seeks to present a knowledge management (KM) conceptual model for competency development and a case study in a law service firm, which implemented the KM model in a competencies development program. Design/methodology/approach: The case study method was applied according to Yin (2003) concepts, focusing on a six-professional group…
Models belong to a wider family of knowledge technologies, applied in the transport area. Models sometimes share with other such technologies the fate of not being used as intended, or not at all. The result may be ill-conceived plans as well as wasted resources. Frequently, the blame for such a mismatch is put on irrational decision-making, or ‘politics’, while still
This research focused on the challenges experienced when executing risk management activities for information technology projects. The lack of adequate knowledge management support of risk management activities has caused many project failures in the past. The research objective was to propose a conceptual framework of the Knowledge-Based Risk…
Federal attempts to stimulate technological innovation have been unsuccessful because of the application of an inappropriate policy framework that lacks conceptual and empirical knowledge of the process of technological innovation and fails to acknowledge the relationship between knowledge production, transfer, and use as equally important components of the process of knowledge diffusion. It is argued that the potential contributions of high-speed computing and networking systems will be diminished unless empirically derived knowledge about the information-seeking behavior of the members of the social system is incorporated into a new policy framework. Findings from the NASA/DoD Aerospace Knowledge Diffusion Research Project are presented in support of this assertion.
Pinelli, Thomas E.; Barclay, Rebecca O.; Bishop, Ann P.; Kennedy, John M.
Descriptive and analytical data regarding the flow of aerospace-based scientific and technical information (STI) in the academic community are presented. An overview is provided of the Federal Aerospace Knowledge Diffusion Research Project, illustrating a five-year program on aerospace knowledge diffusion. Preliminary results are presented of the project's research concerning the information-seeking habits, practices, and attitudes of U.S. aerospace engineering and science students and faculty. The type and amount of education and training in the use of information sources are examined. The use and importance ascribed to various information products by U.S. aerospace faculty and students including computer and other information technology is assessed. An evaluation of NASA technical reports is presented and it is concluded that NASA technical reports are rated high in terms of quality and comprehensiveness, citing Engineering Index and IAA as the most frequently used materials by faculty and students.
The Triple Helix model has become the new paradigm of research on technological innovation. Previous research has shown that, from a macro perspective, the knowledge transfer among University, Industry and Government is quite clear. But the social properties of micro-actors have great effects on the knowledge transfer. Based on this, this article proposes three theoretical models of knowledge spillover: the
The Lunar Mapping and Modeling Project (LMMP) has been created to manage the development of a suite of lunar mapping and modeling products that support the Constellation Program (CxP) and other lunar exploration activities, including the planning, design, development, test and operations associated with lunar sortie missions, crewed and robotic operations on the surface, and the establishment of a lunar outpost. The information provided through LMMP will assist CxP in: planning tasks in the areas of landing site evaluation and selection, design and placement of landers and other stationary assets, design of rovers and other mobile assets, developing terrain-relative navigation (TRN) capabilities, and assessment and planning of science traverses. The project draws on expertise from several NASA and non-NASA organizations (MSFC, ARC, GSFC, JPL, CRREL (US Army Cold Regions Research and Engineering Laboratory), and the USGS). LMMP will utilize data predominantly from the Lunar Reconnaissance Orbiter, but also historical and international lunar mission data (e.g. Apollo, Lunar Orbiter, Kaguya, Chandrayaan-1), as available and appropriate, to meet Constellation's data needs. LMMP will provide access to these data through a single, intuitive and easy-to-use NASA portal that transparently accesses appropriately sanctioned portions of the widely dispersed and distributed collections of lunar data, products and tools. Two visualization systems are being developed: a web-based system called Lunar Mapper, and a desktop client, ILIADS, which will be downloadable from the LMMP portal. LMMP will provide such products as local and regional imagery and DEMs, hazard assessment maps, lighting and gravity models, and resource maps. We are working closely with the LRO team to prevent duplication of efforts and to ensure the highest quality data products.
While Constellation is our primary customer, LMMP is striving to be as useful as possible to the lunar science community, the lunar commercial community, the lunar education and public outreach (E/PO) community, and anyone else interested in accessing or utilizing lunar data. A beta version of the portal and visualization systems is expected to be released in late 2009, with a version 1 release planned for early 2011.
Noble, Sarah K.; French, R. A.; Nall, M. E.; Muery, K. G.
Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from, e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008.
In this period, the influence on the location of sugar cane expansion of the driver sugar cane in the neighbourhood doubled, while the influence of slope and potential yield decreased by 75% and 25% respectively. Allowing these systemic changes to occur in our CA in the future (up to 2022) resulted in an increase in model projection uncertainty by a factor of two compared to the assumption of a stationary system. This means that the assumption of a constant model structure is not adequate and largely underestimates uncertainty in the projection. References Verstegen, J.A., Karssenberg, D., van der Hilst, F., Faaij, A.P.C., 2014. Identifying a land use change cellular automaton by Bayesian data assimilation. Environmental Modelling & Software 53, 121-136. Verstegen, J.A., Karssenberg, D., van der Hilst, F., Faaij, A.P.C., 2012. Spatio-Temporal Uncertainty in Spatial Decision Support Systems: a Case Study of Changing Land Availability for Bioenergy Crops in Mozambique. Computers, Environment and Urban Systems 36, 30-42. Wald, A., Wolfowitz, J., 1940. On a test whether two samples are from the same population. The Annals of Mathematical Statistics 11, 147-162.
Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André
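The Runs test used above to check the model-structure time series for stationarity can be sketched directly from Wald and Wolfowitz (1940). The normal approximation below assumes a reasonably long binary sequence, e.g., signs of deviations of the series from its median.

```python
import math

def runs_test(sequence):
    """Wald-Wolfowitz runs test on a binary sequence.

    Returns the z statistic of the observed number of runs against
    its expectation under randomness; |z| > 1.96 suggests
    non-randomness at the 5% level, which in this application would
    indicate systemic change."""
    n1 = sum(1 for x in sequence if x)
    n2 = len(sequence) - n1
    runs = 1 + sum(1 for a, b in zip(sequence, sequence[1:]) if a != b)
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / \
          ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / math.sqrt(var)

# A strongly clustered sequence (only two runs) gives a large negative z,
# i.e., far fewer runs than randomness would produce.
z = runs_test([True] * 10 + [False] * 10)
```

The paper applies this to the evolution of the filtered model structure; any binary recoding of that time series can be fed to the same function.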
Research on cognitive modeling of information search and Web navigation emphasizes the importance of "information scent" (the relevance of semantic cues such as link labels and headings to a reader's goal; Pirolli & Card, 1999). This article shows that not only semantic but also structural knowledge is involved in navigating the Web (Juvina,…
The engineering approach to transforming business process and workforce management is an emerging area that is gaining increasing interest for its many promising applications in service science. Yet three major challenges stand out in engineering any business transformation process. The first challenge is accurate modeling of the transformation knowledge that may be scattered across multiple application domains. The second challenge is
This paper describes a general approach for real time traffic management support using knowledge based models. Recognizing that human intervention is usually required to apply the current automatic traffic control systems, it is argued that there is a need for an additional intelligent layer to help operators to understand traffic problems and to make the best choice of strategic control
Purpose: The purpose of this paper is to present the development of the relationship management maturity model (RMMM), the output of an initiative aimed at bridging the gap between business units and the IT organisation. It does this through improving and assessing knowledge sharing between business and IT staff in Finco, a large financial…
Martin, Valerie A.; Hatzakis, Tally; Lycett, Mark; Macredie, Robert
The aim of this paper is to contribute to a central issue in neural networks, that of combining expert knowledge and observations (data) for learning. It is generally known that neural networks, as other adaptive models, have good learning and generalization capabilities because of their statistical consistency. However, such consistency is theoretically valid only for large-size training sets. To enhance learning with small
Despite a recent focus on engaging students in epistemic practices, there is relatively little research on how learning environments can support the simultaneous, coordinated development of both practice and the knowledge that emerges from and supports scientific activity. This study reports on the co-construction of modeling practice and…
Five optimization models are constructed for selecting an optimal subset of projects submitted for a statewide programming process. Our approach develops models that are consistent with user needs and appropriate for the assumptions used in the project prioritization process. Each of the models builds on a basic linear-programming formulation in which a maximization of benefits and minimization of costs is
Debbie A. Niemeier; Zelda B. Zabinsky; Zhaohui Zeng; G. Scott Rutherford
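The basic selection formulation can be illustrated with a small brute-force sketch. The paper's models use linear programming; on the toy instance below, with hypothetical cost/benefit data, the objective is the same: choose the subset of submitted projects that maximizes benefit within the budget.

```python
from itertools import combinations

def select_projects(projects, budget):
    """Enumerate all subsets of candidate projects and return the one
    with maximum total benefit whose total cost fits the budget.

    projects: {name: (cost, benefit)}. A 0/1 integer program would
    solve the same model at scale; enumeration suffices for a sketch."""
    best, best_benefit = (), 0
    names = list(projects)
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(projects[p][0] for p in subset)
            benefit = sum(projects[p][1] for p in subset)
            if cost <= budget and benefit > best_benefit:
                best, best_benefit = subset, benefit
    return set(best), best_benefit

# Hypothetical (cost, benefit) data for four submitted projects.
projects = {"A": (4, 10), "B": (3, 7), "C": (5, 12), "D": (2, 3)}
chosen, benefit = select_projects(projects, budget=9)
```

The statewide models add further constraints (e.g., program balance), but each builds on this same maximize-benefit, bound-cost core.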
This report describes the context and goals of the Higher Education Funding Council for England's e-University project to develop Internet-based higher education. It summarizes the proposed business model and outlines next steps in implementing the project. A February 2000 letter announced the project and invited higher education institutions…
Higher Education Funding Council for England, Bristol.
The proliferation of aid projects may overburden recipient governments with reporting requirements, donor visits, and other administrative overhead, siphoning off scarce domestic recipient resources, such as tax revenue or the time of skilled government officials, from directly productive use. But greater oversight may also improve the administration of projects, increasing development. I present a model of aid projects that reflects
The citizen science projects developed by Zooniverse afford volunteers the opportunity to contribute to scientific research in a meaningful way by interacting with actual scientific data. We created two surveys to measure the impact that participation in the Galaxy Zoo and Moon Zoo citizen science projects has on user conceptual knowledge. The Zooniverse Astronomy Concept Survey (ZACS) was designed to assess Galaxy Zoo user understanding of concepts related to galaxies and how their understanding changed through participation in classifying galaxies. The Lunar Cratering Concept Inventory (LCCI) was designed to measure the impact of the Moon Zoo activities on user knowledge about lunar craters and cratering history. We describe how the surveys were developed and validated in collaboration with education researchers and astronomers. Both instruments are administered over time to measure changes to user conceptual knowledge as they gain experience with either Galaxy Zoo or Moon Zoo. Data collection has already begun and in the future we will be able to compare survey answers from users who have classified, for example, a thousand galaxies with users who have only classified ten galaxies. This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation and the Sloan Digital Sky Survey III Education and Public Outreach Program.
Cormier, Sebastien; Prather, E. E.; Brissenden, G.; Lintott, C.; Gay, P. L.; Raddick, J.; Collaboration of Astronomy Teaching Scholars CATS
We propose a novel framework for performing quantitative Bayesian inference based on qualitative knowledge. Here, we focus on the treatment in the case of inconsistent qualitative knowledge. A hierarchical Bayesian model is proposed for integrating inconsistent qualitative knowledge by calculating a prior belief distribution based on a vector of knowledge features. Each inconsistent knowledge component uniquely defines a model class
The paper elaborates on the role of user models and user modeling for enhanced support in Knowledge Management Systems (KMSs). User models in KMSs, often addressed as user profiles, include user's preferences and are often similar to competency definitions. We extend this view with other characteristics of the users (e.g. level of activity, level of knowledge sharing, type of activity
This article reports on a performance evaluation, conducted by the authors, of the knowledge management system (KMS) developed for a short-term food assistance programme in Hong Kong. DeLone and McLean's (1992) widely accepted model of information system success is adopted as the evaluative framework. Instead of just another revalidation of the model on KMS evaluation in social services, this study
Zeno C. S. Leung; C. F. Cheung; K. T. Chan; Kenneth H. K. Lo
This article reports on a performance evaluation, conducted by the authors, of the knowledge management system (KMS) developed for a short-term food assistance program in Hong Kong. DeLone and McLean's (1992) widely accepted model of information system success is adopted as the evaluative framework. Instead of just another revalidation of the model on KMS evaluation in social services, this study
Zeno C. S. Leung; C. F. Cheung; K. T. Chan; Kenneth H. K. Lo
Arsenic in drinking water is an important public health issue in Bangladesh, which is affected by households' knowledge about arsenic threats from their drinking water. In this study, spatial statistical models were used to investigate the determinants and spatial dependence of households' knowledge about arsenic risk. The binary join matrix/binary contiguity matrix and inverse distance spatial weight matrix techniques are used to capture spatial dependence in the data. This analysis extends the spatial model by allowing spatial dependence to vary across divisions and regions. A positive spatial correlation was found in households' knowledge across neighboring districts at district, divisional and regional levels, but the strength of this spatial correlation varies considerably by spatial weight. Literacy rate, daily wage rate of agricultural labor, arsenic status, and percentage of red mark tube well usage in districts were found to contribute positively and significantly to households' knowledge. These findings have policy implications both at regional and national levels in mitigating the present arsenic crisis and to ensure arsenic-free water in Bangladesh. PMID:22385815
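The inverse-distance spatial weight matrix mentioned above can be sketched as follows. The coordinates are hypothetical district centroids, and the rows are standardized to sum to one, a common convention in spatial econometrics; the study's other scheme, binary contiguity, would simply put 1 for neighbouring districts instead.

```python
def inverse_distance_weights(coords, power=1.0):
    """Row-standardized inverse-distance spatial weight matrix.

    W[i][j] is proportional to 1 / d(i, j)**power for i != j, with
    each row rescaled to sum to 1, so nearer units get more weight
    in the spatial lag."""
    n = len(coords)
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                d = ((coords[i][0] - coords[j][0]) ** 2 +
                     (coords[i][1] - coords[j][1]) ** 2) ** 0.5
                W[i][j] = 1.0 / d ** power
        row_sum = sum(W[i])
        if row_sum:
            W[i] = [w / row_sum for w in W[i]]
    return W

# Three hypothetical district centroids on a line.
W = inverse_distance_weights([(0, 0), (1, 0), (3, 0)])
```

Multiplying W by the vector of district knowledge scores gives the spatially lagged variable used to estimate the spatial correlation.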
The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
Vision is a part of a larger informational system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. The solution to image understanding problems is suggested in the form of active multilevel hierarchical networks represented dually as discrete and continuous structures. Computational intelligence methods transform images into model-based knowledge representations. The Certainty Dimension converts attractors in neural networks into fuzzy sets, preserving input-output relationships. Symbols naturally emerge in such networks. Symbolic Space is a dual structure that combines a closed distributed space split by the set of fuzzy regions, and a discrete set of symbols equivalent to the cores of regions represented as points in the Certainty Dimension. Model Space carries knowledge in the form of links and relations between the symbols, and supports graph, diagrammatic and topological operations. Composition of spaces works similarly to M. Minsky's frames and agents, Gerald Edelman's maps of maps, etc., combining machine learning, classification and analogy together with induction, deduction and other methods of higher-level model-based reasoning. Based on such principles, an Image Understanding system can convert images into knowledge models, effectively resolving uncertainty and ambiguity via feedback projections, and does not require supercomputers.
This document provides a common set of astrodynamic constants and planetary models for use by the Mars Pathfinder Project. It attempts to collect in a single reference all the quantities and models in use across the project during development and for mission operations.
The Southern Maryland Educational Consortium's Tech Prep Model Demonstration project is described in this final report. The consortium members are Calvert, Charles, and St. Mary's county school districts and Charles County Community College in southern Maryland. The project is based on a 4 + 2 model in which ninth-grade students develop career…
Southern Maryland Educational Consortium, La Plata.
In this study, four road delivery project models are analyzed by data envelopment analysis. The four models are design-bid-build (DBB), design-build (DB), construction management (CM) and design-build-maintenance (DBM). According to the analysis, DBM is the most efficient model and CM has the best time-schedule control. DB performs with the best efficiency in construction. The results may help public sectors employ an adequate model when proceeding with road construction projects.
Tasks remaining to be completed are summarized for the following major project elements: (1) evaluation of crop yield models; (2) crop yield model research and development; (3) data acquisition, processing, and storage; (4) related yield research (defining spectral and/or remote sensing data requirements; developing input for driving and testing crop growth/yield models; real-time testing of wheat plant process models); and (5) project management and support.
Major road maintenance projects need an accurate cost estimate at the early stage of design. At present, simple and quick models are lacking. Two easy-to-use cost models for major asphalt road maintenance projects are developed. The double mean model and the trend line model are based on the cost-significance method. The historical data used are bills of quantities (BoQs) of
Following the WHO initiative World Alliance for Patient Safety (PS), launched in 2004, a conceptual framework developed by PS national reporting experts has summarized the available knowledge. As a second step, the Department of Public Health team of the University of Saint-Etienne elaborated a Categorial Structure (a semi-formal structure not related to an upper-level ontology) identifying the elements of the semantic structure underpinning the broad concepts contained in the framework for patient safety. This knowledge engineering method was developed to enable modeling patient safety information as a prerequisite for subsequent full ontology development. The present article describes the semantic dissection of the concepts, the elicitation of the ontology requirements, and the domain constraints of the conceptual framework. This ontology includes 134 concepts and 25 distinct relations and will serve as the basis for an Information Model for Patient Safety.
Souvignet, Julien; Bousquet, Cedric; Lewalle, Pierre; Trombert-Paviot, Beatrice; Rodrigues, Jean Marie
This article examines two aspects of preservice English-as-a-Second-Language (ESL) teacher education programs: (1) the knowledge base or information that students must learn; and (2) the way or ways in which the knowledge is delivered to students. The knowledge base includes content knowledge, pedagogic knowledge, pedagogic content knowledge, and…
This paper introduces a declarative model of semantic memory, called PSN, written in Prolog. It is shown to be a descendant of Quillian's (1969) Teachable Language Comprehender (TLC) in its structuring of knowledge as a conceptual reticulum and in its use of spreading activation as a retrieval mechanism. PSN goes beyond TLC, however, in its ability to instantiate the essential
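The spreading-activation retrieval mechanism mentioned above can be sketched in a few lines. The following is an illustrative toy, not PSN's actual Prolog implementation: activation flows from seed concepts to their neighbours in a small semantic network, decaying at each hop, and the network and decay rate are invented for the example.

```python
from collections import defaultdict

def spread_activation(edges, seeds, decay=0.5, steps=2):
    """Propagate activation from seed concepts across a semantic network.
    Each step, a fraction `decay` of every node's current activation
    flows to each of its neighbours; activation accumulates over steps."""
    activation = defaultdict(float, seeds)
    for _ in range(steps):
        nxt = defaultdict(float, activation)
        for node, level in list(activation.items()):
            for neighbour in edges.get(node, []):
                nxt[neighbour] += decay * level
        activation = nxt
    return dict(activation)

# Hypothetical conceptual reticulum: canary -> bird -> animal
net = {"canary": ["bird"], "bird": ["animal", "canary"], "animal": ["bird"]}
result = spread_activation(net, {"canary": 1.0})
```

After two steps, activation has reached "animal" through "bird", which is the behaviour that makes spreading activation useful as a retrieval mechanism: concepts related to the seed become retrievable in proportion to their network distance.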
The purposes of this study were to develop a knowledge management (KM) model for self-reliant communities, to examine satisfaction with KM operation of self-reliant communities, and to determine factors of success in KM for self-reliant communities. The focus areas and groups were Ban Nam Kliang and Ban Lao Rat Phatthana, Amphoe Wapi Pathum, Changwat Maha Sarakham comprising 8 focus organizations
The main goals of this study were to examine the technological knowledge construction process of high-school high-achievers, and their ability to design and implement solutions to technological problems. More specifically, we examine the contribution of project-based learning (PBL), as a pedagogical means for supporting the students' knowledge…
The objective of the NASA/DOD Aerospace Knowledge Diffusion Research Project is to provide descriptive and analytical data regarding the flow of scientific and technical information (STI) at the individual, organizational, national, and international levels, placing emphasis on the systems used to diffuse the results of federally funded aerospace STI. An overview of project assumptions, objectives, and design is presented and preliminary results of the phase 2 aerospace library survey are summarized. Phase 2 addressed aerospace knowledge transfer and use within the larger social system and focused on the flow of aerospace STI in government and industry and the role of the information intermediary in knowledge transfer.
Policy-related research in general, and impact assessments in particular, are too loosely connected to decision-making processes. The result is often sub-optimal or even undesirable, as one of two situations arises: (1) much research is done; however, those with the real power to make decisions do not make use of all of the resulting information, or (2) advocates of contrary opinions struggle with each other, using policy-related research as ammunition. To avoid these unwanted situations, the connection between the world of knowledge and the world of decision-making should be carefully constructed, by connecting the process of decision-making to the academic research and carefully developing research goals in response to the demands of decision-makers. By making these connections in a stepwise manner, knowledge may generate new insights and views for involved decision-makers and stakeholders, thus changing perceptions and problem definitions. In this way, these actors learn about the possibilities of several alternatives as well as each other's perceptions, and thus can make educated decisions leading to the most desirable and socially acceptable solution. The way this proposed method works is illustrated using two cases in The Netherlands: the project 'Mainport Rotterdam' (the enlargement of the port of Rotterdam) and the project 'A fifth runway for Amsterdam Airport (Schiphol)'.
Deelstra, Y.; Nooteboom, S.G.; Kohlmann, H.R.; Berg, J. van den; Innanen, S
This paper summarizes results from the Distributed Model Intercomparison Project (DMIP) study. DMIP simulations from twelve different models are compared with both observed streamflow and lumped model simulations. The lumped model simulations were produced using the same techniques used at National Weather Service River Forecast Centers (NWS-RFCs) for historical calibrations and serve as a useful benchmark for comparison. The differences
Seann Reed; Victor Koren; Michael Smith; Ziya Zhang; Fekadu Moreda; Dong-Jun Seo
This paper proposes a cooperative knowledge production model, CSDM (Cooperative Spatial Data Mining), which is suitable for a distributed GIS environment. It addresses two shortcomings of existing systems: the large computational workload, and the dispersion of resources in a distributed system. It is based on data sharing, process synchronization, and parallel data-mining methods. Although there are many parallel algorithms for data mining, we choose the GA (Genetic Algorithm) for illustration. The prototype system shows the model can work effectively on a path selection problem.
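To illustrate how a genetic algorithm can drive a path-selection problem of the kind mentioned, here is a minimal, self-contained sketch. It is not the CSDM prototype itself: the population size, truncation selection, swap mutation, and the toy distance matrix are all invented for the example.

```python
import random

def ga_shortest_path(dist, pop_size=30, generations=60, seed=1):
    """Toy genetic algorithm for a path-selection problem: find a
    low-cost visiting order over points with pairwise distances `dist`."""
    rng = random.Random(seed)
    n = len(dist)

    def cost(tour):
        return sum(dist[tour[i]][tour[i + 1]] for i in range(n - 1))

    def mutate(tour):
        # swap mutation: exchange two positions in the tour
        a, b = rng.sample(range(n), 2)
        child = tour[:]
        child[a], child[b] = child[b], child[a]
        return child

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]  # truncation selection
        pop = survivors + [mutate(rng.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=cost)

# Four points on a line; the optimal visiting order is monotone (cost 3).
d = [[abs(i - j) for j in range(4)] for i in range(4)]
best = ga_shortest_path(d)
```

In a real distributed setting the fitness evaluations would be the part farmed out across nodes, which is what makes the GA a natural fit for the parallel data-mining framing above.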
Economists, management theorists, business strategists, and governments alike recognize knowledge as the single most important resource in today's global economy. Because of its relationship to technological progress and economic growth, many governments have taken a keen interest in knowledge; specifically its production, transfer, and use. This paper focuses on the technical report as a product for disseminating the results of aerospace research and development (R&D) and its use and importance to aerospace engineers and scientists. The emergence of knowledge as an intellectual asset, its relationship to innovation, and its importance in a global economy provides the context for the paper. The relationships between government and knowledge and government and innovation are used to place knowledge within the context of publicly-funded R&D. Data, including the reader preferences of NASA technical reports, are derived from the NASA/DoD Aerospace Knowledge Diffusion Research Project, a ten-year study of knowledge diffusion in the U.S. aerospace industry.
Current research on distributed knowledge processes suggests a critical conflict between knowledge processes in groups and the technologies built to support them. The conflict centers on observations that authentic and efficient knowledge creation and sharing is deeply embedded in an interpersonal face to face context, but that technologies to support distributed knowledge processes rely on the assumption that knowledge can
Alaina G. Kanfer; C. Haythornthwaite; G. C. Bowker; B. C. Bruce; N. Burbules; J. Porac; J. Wade
Using an econometric model estimated with state data for the period 1961-1982, long-term projections of the growth rate in electricity consumption are presented. A projection of 3.25% per year is produced by an aggregate model. Using a similar model estimated on a regional basis, an aggregate projection of 4.19% is obtained. The main variable contributing to these projections is economic expansion, broadly defined to include income and the number of customers. Although the price of electricity is a statistically significant variable, this price is projected to be approximately stable in real terms, and hence not to contribute to electricity demand. 10 references, 1 figure, 5 tables.
To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.
Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.
The third and final report is made on the Model Technical Library Project conducted in the Savannah District, Corps of Engineers, U. S. Army, 1968-1972, in accordance with Task 05 of the Army Technical Information Support Activities Project (TISA). It is ...
Presented is a report of a 4-month project designed to review literature on projects pertaining to deaf-blind prevocational training, to implement a model prevocational program for six severely handicapped deaf-blind students (10 years old), and to conduct two workshops in the area of prevocational training for deaf-blind students. Provided in…
In France, since 2011, it has been mandatory for local communities to conduct cost-benefit analysis (CBA) of their flood management projects to make them eligible for financial support from the State. As a support, the French Ministry in charge of Environment proposed a methodology for conducting CBA. As in many other countries, this methodology is based on the estimation of flood damage. However, existing models for estimating flood damage were judged unsuitable for nationwide use. As a consequence, the French Ministry in charge of Environment launched studies to develop damage models for different sectors, such as the residential sector, public infrastructure, the agricultural sector, and the commercial and industrial sector. In this presentation, we present and discuss the methodological choices behind those damage models. They all share the same principle: no sufficient data from past events were available to build damage models on a statistical analysis, so modeling was based on expert knowledge. We focus on the model built for agricultural activities, and more precisely for agricultural lands. This model was based on feedback from 30 agricultural experts who experienced floods in their geographical areas. They were selected to provide a representative experience of crops and flood conditions in France. The model is composed of: (i) damaging functions, which reveal the physiological vulnerability of crops, (ii) action functions, which correspond to farmers' decision rules for carrying on crops after a flood, and (iii) economic agricultural data, which correspond to the characteristics of crops in the geographical area where the studied flood management project takes place. The first two components are generic and the third is specific to the area studied. It is thus possible to produce flood damage functions adapted to different agronomic and geographical contexts.
Finally, the model was applied to obtain a pool of damage functions giving damage in euros per hectare for 14 categories of agricultural land. As a conclusion, we discuss the validation step of the model. Although the model was validated by experts, we analyse how it could gain insight from comparison with past events.
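The three-component structure described above (damaging function, action function, economic data) can be sketched roughly as follows. All numbers, including the replanting-cost constant, are invented for illustration and are not taken from the French model.

```python
def crop_flood_damage(area_ha, value_per_ha, loss_rate, replant):
    """Damage = physiological loss (damaging function) x economic value,
    plus a replanting cost if the farmer's decision rule (action
    function) says the crop is replanted. All constants are invented."""
    REPLANT_COST_PER_HA = 300.0  # euros/ha, assumed
    damage = area_ha * value_per_ha * loss_rate
    if replant:
        damage += area_ha * REPLANT_COST_PER_HA
    return damage

# 10 ha of a crop worth 2000 euros/ha, 40% yield loss, farmer replants
loss = crop_flood_damage(10, 2000.0, 0.4, replant=True)
```

Because the first two components are generic and only the economic data vary by region, a function like this can be re-parameterised per study area, which mirrors the adaptability the abstract claims.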
The Ocean Circulation and Climate Advanced Modeling Project (OCCAM) uses a primitive equation numerical model to develop several high resolution visualizations of the world's oceans, including the Arctic Ocean and marginal seas such as the Mediterranean. This site features a selection of animation sequences and QuickTime movies from the model at 1/4, 1/8, and 1/12 degree resolutions. Data from the model may also be requested, and after being prepared off-line, users can collect the data via ftp. The site also includes model details and parameters, an explanation of data assimilation methods, and links to publications and related projects.
A three-dimensional model that can be used to demonstrate the symmetry of crystal systems to physics students is described. The symmetry of the cube is taken as an example. A spherical model is constructed in which a disc of cardboard represents each of the planes of symmetry.
This site provides an overview of the Climate, Ocean and Sea Ice Modeling Project (COSIM) and its mission to develop sea ice and ocean models which can be applied to coupled climate models. Research areas include polar processes, thermohaline circulation, ocean biogeochemistry, and eddy resolving ocean simulations. Available models include the Parallel Ocean Program (POP), the Los Alamos Sea Ice Model, and eventually the hybrid vertical coordinate version of POP. In addition, COSIM researchers have provided substantial input and development to the Miami Isopycnal Coordinate Ocean Model and its hybrid vertical coordinate equivalent, the Hybrid Coordinate Ocean Model. Links to these model pages contain model downloads, documentation and data.
This paper presents a skill learning model, CLARION. Different from existing models of mostly high-level skill learning that use a top-down approach (that is, turning declarative knowledge into procedural knowledge through practice), we adopt a bottom-up approach toward low-level skill learning, where procedural knowledge develops first and declarative knowledge develops later
To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents as uniquely identified and semantically annotated composite resources, that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and distant semantic document repositories for document content units (CUs) over Semantic Web protocols.
We propose a new model of knowledge creation in purposeful, loosely-coordinated, distributed systems, as an alternative to a firm-based one. Specifically, using the case of the Linux kernel development project, we build a model of community-based, evolutionary knowledge creation to study how thousands of talented volunteers, dispersed across organizational and geographical boundaries, collaborate via the Internet to produce a knowledge-intensive,
Abstract This paper contributes to the discussion on how to manage knowledge in organizations. Taking a perspective, which acknowledges the importance, but does not privilege IT as the decisive element, it reports the results of a study investigating the process of establishing as opposed to conducting knowledge management. Based on a grounded theory approach to the analysis of the empirical
This study explores how individuals make use of scientific content knowledge for socioscientific argumentation. More specifically, this mixed-methods study investigates how learners apply genetics content knowledge as they justify claims relative to genetic engineering. Interviews are conducted with 45 participants, representing three distinct groups: high school students with variable genetics knowledge, college nonscience majors with little genetics knowledge, and college
An important difference between projection images such as x-rays and natural images is that the intensity at a single pixel in a projection image comprises information from all objects between the source and detector. In order to exploit this information, a Dirichlet mixture of Gaussian distributions is used to model the intensity function forming the projection image. The model requires initial seeding of Gaussians and uses the EM (expectation-maximisation) algorithm to arrive at a final model. The resulting models are shown to be robust with respect to the number and positions of the Gaussians used to seed the algorithm. As an example, a screening mammogram is modelled as the Dirichlet sum of Gaussians, suggesting possible application to early detection of breast cancer.
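A minimal EM fit of a one-dimensional Gaussian mixture illustrates the E-step/M-step iteration the abstract relies on. This sketch omits the Dirichlet prior and the image-specific seeding described in the paper, seeding the means at the data extremes instead, and the synthetic data are invented for the example.

```python
import numpy as np

def em_gmm_1d(x, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.
    Means are seeded at the data extremes (a simplification of the
    Dirichlet-based seeding described above)."""
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.full(2, x.var())
    w = np.full(2, 0.5)
    for _ in range(iters):
        # E-step: responsibility of each component for each value
        dens = (w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2 * np.pi * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture weights, means, and variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Two well-separated intensity clusters, standing in for pixel values
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(10.0, 1.0, 500)])
w, mu, var = em_gmm_1d(x)
```

On a real projection image the robustness claim in the abstract corresponds to this iteration converging to similar components from many different seedings.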
Many knowledge management initiatives and systems have failed or remained ineffective due to the lack of knowledge creation activities in organizations. It is hypothesized that emphasizing knowledge creation activities and embedding such activities in the organization's objectives, mission, and processes will directly enhance the effectiveness of the organization's Knowledge Management System. In designing the Holistic Knowledge Management System (HiKMaS™), a
Juhana Salim; Yazrina Yahya; Nurul Rafidza Muhamad Rashid; Abdul Razak Hamdan; Aziz Deraman; Hazilah Mohd. Amin; Akmal Aris; Mohd Shahizan Othman
The EUME project is intended to develop an Intelligent Learning Management System (ILMS) with the aim of improving the quality of traditional teaching strategies as well as facilitating the implementation of new learning methodologies. At this project stage, we have developed a knowledge model of the educational domain, designed a component-based software architecture, and implemented the second cycle of
E. Sánchez; M. Lama; R. Amorim; A. Riera; J. A. Vila; S. Barro
Providing knowledge at the point of care offers the possibility of reducing error and improving patient outcomes. However, the vast majority of physicians' information needs are not met in a timely fashion. The research presented in this paper models an expert librarian's search strategies as they pertain to the selection and use of various electronic information resources. The 10 searches conducted by the librarian to address physicians' information needs varied in terms of complexity and question type. The librarian employed a total of 10 resources and used as many as 7 in a single search. The longer-term objective is to model the sequential process in sufficient detail as to be able to contribute to the development of intelligent automated search agents.
KAUFMAN, David R.; MEHRYAR, Maryam; CHASE, Herbert; HUNG, Peter; CHILOV, Marina; JOHNSON, Stephen B.; MENDONCA, Eneida
A typical approach to projecting climate change impacts on water resources systems is to downscale general circulation model (GCM) or regional climate model (RCM) outputs as forcing data for a watershed model. With downscaled climate model outputs becoming readily available, multi-model ensemble approaches incorporating multiple GCMs, multiple emissions scenarios, and multiple initializations are increasingly being used. While these multi-model climate ensembles represent a range of plausible futures, different hydrologic models and methods may complicate impact assessment. In particular, associated loss, flow routing, snowmelt and evapotranspiration computation methods can markedly increase hydrological modeling uncertainty. Other challenges include properly calibrating and verifying the watershed model and maintaining a consistent energy budget between climate and hydrologic models. An alternative approach, particularly appealing for ungauged basins or locations where record lengths are short, is to directly predict selected streamflow quantiles from regional regression equations that include physical basin characteristics as well as meteorological variables output by climate models (Fennessey 2011). Two sets of regional regression models are developed for the Great Lakes states using ordinary least squares and weighted least squares regression. The regional regression modeling approach is compared with physically based hydrologic modeling approaches for selected Great Lakes watersheds using downscaled outputs from the Coupled Model Intercomparison Project (CMIP3) as inputs to the Large Basin Runoff Model (LBRM) and the U.S. Army Corps Hydrologic Modeling System (HEC-HMS).
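The weighted-least-squares step of the regional regression approach can be sketched as follows. The drainage-area predictor, the power-law flow relation, and the record-length weights are invented for illustration; they are not the Great Lakes equations from the study.

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: solve (X' W X) beta = X' W y."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Hypothetical regional regression: a flow quantile follows a power law
# in drainage area (log-log linear); gauges are weighted by record length.
area = np.array([10.0, 50.0, 200.0, 800.0])   # km^2, invented
q = 3.0 * area ** 0.8                         # exact power law for the demo
X = np.column_stack([np.ones(area.size), np.log(area)])
beta = wls(X, np.log(q), np.array([5.0, 20.0, 35.0, 12.0]))
```

Because the synthetic data follow the power law exactly, the fit recovers the intercept ln(3) and slope 0.8 regardless of the weights; with real gauge data, the weights let long-record stations dominate the fit, which is the usual motivation for WLS over OLS in regional regression.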
Knowledge is the fundamental resource that allows us to function intelligently. Similarly, organizations typically use different types of knowledge to enhance their performance. Commonsense knowledge is not well formalized; modeling it is key to disaster management, in the process of gathering information into a formalized form. Modeling commonsense knowledge is crucial for classifying and presenting unstructured knowledge. This
D. S. Kalana Mendis; Asoka S. Karunananda; Udaya Samaratunga; Uditha Ratnayake
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Commercial Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
Climate and weather prediction hinge on numerical models. Most of the climate models included in the Coupled Model Intercomparison Project Phase 5 (CMIP5), which will underpin the Intergovernmental Panel on Climate Change Fifth Assessment Report (IPCC AR5), include a dust module because dust is known to play an important role in the Earth system. However, dust emission schemes in climate models are relatively simple and are tuned to represent observed background aerosol concentrations, most of which are many thousands of kilometres from source regions. The physics of dust emission in the models was developed from idealised experiments such as those conducted in wind tunnels decades ago. Improvement of current model dust emission schemes has been difficult to achieve because of the paucity of observations from key dust sources. Dust Observations for Models (DO4Models) is a project designed to gather data from source regions at a scale appropriate to climate model grid box resolution. The UK NERC funded project, led by the University of Oxford, aims to: 1) generate a data set at an appropriate scale for climate models which characterises surface erodibility and erosivity in dust source areas from remote sensing and fieldwork; 2) quantify how observed erodibility and erosivity influence observed emissions at the climate model scale; 3) test, develop and optimise the dust emission scheme for the Met Office regional model (HadGEM3-RA) using this unique dust source area data set; 4) quantify which component(s) of observed erodibility and erosivity, and at what spatial scale, make the largest improvement to physically-based, observationally optimised dust emission simulations in climate models. This paper provides a project overview and some early observational and modelling results from the 2011 field season.
A significant challenge in software engineering is accurately modeling projects in order to correctly forecast success or failure. The primary difficulty is that software development efforts are complex in terms of both the technical and social aspects of the engineering environment. This is compounded by the lack of real data that captures both the measures of success in performing a process, and the measures that reflect a group's social dynamics. This research focuses on the development of a model for predicting software project success that leverages the wealth of available open source project data in order to accurately model the behavior of those software engineering groups. Our model accounts for both the technical elements of software engineering as well as the social elements that drive the decisions of individual developers. We use agent-based simulations to represent the complexity of the group interactions, and base the behavior of the agents on the real software engineering data acquired. For four of the five project success measures, our results indicate that the developed model represents the underlying data well and provides accurate predictions of open source project success indicators.
Beaver, Justin M [ORNL; Cui, Xiaohui [ORNL; ST Charles, Jesse Lee [ORNL; Potok, Thomas E [ORNL
The results of a research project investigating information needs for space commercialization are described. The Space Market Model Development Project (SMMDP) was designed to help NASA identify the information needs of the business community and to explore means to meet those needs. The activity of the SMMDP is reviewed and its operation is reported in three sections. The first part contains a brief historical review of the project since inception. The next part reports results of Phase 3, the most recent stage of activity. Finally, overall conclusions and observations based on the SMMDP research results are presented.
Introduction: The present study aimed to generate a model that would provide a conceptual framework for linking disparate components of knowledge translation. A theoretical model of such would enable the organization and evaluation of attempts to analyze current conditions and to design interventions on the transfer and utilization of research…
Legislation for people with disabilities has changed due to other changes in the law, especially the recent ratification of the UN Convention on the Rights of Persons with Disabilities. These laws, in particular the UN Convention with its inclusion of the right to equitable and universal access to education for people with disabilities, and their implementation are of central importance for students with hearing impairments. As part of the GINKO project (on the effect legislation has on the professional integration of people who are hard of hearing, late-deafened, or deaf, through communication and organization; funded by the BMAS), the following questions were investigated: to what extent hearing-impaired students are aware of legislation that benefits them, whether these laws are implemented, and what factors have an impact on this legal knowledge and its implementation. Overall, 4,825 individuals with hearing impairments - including n=166 students - took part in the survey. The results for the group of hearing-impaired students indicate that many of them are not informed about laws important to them. It was also found that knowledge of a law cannot be equated with its implementation. The survey therefore points to future needs: information about legal options should be reinforced and adjusted to the needs of specific target groups, e.g. disseminated through sign language films. These results also apply to higher education institutions, which should create learning conditions in which the existing regulatory options for students with disabilities are implemented, affording students an equal opportunity to participate in higher education. PMID:23824568
Weber, A; Weber, U; Schlenker-Schulte, C; Schulte, K
Research Findings: A theory-based 2-factor structure of preschoolers' emotion knowledge (i.e., recognition of emotional expression and understanding of emotion-eliciting situations) was tested using confirmatory factor analysis. Compared to 1- and 3-factor models, the 2-factor model showed a better fit to the data. The model was found to be…
Bassett, Hideko Hamada; Denham, Susanne; Mincic, Melissa; Graling, Kelly
The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal-computer-based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption to gross domestic product, GDP), and about the rate at which incremental energy requirements are met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.
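The energy-intensity ratio described in this record, and its use in a compound-growth projection, can be sketched as follows (a minimal illustration; all figures and growth rates are invented, not EIA data, and WEPS itself is a far richer accounting framework):

```python
def energy_intensity(total_energy_consumption, gdp):
    """Energy intensity: total energy consumption divided by GDP."""
    return total_energy_consumption / gdp

def project_consumption(base_consumption, intensity_change_rate,
                        gdp_growth_rate, years):
    """Project future consumption assuming annual GDP growth and an
    annual change in energy intensity, compounded over the horizon."""
    annual_factor = (1 + gdp_growth_rate) * (1 + intensity_change_rate)
    return base_consumption * annual_factor ** years

# Hypothetical inputs: 400 units of consumption today, GDP growing 3%/yr,
# intensity declining 1%/yr, projected 10 years out.
projected = project_consumption(400.0, -0.01, 0.03, 10)
```

With these assumptions consumption still rises, because GDP growth outpaces the intensity decline.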
A number of studies have observed that the motor system is activated when processing the semantics of manipulable objects. Such phenomena have been taken as evidence that simulation over motor representations is a necessary and intermediary step in the process of conceptual understanding. Cognitive neuropsychological evaluations of patients with impairments for action knowledge permit a direct test of the necessity of motor simulation in conceptual processing. Here, we report the performance of a 47-year-old male individual (Case AA) and six age-matched control participants on a number of tests probing action and object knowledge. Case AA had a large left-hemisphere frontal-parietal lesion and hemiplegia affecting his right arm and leg. Case AA presented with impairments for object-associated action production, and his conceptual knowledge of actions was severely impaired. In contrast, his knowledge of objects such as tools and other manipulable objects was largely preserved. The dissociation between action and object knowledge is difficult to reconcile with strong forms of the embodied cognition hypothesis. We suggest that these, and other similar findings, point to the need to develop tractable hypotheses about the dynamics of information exchange among sensory, motor and conceptual processes.
Garcea, Frank E.; Dombovy, Mary; Mahon, Bradford Z.
This paper investigates the interrelationship among various measured characteristics of a software project, ranging from project model and size to the metrics used to govern the administration of the project. By analyzing various dimensions of project characteristics based on the underlying model, metrics and project technicality such as language and development paradigm, our findings reveal that certain metrics and models are not
There has been increasing attention to the integration of project-based learning into Asian school curricula, and the notion of the learning community is a heated topic for learning to learn in knowledge-based societies. Yet cognitive research on using web-based learning communities for project-based learning is still under-developed in Asian contexts. Through a scrutiny of basic functional tools
The relation between models of knowledge and learning/living in educational contexts is a crucial problem in the field of immigration studies. The biographies of immigrants show that their knowledge contains relations between learning as individuals and as groups in their various national and cultural contexts. Polish immigrants can benefit from the creation of models of biographical knowledge in different fields of
This paper analyzes the medical knowledge required for formulating decision models in the domain of pulmonary infectious diseases (PIDs) with acquired immunodeficiency syndrome (AIDS). Aiming to support dynamic decision-modeling, the knowledge characterization focuses on the ontology of the clinical decision problem. Relevant inference patterns and knowledge types are identified.
There is growing recognition that the economic climate of the world is shifting towards a knowledge-based economy where knowledge will be cherished as the most prized asset. In this regard, technology can be leveraged as a useful tool in effectually managing the knowledge capital of an organisation. Although several research studies have advanced…
Matching supply and demand for knowledge in the fields of global change and sustainability is a daunting task. Science and public policy differ in their timeframes, epistemologies, objectives, process-cycles and criteria for judging the quality of knowledge, while global change and sustainability issues involve value pluralities and large uncertainties. In literature and in practice, it is argued that joint knowledge
D. L. T. Hegger; M Lamers; A. van Zeijl-Rozema; C. Dieperink
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DOD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical communications practices of U.S. aerospace engineers and scientists affiliated with, though not necessarily belonging to, the Society of Manufacturing Engineers (SME).
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
Background Researcher-stakeholder collaboration has been identified as critical to bridging research and health system change. While collaboration models vary, meaningful stakeholder involvement over time (“integrated knowledge translation”) is advocated to improve the relevance of research to knowledge users. This short report describes the integrated knowledge translation efforts of Connections, a knowledge translation and exchange project to improve services for women with substance abuse problems and their children, and implementation barriers and facilitators. Findings Strategies of varying intensities were used to engage diverse stakeholders, including policy makers and people with lived experience, and executive directors, program managers, and service providers from Canadian addiction agencies serving women. Barriers to participation included individual (e.g., interest), organizational (e.g., funding), and system level (e.g., lack of centralized stakeholder database) barriers. Similarly, facilitators included individual (e.g., perceived relevance) and organizational (e.g., support) facilitators, as well as initiative characteristics (e.g., multiple involvement opportunities). Despite barriers, Connections’ stakeholder-informed research efforts proved essential for developing clinically relevant and feasible processes, measures, and implementation strategies. Conclusions Stakeholder-researcher collaboration is possible and robust integrated knowledge translation efforts can be productive. Future work should emphasize developing and evaluating a range of strategies to address stakeholders’ knowledge translation needs and to facilitate sustained and meaningful involvement in research.
Online databases store thousands of molecular interactions and pathways, and numerous modelling software tools provide users with an interface to create and simulate mathematical models of such interactions. However, the two most widely used standards for storing pathway data (biological pathway exchange; BioPAX) and for exchanging mathematical models of pathways (systems biology markup language; SBML) are structurally and semantically different. Conversion between formats (making data present in one format available in another format) based on simple one-to-one mappings may lead to loss or distortion of data, is difficult to automate, and is often impractical and/or erroneous. This seriously limits the integration of knowledge data and models. In this paper we introduce an approach for such integration based on a bridging format that we named systems biology pathway exchange (SBPAX) alluding to SBML and BioPAX. It facilitates conversion between data in different formats by a combination of one-to-one mappings to and from SBPAX and operations within the SBPAX data. The concept of SBPAX is to provide a flexible description expanding around essential pathway data - basically the common subset of all formats describing processes, the substances participating in these processes and their locations. SBPAX can act as a repository for molecular interaction data from a variety of sources in different formats, and the information about their relative relationships, thus providing a platform for converting between formats and documenting assumptions used during conversion, gluing (identifying related elements across different formats) and merging (creating a coherent set of data from multiple sources) data. PMID:21028923
Ruebenacker, O; Moraru, I I; Schaff, J C; Blinov, M L
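The hub-and-spoke idea behind a bridging format like SBPAX can be sketched as below: instead of writing direct A-to-B and B-to-A converters, each format gets a one-to-one mapping to and from a common hub. The record layouts and field names here are invented for illustration and do not reflect the actual BioPAX, SBML, or SBPAX schemas:

```python
# Hypothetical "BioPAX-like" record -> hub -> "SBML-like" record.
# Only the common subset (process, participants, location) passes through.

def biopax_like_to_hub(rec):
    """Map a source-format record onto the bridging (hub) format."""
    return {"process": rec["interaction"],
            "participants": rec["members"],
            "location": rec.get("cellular_location")}

def hub_to_sbml_like(hub):
    """Map the hub representation onto the target format."""
    return {"reaction": hub["process"],
            "species": hub["participants"],
            "compartment": hub["location"]}

def convert(rec):
    """Full conversion via the hub: two one-to-one mappings."""
    return hub_to_sbml_like(biopax_like_to_hub(rec))
```

The advantage claimed in the record is that n formats need only 2n mappings to and from the hub, rather than n(n-1) pairwise converters, and assumptions made during conversion can be documented once, on the hub side.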
Discusses the value of student-designed, in-depth modeling projects in a differential equations course and how to prepare students. Provides excerpts from worksheets, a list of computer software for Macintosh that can be used in teaching differential equations, and an annotated bibliography. (Author/ASK)
This article is a project that takes students through the process of forming a mathematical model of bicycle dynamics. Beginning with basic ideas from Newtonian mechanics (forces and torques), students use techniques from calculus and differential equations to develop the equations of rotational motion for a bicycle-rider system as it tips from…
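A tipping bicycle-rider system of the kind this record describes is often modeled, in its simplest form, as an inverted pendulum: the torque from gravity about the tire contact line gives I·θ'' = m·g·h·sin(θ). The sketch below integrates that equation with a basic Euler step; the parameter values and the point-mass inertia I = m·h² are illustrative assumptions, not taken from the article:

```python
import math

def simulate_tip(theta0, steps=1000, dt=0.001, m=80.0, h=1.0, g=9.81):
    """Euler-integrate I*theta'' = m*g*h*sin(theta) for an inverted
    pendulum (hypothetical bicycle-rider point mass at height h).
    Returns the lean angle after steps*dt seconds."""
    I = m * h * h            # point-mass moment of inertia about the contact line
    theta, omega = theta0, 0.0
    for _ in range(steps):
        alpha = m * g * h * math.sin(theta) / I   # angular acceleration
        omega += alpha * dt                        # update angular velocity
        theta += omega * dt                        # update lean angle
    return theta
```

Because sin(θ) > 0 for a small positive lean, the lean angle grows over time unless a steering correction is added, which is exactly the instability the full bicycle model must counteract.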
Software project planning can be one of the most critical activities in the modern software development process. Without a realistic and objective software project plan, the software development process cannot be managed in an effective way. Over-runs of 100-200% are common. Some software projects never deliver anything. Managers have difficulty understanding and visualizing the software development process defined in a
The Government is urging teachers to engage more closely with families and is promoting the concept of the "extended" school. This article reports on the literacy strand of the Home School Knowledge Exchange (HSKE) project, directed by Professor Martin Hughes at the University of Bristol. A selection of literacy activities developed during this…
The authors present an alternative approach to inverse planning optimization and apply it to volumetric modulated arc therapy (VMAT) in one rotation with prior knowledge about the type of leaf motions. The optimization is based on the projection theorem in inner product spaces. MLC motion is directly considered in the optimization, thus avoiding the leaf segmentation characteristic of IMRT optimization. In this work the method is realized for concave irregular targets encompassing an organ at risk, leading to a repetitive MLC motion pattern. Applying the projection theorem leads to a noniterative optimization method and reduces to solving a few systems of linear equations with small numbers of dimensions. The solution of the inverse problem is unique, and false minima are naturally excluded. The authors divided the full rotation into about 50 short arc segments and for each segment decomposed dose into separate contributions related to stages of MLC motion. This generally results in an inverse problem with just four free parameters per arc segment. In practice three degrees of freedom are used so that the gantry keeps a constant angular speed. Therefore the total number of degrees of freedom for a 3D problem is about 3 x 50 x the number of collimator leaf pairs for irradiating the whole target volume in one rotation. Two 2D and one 3D concave target volumes are used for a slice-by-slice optimization. A 6 MV photon beam model is used, including realistic scattering and attenuation, and a maximal leaf velocity of 3 cm/s is assumed. The resulting dose distributions cover the PTVs very well and have maxima at about 108% of the dose in the PTVs. The OAR is spared very strongly in all cases. As a result of the optimization, the MLC apertures open and close repetitively and can be interpreted in an intuitive way. Applying the projection method to this knowledge-based VMAT delivery scheme for concave target volumes is an alternative technique for dose optimization.
There are several properties, such as uniqueness of MLC motions and their continuous dependence on geometry and prescribed dose, that make this approach interesting to inverse planning. This method is still in an investigational stage, but promising results are presented. In future work it will be extended directly (without conceptual changes) in several directions to be more clinically applicable. PMID:19746810
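The core mathematical step the record describes (projection onto a small subspace, solved exactly by the normal equations rather than iteratively) can be sketched as follows. Here the "basis dose contributions" are invented two-parameter toy vectors; the real problem uses physically modeled dose contributions per arc segment:

```python
# Sketch of the projection-theorem idea: the achievable doses span a
# low-dimensional subspace; the best approximation to the prescribed
# dose b solves the normal equations A^T A x = A^T b exactly, so the
# optimum is unique and there are no false minima to search through.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_weights(a1, a2, b):
    """Least-squares weights for b onto span{a1, a2}, via Cramer's rule
    on the 2x2 normal equations (illustrative two-parameter case)."""
    g11, g12, g22 = dot(a1, a1), dot(a1, a2), dot(a2, a2)
    r1, r2 = dot(a1, b), dot(a2, b)
    det = g11 * g22 - g12 * g12
    x1 = (r1 * g22 - r2 * g12) / det
    x2 = (g11 * r2 - g12 * r1) / det
    return x1, x2
```

With roughly four free parameters per arc segment, as the record states, each system stays this small, which is why the method is noniterative.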
One of the purposes of this study was to examine the differences between knowledge of pre-service physics teachers who experienced model-based teaching in pre-service education and those who did not. Moreover, it was aimed to determine pre-service physics teachers' perceptions of modelling. Posttest-only control group experimental design was used…
We examine how knowledge-intensive firms modify their organizational knowledge bases in the context of mobility of researchers. Building on a dynamic capabilities perspective, we propose a conceptual model of firm knowledge base dynamics that clearly distinguishes between two mechanisms: (1) changes in a firm’s pool of researchers and (2) a firm’s ability to reconfigure knowledge. Our model posits that these
Kremena Slavcheva; Julio O. De Castro; Andrea Fosfuri
This paper proposes a methodology for carrying out human-machine systems engineering research and in particular focuses on the development of knowledge-based support for training and aiding. The process is characterized by the definition of knowledge requirements, a normative model structure of human-machine interaction, a knowledge architecture that implements the model, and an interactive real-time simulation environment. The knowledge requirements largely
Patricia M. Jones; Rose W. Chu; Christine M. Mitchell
The Aviation Safety Monitoring and Modeling (ASMM) Project of NASA's Aviation Safety program is cultivating sources of data and developing automated computer hardware and software to facilitate efficient, comprehensive, and accurate analyses of the data collected from large, heterogeneous databases throughout the national aviation system. The ASMM addresses the need to increase safety by enabling the identification and correction of predisposing conditions that could lead to accidents or to incidents that pose aviation risks. A major component of the ASMM Project is the Aviation Performance Measuring System (APMS), which is developing the next generation of software tools for analyzing and interpreting flight data.
LMMP was initiated in 2007 to help make the anticipated results of the LRO spacecraft useful and accessible to Constellation. The LMMP is managing and developing a suite of lunar mapping and modeling tools and products that support the Constellation Program (CxP) and other lunar exploration activities. In addition to the LRO Principal Investigators, relevant activities and expertise that had already been funded by NASA were identified at ARC, CRREL (Army Cold Regions Research & Engineering Laboratory), GSFC, JPL, and USGS. LMMP is a cost-capped, design-to-cost project (the project budget was established prior to obtaining Constellation needs).
This paper focuses on the hands-on experience of 3-D solid modeling techniques and prototyping employed in the product design and realization process. Engineering Graphics and CAD/CAM are two of the core courses in the Manufacturing Engineering program. Computer aided design and drafting, as well as solid modeling of parts, are strongly emphasized in the Engineering Graphics curriculum, which is taught in the sophomore year. Subsequently, both manual and computer aided CNC programming are covered in the CAD/CAM curriculum taught in the junior year. Computer Aided Reverse Engineering of a cork opener, a gear puller, a cell phone case, and a cell phone cover were the selected course projects taken by students in the CAD/CAM course. One of the main objectives of the course project was for the students to extend their knowledge of the design process and gain hands-on experience in the field of solid modeling and product realization. A caliper and a micrometer were used to measure the main dimensions of the parts, and a solid modeling program was used for creating the part models and assemblies. This paper describes the hands-on solid modeling and prototyping experiences of manufacturing engineering students regarding the product realization process in our program.
The M-PROJECT implements a methodology for the efficient generation of static and transient process models and their subsequent numerical solution. The knowledge-based hierarchy of the M-PROJECT framework is composed of the Material_Model, the Abstract_Constraint_Definition, the Specific_Problem_Instance and the Generic_Solver. The methodology enforces the principle of hierarchical decomposition of balance envelopes with increasing degree of detail without discrimination between
A. A. Linninger; M. Hofer; H. Krendl; H. Druckenthaner; H. P. Jörgl
The primary purpose of this study was to propose a grounded theory of knowledge transformation in the hypertext authoring process. More specifically, the present study attempted to answer the following two questions: 1) what cognitive processes are involved in knowledge transformation through hypertext authoring and 2) how are these cognitive processes interrelated. For the first question, this study identified cognitive
Failure mode and effect analysis (FMEA) is a widely used quality improvement and risk assessment tool in manufacturing. Accumulated information about design and process failures recorded through FMEA provides very valuable knowledge for future product and process design. However, the way the knowledge is captured poses considerable difficulties for reuse. This research aims to contribute to the reuse of FMEA
Good teaching is ultimately deeply and thoroughly grounded in knowledge. The days of the "if you can't do anything else, then teach" mentality are long gone, ushered out due to major changes in knowledge about effective teaching. The National Board for Professional Teaching Standards, National Council for the Accreditation of Teacher Education…
In the digital economy, knowledge is regarded as an asset in an organization, and knowledge management (KM) implementation supports a company in developing innovative products and making critical strategic management decisions for business excellence. Customer relationship management (CRM) is an information-technology-enabled management tool that manages relationships with customers in order to understand, target, and attract them, with the objective of satisfying
The authors present 2 experiments that establish the presence of knowledge partitioning in perceptual categorization. Many participants learned to rely on a context cue, which did not predict category membership but identified partial boundaries, to gate independent partial categorization strategies. When participants partitioned their knowledge,…
The analysis of the evolution of knowledge is distinguished from standard economics and neo-Darwinian biology; it combines purpose with the impossibility of empirical proof. Adam Smith’s psychological theory includes motivation, aesthetics, invention, diffusion, renewed search, the speciation of knowledge and the division of labour. Knightian uncertainty gives rise to both routines and imagination; variation and selection require a baseline. Organisation
The issue of knowledge management in a distributed network is receiving increasing attention from both scientific and industrial organizations. Research efforts in this field are motivated by the awareness that knowledge is more and more perceived as a primary economic resource and that, in the context of organization of organizations, the…
Temporality is an essential characteristic of information: not only data but also knowledge is temporal in nature. With the development of database and information technology, temporal information plays a very important role in information systems and is even decisive for some of them (e.g. electronic government affairs and decision support systems based on temporal policy knowledge). In this paper temporal index
Tang Yong; Tang Na; Ye XiaoPing; Feng ZhiSheng; Xiao Wei
This article describes the experience of knowledge translation between researchers of the ITSAL (immigration, work and health) project and representatives of organizations working with immigrants to discuss the results obtained in the project and future research lines. A meeting was held, attended by three researchers and 18 representatives from 11 institutions. Following a presentation of the methodology and results of the project, the participants discussed the results presented and research areas of interest, thus confirming matches between the two sides and obtaining proposals of interest for the ITSAL project. We understand the process described as an approach to social validation of some of the main results of this project. This experience has allowed us to open a channel of communication with the target population of the study, in line with the necessary two-way interaction between researchers and users. PMID:24309520
Ronda, Elena; López-Jacob, M José; Paredes-Carbonell, Joan J; López, Pilar; Boix, Pere; García, Ana M
Research Findings: A theory-based 2-factor structure of preschoolers’ emotion knowledge (i.e., recognition of emotional expression and understanding of emotion-eliciting situations) was tested using confirmatory factor analysis. Compared to 1- and 3-factor models, the 2-factor model showed a better fit to the data. The model was found to be equivalent for gender, race, age, and socioeconomic risk. Theory and the high
Hideko Hamada Bassett; Susanne Denham; Melissa Mincic; Kelly Graling
A semi-structured interview was used to enquire into the knowledge of models and modelling held by a total sample of 39 Brazilian science teachers working in 'fundamental' (ages 6-14 years) and 'medium' (ages 15-17 years) schools, student teachers, and university teachers. This paper focuses on their perceptions of the role of models in science teaching. The teachers' ideas are organized
Purpose: To develop a technique to estimate onboard 4D-CBCT using prior information and limited-angle projections for potential 4D target verification of lung radiotherapy. Methods: Each phase of onboard 4D-CBCT is considered as a deformation from one selected phase (prior volume) of the planning 4D-CT. The deformation field maps (DFMs) are solved using a motion modeling and free-form deformation (MM-FD) technique. In the MM-FD technique, the DFMs are estimated using a motion model which is extracted from planning 4D-CT based on principal component analysis (PCA). The motion model parameters are optimized by matching the digitally reconstructed radiographs of the deformed volumes to the limited-angle onboard projections (data fidelity constraint). Afterward, the estimated DFMs are fine-tuned using a FD model based on the data fidelity constraint and deformation energy minimization. The 4D digital extended-cardiac-torso phantom was used to evaluate the MM-FD technique. A lung patient with a 30 mm diameter lesion was simulated with various anatomical and respiratory changes from planning 4D-CT to onboard volume, including changes of respiration amplitude, lesion size and lesion average-position, and phase shift between lesion and body respiratory cycle. The lesions were contoured in both the estimated and “ground-truth” onboard 4D-CBCT for comparison. 3D volume percentage-difference (VPD) and center-of-mass shift (COMS) were calculated to evaluate the estimation accuracy of three techniques: MM-FD, MM-only, and FD-only. Different onboard projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. Results: For all simulated patient and projection acquisition scenarios, the mean VPD (±S.D.)/COMS (±S.D.) between lesions in prior images and “ground-truth” onboard images were 136.11% (±42.76%)/15.5 mm (±3.9 mm).
Using orthogonal-view 15°-each scan angle, the mean VPD/COMS between the lesion in estimated and “ground-truth” onboard images for MM-only, FD-only, and MM-FD techniques were 60.10% (±27.17%)/4.9 mm (±3.0 mm), 96.07% (±31.48%)/12.1 mm (±3.9 mm) and 11.45% (±9.37%)/1.3 mm (±1.3 mm), respectively. For orthogonal-view 30°-each scan angle, the corresponding results were 59.16% (±26.66%)/4.9 mm (±3.0 mm), 75.98% (±27.21%)/9.9 mm (±4.0 mm), and 5.22% (±2.12%)/0.5 mm (±0.4 mm). For single-view scan angles of 3°, 30°, and 60°, the results for the MM-FD technique were 32.77% (±17.87%)/3.2 mm (±2.2 mm), 24.57% (±18.18%)/2.9 mm (±2.0 mm), and 10.48% (±9.50%)/1.1 mm (±1.3 mm), respectively. For projection angular-sampling-intervals of 0.6°, 1.2°, and 2.5° with the orthogonal-view 30°-each scan angle, the MM-FD technique generated similar VPD (maximum deviation 2.91%) and COMS (maximum deviation 0.6 mm), while sparser sampling yielded larger VPD/COMS. With an equal number of projections, the estimation results using a scattered 360° scan angle were slightly better than those using the orthogonal-view 30°-each scan angle. The estimation accuracy of the MM-FD technique declined as the noise level increased. Conclusions: The MM-FD technique substantially improves the estimation accuracy for onboard 4D-CBCT using prior planning 4D-CT and limited-angle projections, compared to the MM-only and FD-only techniques. It can potentially be used for inter/intrafractional 4D-localization verification.
Zhang, You [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27710 (United States)]; Yin, Fang-Fang; Ren, Lei [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27710 and Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 (United States)]; Segars, W. Paul [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27710 and Department of Radiology, Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States)]
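The two evaluation metrics quoted in the record above, volume percentage-difference (VPD) and center-of-mass shift (COMS), can be sketched for binary lesion masks represented as sets of voxel coordinates. The definitions below follow common usage; the exact formulas in the paper may differ:

```python
def vpd(estimated, truth):
    """Volume percentage-difference: the non-overlapping volume
    (symmetric difference) relative to the ground-truth volume, in %."""
    return 100.0 * len(estimated ^ truth) / len(truth)

def coms(estimated, truth):
    """Center-of-mass shift: Euclidean distance between the centroids
    of the two masks (voxel coordinates as 3-tuples)."""
    def com(mask):
        n = len(mask)
        return tuple(sum(v[i] for v in mask) / n for i in range(3))
    ce, ct = com(estimated), com(truth)
    return sum((a - b) ** 2 for a, b in zip(ce, ct)) ** 0.5
```

Identical masks give VPD = 0% and COMS = 0; a mask shifted by one voxel along an axis shifts the centroid accordingly.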
Pedagogies for knowledge management courses are still undeveloped. This Teaching Tip introduces a design thinking approach to teaching knowledge management. An induction model used to guide students' real-life projects for knowledge management is presented. (Contains 1 figure.)
The Lunar Mapping and Modeling Project (LMMP) is managing a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. Although the project was initiated primarily to serve the needs of the Constellation program, it is equally suited for supporting landing site selection and planning for a variety of robotic missions, including NASA science and/or human precursor missions and commercial missions such as those planned by the Google Lunar X-Prize participants. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities.
We propose a novel framework for performing quantitative Bayesian inference based on qualitative knowledge. Here, we focus on the treatment in the case of inconsistent qualitative knowledge. A hierarchical Bayesian model is proposed for integrating inconsistent qualitative knowledge by calculating a prior belief distribution based on a vector of knowledge features. Each inconsistent knowledge component uniquely defines a model class in the hyperspace. A set of constraints within each class is generated to describe the uncertainty in ground Bayesian model space. Quantitative Bayesian inference is approximated by model averaging with Monte Carlo methods. Our method is firstly benchmarked on ASIA network and is applied to a realistic biomolecular interaction modeling problem for breast cancer bone metastasis. Results suggest that our method enables consistently modeling and quantitative Bayesian inference by reconciling a set of inconsistent qualitative knowledge. PMID:18272332
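The model-averaging step described in the record above, where each inconsistent knowledge component defines its own model class and Monte Carlo draws are averaged across classes, can be sketched in a deliberately minimal form. The uniform priors and equal weights here are invented stand-ins for the paper's hierarchical prior belief distribution:

```python
import random

def model_average(priors, weights, n=10000, seed=0):
    """Approximate a posterior expectation by averaging Monte Carlo
    draws across inconsistent knowledge components.

    priors:  list of (low, high) uniform ranges, one per component
             (hypothetical stand-in for each component's model class)
    weights: prior belief assigned to each component
    """
    rng = random.Random(seed)
    total = sum(weights)
    estimate = 0.0
    for (lo, hi), w in zip(priors, weights):
        draws = [rng.uniform(lo, hi) for _ in range(n)]   # sample this class
        estimate += (w / total) * (sum(draws) / n)        # weighted average
    return estimate
```

Two components that disagree about a parameter (say, one believing it lies in [0, 1] and the other in [1, 2]) are reconciled into a single estimate rather than forcing a choice between them, which is the reconciliation idea the record describes.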
The paper addresses the issue of measuring team knowledge. Different, though related, views on team knowledge, namely the transactive memory system and team mental models, are discussed. A transactive memory system is a concept of a group memory. It consists of the individual expertise of team members as well as their knowledge of
Introduction: Knowledge translation (KT) has only recently emerged in the field of rehabilitation with attention on creating effective KT interventions to increase clinicians' knowledge and use of evidence-based practice (EBP). The uptake of EBP is a complex process that can be facilitated by the use of the Knowledge to Action Process model. This…
The objective of this study is to use the knowledge acquisition model of inductive learning to establish a selection system for coordinate measuring machines. In this paper, an example knowledge base for the selection of a coordinate measuring machine was induced and analyzed from the diffusive information in the example. The goal is to reduce the amount of knowledge related
In this paper Chinese foreign invested enterprises (FIEs) are employed as prototypes to generate a model of how transnationals can transfer both tacit and explicit knowledge between their units as well as between FIEs and the parent organization. We propose that successful intra-organization knowledge transfer depends upon: (1) collective creation of knowledge as intellectual and social capital available throughout the
Rural Japanese women have been overlooked or misrepresented in the academic and nationalist discourses on Japanese women. Using an anti-colonial feminist framework, I advocate that centering discussions on Indigenous knowledges will help fill this gap, based on the belief that an Indigenous-knowledge framework is a tool to show the agency of the…
Evaluated the impact of a reproductive health community mobilization initiative in Cameroon. Baseline and follow-up survey data indicated that at a rural site, the intervention positively influenced family planning knowledge and practices, HIV/AIDS and sexually transmitted disease knowledge and attitudes, and use of health services. At an urban…
This article describes and references the relevant literature related to knowledge-based simulation. There are essentially ten areas of literature that would likely contain relevant articles. They are the management science/operations research literature,...
From the combined use of different observations/parameters, from the refinement of data analysis methods and from the development of suitable physical models, we expect major progress in the research on earthquakes' preparatory phases. Reduced false alarm rates and improved reliability and precision (in the space-time domain) of predictions are expected from a multi-parameter, multi-disciplinary observational research strategy rather than from a single-parameter approach. Less than one year after its start, the PRE-EARTHQUAKES FP7 Project has already demonstrated its capability to bring together independent expertise and different observation capabilities in order: a) to substantially improve our knowledge of the preparatory phases of earthquakes and their possible precursors; b) to promote a worldwide Earthquake Observation System (EQuOS) as a dedicated component of GEOSS (Global Earth Observation System of Systems); c) to develop and offer to the international scientific community an integration platform where independent observations and new data analysis methodologies devoted to the research on earthquake precursors can be collected and cross-validated. In this paper, the results achieved so far will be presented, in particular on the earthquakes selected as test cases that occurred in recent years in Italy (M6.3, Abruzzo, April 2009), Sakhalin (M6.2, Nevelsk, August 2007) and Turkey (M6.1, Elazig, March 2010), emphasizing the significant added value provided by a multi-parameter, multi-disciplinary strategy.
Knowledge is the foundation upon which researchers build as they innovate. Innovation lies at the core of a state's or a firm's ability to survive in a competitive world. Indeed, some economic historians aver that technological innovation, not trade, is the engine of economic growth. Despite the centrality of knowledge to corporate success, analysts have only recently shown an interest in the "knowledge capital" or "intellectual capital" of the firm, often literally trying to assign a value to this resource.
Climate and weather prediction hinge on numerical models. Most of the climate models included in the Coupled Model Intercomparison Project 5 (CMIP5), which will underpin the Intergovernmental Panel for Climate Change 5th Assessment Report (IPCC AR5), include a dust module because dust is known to play an important role in the Earth system. However, dust emission schemes in climate models are relatively simple and are tuned to represent observed background aerosol concentrations, most of which are many thousands of kilometres from source regions. The physics of dust emission in the models was developed from idealised experiments such as those conducted in wind tunnels decades ago. Improvement of current model dust emission schemes has been difficult to achieve because of the paucity of observations from key dust sources. Dust Observations for Models (DO4Models) is a project designed to gather data from source regions at a scale appropriate to climate model grid box resolution. The UK NERC-funded project, led by the University of Oxford, aims to: 1) generate a data set at an appropriate scale for climate models which characterises surface erodibility and erosivity in dust source areas from remote sensing and fieldwork; 2) quantify how observed erodibility and erosivity influence observed emissions at the climate model scale; 3) test, develop and optimise the dust emission scheme for the Met Office regional model (HadGEM3-RA) using this unique dust source area data set; 4) quantify which component(s) of observed erodibility and erosivity, and at what spatial scale, make the largest improvement to physically-based, observationally optimised dust emission simulations in climate models. This paper provides a project overview and some early observational and modelling results from the 2011 field season.
Washington, R.; Wiggs, G.; King, J.; Thomas, D. S.; Woodward, S.; Eckardt, F. D.; Haustein, K.; Vickery, K.; Bryant, R. G.; Nield, J. M.; Murray, J.; Brindley, H.; Jones, R.
A basic principle in data modelling is to incorporate available a priori information regarding the underlying data generating mechanism into the modelling process. We adopt this principle and consider grey-box radial basis function (RBF) modelling capable of incorporating prior knowledge. Specifically, we show how to explicitly incorporate the two types of prior knowledge: the underlying data generating mechanism exhibits known
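The grey-box idea in the record above, encoding the known data-generating mechanism directly and letting the RBF network model only what remains unexplained, can be sketched as follows. The linear "known mechanism", the kernel centres, and the width are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a known linear trend (the prior knowledge) plus an
# unknown smooth component and a little observation noise.
x = np.linspace(-3, 3, 200)
y = 2.0 * x + np.sin(x) + 0.05 * rng.standard_normal(x.size)

known_slope = 2.0                 # a priori mechanistic knowledge
residual = y - known_slope * x    # the RBF models only the unexplained part

# Gaussian RBF design matrix with assumed centres and width.
centres = np.linspace(-3, 3, 10)
width = 0.8
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width**2))

# Least-squares fit of the RBF weights to the residual.
w, *_ = np.linalg.lstsq(Phi, residual, rcond=None)

# Grey-box prediction = known mechanism + fitted RBF correction.
y_hat = known_slope * x + Phi @ w
rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
print(f"RMSE of grey-box fit: {rmse:.4f}")
```

Because the known slope is handled exactly, the RBF's entire capacity goes to the residual, and the fit error drops to roughly the noise level.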
Integral projection models (IPMs) are increasingly being applied to study size-structured populations. Here we call attention to a potential problem in their construction that can have important consequences for model results. IPMs are implemented using an approximating matrix and bounded size range. Individuals near the size limits can be unknowingly "evicted" from the model because their predicted future size is outside the range. We provide simple measures for the magnitude of eviction and the sensitivity of the population growth rate (lambda) to eviction, allowing modelers to assess the severity of the problem in their IPM. For IPMs of three plant species, we found that eviction occurred in all cases and caused underestimation of the population growth rate (lambda) relative to eviction-free models; it is likely that other models are similarly affected. Models with frequent eviction should be modified because eviction is only possible when size transitions are badly mis-specified. We offer several solutions to eviction problems, but we emphasize that the modeler must choose the most appropriate solution based on an understanding of why eviction occurs in the first place. We recommend testing IPMs for eviction problems and resolving them, so that population dynamics are modeled more accurately. PMID:23094372
Williams, Jennifer L; Miller, Tom E X; Ellner, Stephen P
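The eviction diagnostic described in the record above can be computed directly from a growth kernel: integrate the probability that an individual's predicted future size falls outside the model's bounded size range. A small sketch with an assumed Gaussian growth kernel; the parameters and size range are invented for illustration, not taken from the three plant-species IPMs.

```python
import math

# Illustrative growth kernel: next size ~ Normal(a + b*z, sd).
# All parameter values below are assumptions for the sketch.
a, b, sd = 0.5, 0.95, 0.3
L, U = 0.0, 10.0   # bounded size range of the approximating matrix

def norm_cdf(x, mu, sigma):
    """Standard normal CDF evaluated at x for mean mu, sd sigma."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def eviction(z):
    """Probability that an individual of size z is predicted outside [L, U]."""
    mu = a + b * z
    return norm_cdf(L, mu, sd) + (1.0 - norm_cdf(U, mu, sd))

# Evaluate on a 100-point midpoint mesh, as an IPM implementation would.
mesh = [L + (U - L) * (i + 0.5) / 100 for i in range(100)]
worst = max(eviction(z) for z in mesh)
print(f"max eviction probability on mesh: {worst:.4f}")
```

For this illustrative kernel, individuals at the largest mesh sizes lose a substantial share of their transition mass past the upper bound, exactly the silent leak the authors recommend testing for before trusting the estimated growth rate.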
APPL is a research-based organization that serves NASA program and project managers, as well as project teams, at every level of development. In 1997, APPL was created from an earlier program to underscore the importance that NASA places on project manage...
The large-scale research program Topo-Iberia aims to unravel the complex structure and mantle processes in the area of interaction between the African and European continental plates in the western Mediterranean. The project, funded by the Spanish Ministry of Science and Education, started in 2007 and will be active till Fall 2013. Topo-Iberia has gathered expertise from different fields of the Earth Sciences. One of the key assets of the project is the deployment of a technological observatory platform, IberArray, with unprecedented resolution and coverage. This platform is currently building up a comprehensive, multidisciplinary data set, stored in the SIGEOF database, which includes seismological, GPS and magnetotelluric data. Using also the other analytical methodologies included in the Topo-Iberia program (potential fields, quantitative analysis of the topography, dating methods), the final scope of the project is to study the relationship between superficial and deep-rooted processes. Topo-Iberia has also benefited from interaction with other projects investigating the same area, such as the American program PICASSO, the French Pyrope or the Portuguese WILAS. This interaction includes sharing the available data to better address the key geological questions. This contribution will present the current state of the most significant scientific results arising from the data acquired using the IberArray platform:
- SKS splitting analysis has provided a spectacular image of the anisotropic pattern over the area, including a clear rotation of the fast velocity direction along the Gibraltar Arc.
- Receiver functions have revealed the crustal thickness variations beneath the Rif and southern Iberia, including a crustal root beneath the Rif not clearly documented previously. The 410-km and 660-km upper mantle discontinuities have been investigated using novel cross-correlation/stacking techniques.
- Surface wave tomography using both earthquakes and ambient noise allows describing the main characteristics of crustal structure. Local body-wave tomography, currently focused on Northern Morocco, has improved the location of the small-magnitude events affecting the area and the details of the crustal structure. Teleseismic tomography has confirmed, using an independent data set, the presence of a high-velocity slab beneath the Gibraltar Arc.
- A number of 2-D magnetotelluric (MT) profiles have been acquired in Iberia and Morocco. These MT profiles provide a 1500 km long N-S lithospheric transect extending from the Cantabrian Mountains to the Atlas.
- The Topo-Iberia GPS deployments acquired long-term time series of data, allowing well-resolved determinations of the relatively small velocity displacements affecting the region.
Additional high-resolution active-source seismic experiments recently carried out in the Atlas, the Rif and the Central and Iberian Massifs, piggybacking on this large-scale project, are complementing this multidisciplinary database. These new data provide a large number of physical observables to better constrain numerical models at the lithospheric scale, which will result in high-quality lithospheric transects.
The Digital Mind Modeling Project by MindPixel invites Web users to contribute to the creation of the first statistical model of human thought. The Canadian scientist Chris McKinstry, who founded the project, "hopes to be able to teach a computer what it means to be human" by using an approach similar to seti@home "to extract the entire content of an average person's mind bit by literal bit from millions of different internet users." After about 10 years of running, the final collection will be available to other artificial intelligence researchers. For now, visitors can register using an online form to access the Mindpixel News System, which offers the latest news pertaining to the mind and mind-related science. Internet users can also register and make their contribution to science by talking to the online system, which the author calls GAC, pronounced "Jack." Contributors earn voting rights "that will give them a say in every aspect of how the project is run, from data collection and use to the distribution of data and research funds."
What does it take to create and implement a 7th to 8th grade middle school program where the great majority of students achieve at high academic levels regardless of their previous elementary school backgrounds? This was the major question that guided the research and development of a 7-year long project effort entitled the Chancellor's Model School Project (CMSP) from September 1991 to August 1998. The CMSP effort, conducted largely in two New York City public schools, was aimed at creating and testing a prototype 7th and 8th grade model program that was organized and test-implemented in two distinct project phases: Phase I of the CMSP effort was conducted from 1991 to 1995 as a 7th to 8th grade extension of an existing K-6 elementary school, and Phase II was conducted from 1995 to 1998 as a 7th to 8th grade middle school program that became an integral part of a newly established 7-12th grade high school. In Phase I, the CMSP demonstrated that with a highly structured curriculum coupled with strong academic support and increased learning time, students participating in the CMSP were able to develop a strong foundation for rigorous high school coursework within the space of 2 years (at the 7th and 8th grades). Mathematics and Reading test score data during Phase I of the project clearly indicated that significant academic gains were obtained by almost all students -- at both the high and low ends of the spectrum -- regardless of their previous academic performance in the K-6 elementary school experience. The CMSP effort expanded in Phase II to include a fully operating 7-12 high school model. Achievement gains at the 7th and 8th grade levels in Phase II were tempered by the fact that incoming 7th grade students' academic background at the CMSP High School was significantly lower than that of students participating in Phase I.
Student performance in Phase II was also affected by the broadening of the CMSP effort from a 7-8th grade program to a fully functioning 7-12 high school which as a consequence lessened the focus and structure available to the 7-8th grade students and teachers -- as compared to Phase I. Nevertheless, the CMSP does represent a unique curriculum model for 7th and 8th grade students in urban middle schools. Experience in both Phase I and Phase II of the project allowed the CMSP to be developed and tested along the broad range of parameters and characteristics that embody an operating public school in an urban environment.
This paper introduces a hybrid software process simulation model that combines two common approaches to simulation - discrete event and system dynamics. The purpose of the model is to support software project estimation as well as project management. Although the model was originally constructed for a specific software organization, its use was never realized by that project. The model has
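The hybrid structure described in the record above can be pictured as a discrete-event layer (task completions) driving a system-dynamics layer (a continuously evolving state such as staff productivity). The sketch below is a generic illustration under invented rates and parameters, not a reconstruction of the paper's model.

```python
import heapq

def simulate(tasks=20, dt=0.1):
    """Hybrid sketch: discrete task-completion events interleaved with
    Euler integration of a continuous productivity state."""
    productivity = 1.0       # system-dynamics state (tasks per unit time)
    learning_rate = 0.02     # assumed growth rate of productivity while working
    clock, done, events = 0.0, 0, []
    # first discrete event: the first task finishes after 1/productivity
    heapq.heappush(events, (1.0 / productivity, "task_done"))
    while done < tasks:
        next_time, _ = heapq.heappop(events)
        # integrate the continuous state up to the next discrete event
        while clock + dt < next_time:
            clock += dt
            productivity += learning_rate * productivity * dt
        clock = next_time
        done += 1
        # schedule the next completion using the current productivity
        heapq.heappush(events, (clock + 1.0 / productivity, "task_done"))
    return clock, productivity

finish, prod = simulate()
print(f"20 tasks finished at t={finish:.2f}, productivity={prod:.2f}")
```

With these invented parameters the 20 tasks finish in fewer than 20 time units, because the continuous learning loop shortens each successive discrete task; that feedback between the two layers is the point of hybrid simulation.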
The Airspace Systems Program (ASP) has identified a set of goals based on projections of annual passenger demands. The topics of discussion include: 1) Virtual Airspace Modeling and Simulation Project (VAMS) Project Description; 2) VAMS Project Management; 3) VAMS Project Schedule; and 4) Technical Interchange Meeting (TIM). This paper is in viewgraph form.
Web-based Learning Management Systems (LMS) allow instructors and students to share instructional materials, make class announcements, submit and return course assignments, and communicate with each other online. Previous LMS-related research has focused on how these systems deliver and manage instructional content with little concern for how students' constructivist learning can be encouraged and facilitated. This study investigated how students use LMS to interact, collaborate, and construct knowledge within the context of a group project but without mediation by the instructor. The setting for this case study was students' use of the local LMS in one upper-level biology course, within the context of a course-related group project: a mock National Institutes of Health grant proposal. Twenty-one groups (82 students) voluntarily elected to use the LMS, representing two-thirds of all students in the course. Students' peer-to-peer messages within the LMS, event logs, online surveys, focus group interviews, and instructor interviews were used in order to answer the study's overarching research question. The results indicate that students successfully used the LMS to interact and, to a significant extent, collaborate, but there was very little evidence of knowledge construction using the LMS technology. It is possible that the ease and availability of face-to-face meetings as well as problems and limitations with the technology were factors that influenced whether students' online basic interaction could be further distinguished as collaboration or knowledge construction. Despite these limitations, students found several tools and functions of the LMS useful for their online peer interaction and completion of their course project. Additionally, LMS designers and implementers are urged to consider previous literature on computer-supported collaborative learning environments in order to better facilitate independent group projects within these systems.
Further research is needed to identify the best types of scaffolds and overall technological improvements in order to provide support for online collaboration and knowledge construction.
This work presents an application of the CommonKADS methodology to define a knowledge model for a therapy administration task, reusing CommonKADS task templates. A knowledge-based system for solving the problem of phytosanitary strategy selection in greenhouses has been constructed using this knowledge model. As a result, we have defined a therapy administration task template that can be reused to
Isabel María Del Águila; Joaquín Cañadas; Alfonso Bosch; Samuel Túnez; Roque Marín
Abstract. This paper reports on a case study of applying the general-purpose and widely accepted methodology, CommonKADS, to a clinical practice guideline. CommonKADS is focussed on obtaining a compact knowledge model. However, guidelines usually contain incomplete and ambiguous knowledge. So, the resulting knowledge model will be incomplete and we will need to detect what parts of the guideline knowledge are
Maria Taboada; Maria Meizoso; Diego Martínez; S. Tellado
This study evaluated the extent to which 386 instructors in the North Carolina Community College System (NCCCS) use certain modes of knowledge utilization, and the effectiveness thereof as seen by the NCCCS instructors. It also examined differences in mode use and effectiveness among vocational, technical, and college transfer instructors as well…
Due to the 'year 2007 problem' (2007 is the beginning year when the baby-boomers reach the age of retirement), the need to plan for skill succession is now widely recognized in Japan. In addition, mergers and acquisitions (M&A) and overseas manufacturing are accelerating the need for enterprise knowledge management with a "common language". We are studying an intelligent information platform in
R. Fujiwara; N. Yokota; T. Shimano; D. Kushida; A. Kitamura
Alternating-time Temporal Logic (ATL) is a logic developed by Alur, Henzinger, and Kupferman for reasoning about coalitional powers in multiagent systems. Since what agents can achieve in a specific state will in general depend on the knowledge they have in that state, in an earlier paper we proposed Alternating-time Temporal Epistemic Logic (ATEL), an epistemic extension to ATL which is
Purpose – This paper aims to investigate the factors influencing the adoption and diffusion of knowledge management systems (KMSs) in Australia. Design/methodology/approach – A qualitative field study was undertaken, in which six Australian organizations of various sizes, all in various stages of KMS adoption and diffusion, were studied via face-to-face interviews with key personnel in the organizations. Findings – A
A Building Information Model is a digital representation of physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definition of COTS components, status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and there is a large amount of special equipment and mechanisms involved as a fundamental part of the facility. The detail design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software which can be used before starting the construction phase for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach will allow efficient planning of the construction sequence (4D). This is a powerful tool to study and analyze in detail alternative construction sequences and ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational database can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST Enclosures as application examples.
Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro
This is a free internet-based module for high school physics on the topic of electrostatics. The lesson begins with a short video clip of a car catching on fire during refueling. The driving question of the unit is: what caused the fire and how can we use a knowledge of electrostatics to prevent these accidents? The module provides students with modeling experience, digital note-taking, and guided investigation. It blends hands-on labs with computer simulations and interactive tutorials. Editor's Note: Prior registration is required to access this resource. Click below for a direct link to the registration page: WISE Teacher Registration (University of California-Berkeley) WISE is a free online science environment offering standards-based modules designed to complement traditional classroom instruction. SEE RELATED MATERIALS for a link to the full collection.
Second language vocabulary acquisition has been modeled both as multidimensional in nature and as a continuum wherein the learner's knowledge of a word develops along a cline from recognition through production. In order to empirically examine and compare these models, the authors assess the degree to which the Vocabulary Knowledge Scale (VKS;…
Stewart, Jeffrey; Batty, Aaron Olaf; Bovee, Nicholas
A considerable proportion of the adaptivity of an Intelligent Tutoring System is due to the Student Model. Bayesian networks have been used in student modeling in order to adapt the tutoring system to the student's knowledge. However, a Bayesian network requires evidence in order to make inferences. This article presents a proposal to measure and update knowledge evidences
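The evidence-update step the record describes can be illustrated with the smallest possible network: one hidden knowledge node and one observed-answer node, updated by Bayes' rule. The prior, slip, and guess probabilities below are illustrative assumptions, not values from the article.

```python
# Minimal sketch of updating a knowledge estimate from answer evidence,
# in the spirit of a two-node Bayesian network (Knows -> Correct).
p_knows = 0.5    # prior belief that the student has mastered the skill
p_slip = 0.1     # P(incorrect answer | student knows)     -- assumed
p_guess = 0.2    # P(correct answer | student does not know) -- assumed

def update(p_knows, correct):
    """Posterior P(knows | observed answer) by Bayes' rule."""
    if correct:
        num = p_knows * (1 - p_slip)
        den = num + (1 - p_knows) * p_guess
    else:
        num = p_knows * p_slip
        den = num + (1 - p_knows) * (1 - p_guess)
    return num / den

# Feed in a short sequence of observed answers as evidence.
for obs in (True, True, False):
    p_knows = update(p_knows, obs)
    print(f"after {'correct' if obs else 'incorrect'} answer: {p_knows:.3f}")
```

Two correct answers raise the belief to about 0.95; the subsequent incorrect answer pulls it back to about 0.72, which is exactly the kind of evidence-driven adaptivity the student model provides to the tutor.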
To view contemporary Science as an industry is a very apt and timely stance. Ghassib's (2010) historical analysis of knowledge production, which he terms "A Productivist Industrial Model of Knowledge Production," is an interesting one. It is important, however, to observe that the outline of this model is based entirely on the production of…
By its very nature, software development consists of many knowledge-intensive processes. One of the most difficult to model, however, is requirements elicitation. This paper presents a mathematical model of the requirements elicitation process that clearly shows the critical role of knowledge in its performance. One meta-process of requirements elicitation, selection of an appropriate elicitation technique, is also captured in
This paper gives an overview of the work done to develop a task ontology based on the task templates provided by CommonKADS. It discusses how modeling of KBS is done in CommonKADS and also gives a brief overview of OWL-S, which is an upper-level ontology for web services. Constructs of the OWL-S ontology have been compared with constructs proposed by CommonKADS
In this paper we discuss the impact of differing knowledge structure measurement techniques on assessing instructor mental models for behaviors associated with Situation Awareness. Our goals were, first, to investigate the degree to which an expert model ...
Fowlkes, J.; Martin-Milham, L.; Oser, R. L.; Fiore, S. M.
Research on collection, acquisition, multiplication, keeping and dissemination of organizational knowledge is highly supported through private incentives, government institutions and international projects. Many of those projects resulted in numerous applications: knowledge management portals, intensifying communications, knowledge management communities, centers of excellence and new tools and models of knowledge management. Although open in general to all who have new ideas and
The Lunar Mapping and Modeling Project (LMMP) has been created to manage the development of a suite of lunar mapping and modeling products that support the Constellation Program (CxP) and other lunar exploration activities, including the planning, design, development, test and operations associated with lunar sortie missions, crewed and robotic operations on the surface, and the establishment of a lunar outpost. The information provided through LMMP will assist CxP in: planning tasks in the areas of landing site evaluation and selection, design and placement of landers and other stationary assets, design of rovers and other mobile assets, developing terrain-relative navigation (TRN) capabilities, and assessment and planning of science traverses.
Noble, Sarah K.; French, R. A.; Nall, M. E.; Muery, K. G.
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DOD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical report, and present the results of research that investigated aerospace knowledge diffusion vis-a-vis U.S. aerospace engineering faculty and students.
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DOD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical report, and present the results of research that investigated aerospace knowledge diffusion vis-a-vis the technical communications practices of U.S. aerospace engineers and scientists who are members of the American Institute of Aeronautics and Astronautics (AIAA).
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DOD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical report, and present the results of research that investigated aerospace knowledge diffusion vis-a-vis the technical communications practices of U.S. aerospace engineers and scientists affiliated with the Society of Automotive Engineers (SAE).
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DOD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical report, and present the results of research that investigated aerospace knowledge diffusion vis-a-vis U.S. academic librarians and technical information specialists as information intermediaries.
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DoD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical report, and present the results of research that investigated aerospace knowledge diffusion vis-a-vis U.S. aerospace industry librarians and technical information specialists as information intermediaries.
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
The U.S. government technical report is a primary means by which the results of federally funded research and development (R&D) are transferred to the U.S. aerospace industry. However, little is known about this information product in terms of its actual use, importance, and value in the transfer of federally funded R&D. To help establish a body of knowledge, the U.S. government technical report is being investigated as part of the NASA/DOD Aerospace Knowledge Diffusion Research Project. In this report, we summarize the literature on technical reports and provide a model that depicts the transfer of federally funded aerospace R&D via the U.S. government technical report. We present results from our investigation of aerospace knowledge diffusion vis-a-vis the U.S. government technical report, and present the results of research that investigated aerospace knowledge diffusion vis-a-vis the production and use of information by U.S. aerospace engineers and scientists who had changed their American Institute of Aeronautics and Astronautics (AIAA) membership from student to professional in the past five years.
Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Defining an appropriate role for expert knowledge in science can lead to contentious debate. The professional experience of ecologists, elicited as expert judgment, plays an essential role in many aspects of landscape ecological science. Experts may be asked to judge the relevance of competing research or management questions, the quality and suitability of available data, the best balance of complexity
Despite a 50-year interdisciplinary and longitudinal research legacy--showing that nearly 80% of young people considered most "at risk" thrive by midlife--only recently have practitioners/researchers engaged in the explicit, prospective facilitation of "resilience" in educational settings. Here, theory/knowledge distinguishing and extending risk…
This paper reviews books and research papers concerned with Indigenous science knowledge and its integration into school curricula and describes current efforts to bridge Western and Native science. "A Yupiaq World View: Implications for Cultural, Educational and Technological Adaptation in a Contemporary World" (Angayuqaq Oscar Kawagley)…
Knowledge management (KM) is becoming recognized as a valuable tool for the Department of Defense (DoD) in its effort to maintain a competitive, strategic advantage against its enemies in a new threat environment. Decision superiority is the ultimate end ...
The next 10 years provide an opportunity for the European Union (EU) to renew the science and technology (S&T) base and build necessary knowledge-society capacities and capabilities in Pre-Accession Countries (PACs). Applied research has faced a major downsize; redressing the balance in the research and development systems is urgently needed.…
There are various forces driving change in the knowledge and skills areas for information professionals: 1) technologies, 2) changing environments, and 3) the changing role of information technology management. These forces affect all levels of information technology-based professionals--those responsible for information processing and those responsible for information services. This paper discusses and reviews the pertinent literature that deals with the
Edwin M. Cortez; Edward Kazlauskas; Sanjay K. Dutta
The analyses presented in this paper document the impact of a community mobilization effort in Cameroon. Between 1997 and 1998, a local non-governmental organization worked with community associations, Njangi, in one urban and one rural location to promote knowledge and positive practices concerning family planning, sexually transmitted diseases, and treatment of common childhood diseases. Based on a multi-tiered structure, the
Medical students performed less well on examinations about drug abuse problems and patient management than on traditional medical board examinations. The best knowledge was of pharmacology of drug abuse, Alcoholics Anonymous, and treatment of delirium tremens. Students knew less about metabolic and biochemical areas, emergency-room treatment, and…
In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.
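The kind of dimensional-consistency check that SIGMA's background knowledge enables can be sketched in a few lines (the unit representation below is a hypothetical illustration, not SIGMA's actual mechanism): units are carried as dimension-exponent maps, multiplication adds exponents, and addition is only valid between identical units.

```python
def mul_units(u, v):
    """Multiply two quantities' units: exponents add; zero exponents drop out."""
    out = dict(u)
    for dim, exp in v.items():
        out[dim] = out.get(dim, 0) + exp
        if out[dim] == 0:
            del out[dim]
    return out

def add_check(u, v):
    """Addition is only dimensionally valid between identical units."""
    if u != v:
        raise ValueError(f"dimension mismatch: {u} vs {v}")
    return u

# Hypothetical ecosystem-model quantities: a carbon flux (gC m^-2 day^-1)
# multiplied by an area (m^2) yields a total flux (gC day^-1).
flux = {"gC": 1, "m": -2, "day": -1}
area = {"m": 2}
total = mul_units(flux, area)            # m^-2 and m^2 cancel
add_check(total, {"gC": 1, "day": -1})   # consistent, so no error is raised
print(total)
```

A model builder mixing, say, a daily flux with an annual one would trip the `add_check` guard instead of silently producing a wrong number.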
The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is in the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on object-oriented language (Flavors).
This project will provide descriptive and analytical data regarding the flow of STI at the individual, organizational, national, and international levels. It will examine both the channels used to communicate information and the social system of the aeros...
The Lunar Mapping and Modeling Project (LMMP) is managing the development of a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities. LMMP will utilize data predominately from the Lunar Reconnaissance Orbiter, but also historical and international lunar mission data (e.g. Lunar Prospector, Clementine, Apollo, Lunar Orbiter, Kaguya, and Chandrayaan-1) as available and appropriate. LMMP will provide such products as image mosaics, DEMs, hazard assessment maps, temperature maps, lighting maps and models, gravity models, and resource maps. We are working closely with the LRO team to prevent duplication of efforts and ensure the highest quality data products. A beta version of the LMMP software was released for limited distribution in December 2009, with the public release of version 1 expected in the Fall of 2010.
The Army-NASA Aircrew Aircraft Integration program is supporting a joint project to build a visibility computer-aided design (CAD) tool. CAD has become an essential tool in modern engineering applications. CAD tools are used to create engineering drawings and to evaluate potential designs before they are physically realized. The visibility CAD tool will provide the design engineer with a tool to aid in the location and specification of windows, displays, and controls in crewstations. In an aircraft cockpit the location of instruments and the emissive and reflective characteristics of the surfaces must be determined to assure adequate aircrew performance. The visibility CAD tool will allow the designer to ask and answer many of these questions in the context of a three-dimensional graphical representation of the cockpit. The graphic representation of the cockpit is a geometrically valid model of the cockpit design. A graphic model of a pilot, called the pilot manikin, can be placed naturalistically in the cockpit model. The visibility tool has the capability of mapping the cockpit surfaces and other objects modeled in this graphic design space onto the simulated pilot's retinas for a given visual fixation.
Larimer, James; Arditi, Aries; Bergen, James; Badler, Norman
Knowledge sharing is important because individual knowledge is not transformed into organizational knowledge until it is shared. The conceptual model presents how social factors create the conditions for effective knowledge sharing. It illustrates how three dimensions of social capital impact with each other and with knowledge sharing. Social…
This paper addresses knowledge management assumptions and development visions in the following types of organisations: organic product-focused and organic service-focused organisations, mechanistic bureaucratic and mechanistic product-focused organisations that represent different models of value creation. These types of organisations are identified and examined in relation to the changing knowledge management context of the transition economy in Estonia. Knowledge management priorities assessed
In software process improvement, accumulating and analyzing the historical data from past projects are essential work. However, setting up a systematic and logical measurement and analysis program is very difficult. Many mature organizations have their own measurement program for the process improvement. However, most of them are based on the statistical metrics-driven approach that consequently limits logical reasoning on the
This study examined connections between science literacy and writing. Science e-mails were written as content-oriented professional development materials for K-8 teachers. E-mail drafts underwent multiple revisions. The study data included drafts, final e-mails, and feedback from the supervising scientist and the e-mails' teacher audience. The analyses, informed by Bereiter & Scardamalia's knowledge-transforming process (1987), Schindler's audience theories (2001), and Johnson
This report describes the principal findings and recommendations of a 2-year Rand research project on machine-aided knowledge acquisition and discusses the transfer of expertise from humans to machines, as well as the functions of planning, debugging, kno...
Under the conditions of microgravity, astronauts lose bone mass at a rate of 1% to 2% a month, particularly in the lower extremities such as the proximal femur [1]. The most commonly used countermeasure against bone loss has been prescribed exercise [2]. However, current exercise countermeasures do not completely eliminate bone loss in long duration, 4 to 6 months, spaceflight [3,4], leaving the astronaut susceptible to early onset osteoporosis and a greater risk of fracture later in their lives. The introduction of the Advanced Resistive Exercise Device, coupled with improved nutrition, has further minimized the 4 to 6 month bone loss, but further work is needed to implement optimal exercise prescriptions [5]. In this light, NASA's Digital Astronaut Project (DAP) is working with NASA physiologists to implement well-validated computational models that can help understand the mechanisms of bone demineralization in microgravity, and enhance exercise countermeasure development.
Pennline, James A.; Mulugeta, Lealem; Lewandowski, Beth E.; Thompson, William K.; Sibonga, Jean D.
This paper presents a projection regression model (PRM) to assess the relationship between a multivariate phenotype and a set of covariates, such as a genetic marker, age and gender. In the existing literature, a standard statistical approach to this problem is to fit a multivariate linear model to the multivariate phenotype and then use Hotelling’s T2 to test hypotheses of interest. An alternative approach is to fit a simple linear model and test hypotheses for each individual phenotype and then correct for multiplicity. However, even when the dimension of the multivariate phenotype is relatively small, say 5, such standard approaches can suffer from the issue of low statistical power in detecting the association between the multivariate phenotype and the covariates. The PRM generalizes a statistical method based on the principal component of heritability for association analysis in genetic studies of complex multivariate phenotypes. The key components of the PRM include an estimation procedure for extracting several principal directions of multivariate phenotypes relating to covariates and a test procedure based on wild-bootstrap method for testing for the association between the weighted multivariate phenotype and explanatory variables. Simulation studies and an imaging genetic dataset are used to examine the finite sample performance of the PRM.
Lin, Ja-an; Zhu, Hongtu; Knickmeyer, Rebecca; Styner, Martin; Gilmore, John; Ibrahim, Joseph G.
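The standard multivariate baseline the abstract contrasts the PRM with, fitting a multivariate linear model and testing with Hotelling's T², can be sketched in a few lines of NumPy (fully synthetic data; this illustrates the baseline test, not the PRM itself):

```python
import numpy as np

def hotelling_t2_two_sample(X, Y):
    """Two-sample Hotelling's T^2 statistic with its F approximation.

    X, Y: (n1, p) and (n2, p) arrays of multivariate observations.
    Returns (T2, F, df1, df2).
    """
    n1, p = X.shape
    n2, _ = Y.shape
    mean_diff = X.mean(axis=0) - Y.mean(axis=0)
    # Pooled sample covariance
    S = ((n1 - 1) * np.cov(X, rowvar=False)
         + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    T2 = (n1 * n2) / (n1 + n2) * mean_diff @ np.linalg.solve(S, mean_diff)
    # F transform: F = (n1+n2-p-1) / (p*(n1+n2-2)) * T2
    df1, df2 = p, n1 + n2 - p - 1
    F = df2 / (p * (n1 + n2 - 2)) * T2
    return T2, F, df1, df2

# Two synthetic groups of a 5-dimensional phenotype; the second has a shifted mean
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(40, 5))
Y = rng.normal(1.0, 1.0, size=(40, 5))
T2, F, df1, df2 = hotelling_t2_two_sample(X, Y)
print(f"T2={T2:.1f}, F={F:.1f} on ({df1}, {df2}) df")
```

The PRM instead projects the phenotype onto data-driven directions before testing, which is where the power gain over this baseline comes from.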
Data-driven models based on the methods of machine learning have proven to be accurate tools in predicting various natural phenomena. Their accuracy, however, can be increased if several learning models are combined. A modular model is comprised of a set of specialized models each of which is responsible for particular sub-processes or situations, and may be trained on a subset
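The modular idea can be sketched with two specialists and a hard gate (the piecewise process and the gating rule are invented for illustration): each specialist is fit only on its own regime of the data, and prediction routes each input to the responsible specialist.

```python
import numpy as np

# A piecewise process: two different linear regimes depending on the sign of x
rng = np.random.default_rng(3)
x = rng.uniform(-5, 5, 300)
y = np.where(x < 0, 2.0 * x + 1.0, -0.5 * x + 1.0)

def fit_line(xs, ys):
    """Least-squares slope and intercept for one specialist model."""
    A = np.column_stack([xs, np.ones_like(xs)])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

left = fit_line(x[x < 0], y[x < 0])     # specialist for the x < 0 regime
right = fit_line(x[x >= 0], y[x >= 0])  # specialist for the x >= 0 regime

def modular_predict(x_new):
    """Hard gate: route the input to the specialist for its regime."""
    m, b = left if x_new < 0 else right
    return m * x_new + b

# Each specialist recovers its regime exactly on this noise-free data
print(modular_predict(-2.0), modular_predict(2.0))  # close to -3.0 and 0.0
```

A single global linear model would fit neither regime well; the modular split is what makes each sub-model accurate.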
The Triple Helix model of university-industry-government relations can be generalized from a neo-institutional model of networks of relations to a neo-evolutionary model of how three selection environments operate upon one another. Two selection mechanisms operating upon each other can mutually shape a trajectory, while three selection environments can be expected to generate a regime. The neo-evolutionary model enables us to
We argue that learning about the nature and utility of scientific models and engaging in the process of creating and testing models should be a central focus of science education. To realize this vision, we created and evaluated the Model-Enhanced ThinkerTools (METT) Curriculum, which is an inquiry-oriented physics curriculum for middle school…
Relating knowledge management (KM) case studies in various organizational contexts to existing theoretical constructs of learning organizations, a new model, the MIKS (Member Integrated Knowledge System) Model is proposed to include the role of the individual in the process. Their degree of motivation as well as communication and learning…
The sketchy modeling environment (SKME) is a software tool for handling graphically modeled computer texts. It is a multi-representational environment with heterogeneous file collections. In this paper, a general formal definition of the basic notion of a file project description is given, and a special feature of SKME, file project support, is introduced. The concept of the file project originates from so
This paper suggests a model of embodied environmental education grounded in participant interviews, fieldwork, scholarly literature, and the author's own embodied relationship with the natural world. In this article, embodiment refers to a process that stems from Indigenous Knowledges and theatre. Although Indigenous Knowledges and theatre…
This research presents a computer model called EUREKA that begins with novice-like strategies and knowledge organizations for solving physics word problems and acquires features of knowledge organizations and basic approaches that characterize experts in this domain. EUREKA learns a highly interrelated network of problem-type schemes with associated solution methodologies. Initially, superficial features of the problem statement form the
Few legal knowledge based systems have been constructed which provide numerical advice. None have been built in discretionary domains. Our research, directed towards the domains of sentencing and family law property division, has led to the development of three distinct forms of judicial discretion. To model these different discretionary domains we use diverse artificial intelligence tools including case-based reasoning and knowledge
Yaakov HaCohen-Kerner; Uri J. Schild; John Zeleznikow
Proposes a model of knowledge-based information retrieval (KBIR) that is based on a hierarchical concept graph (HCG) which shows relationships between index terms and constitutes a hierarchical thesaurus as a knowledge base. Conceptual distance between a query and an object is discussed and the use of Boolean operators is described. (25…
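Conceptual distance over a hierarchical thesaurus can be sketched as edge counting up to the lowest common ancestor (the toy taxonomy below is illustrative, not the paper's HCG):

```python
# Hypothetical mini-thesaurus stored as child -> parent links (a hierarchy)
PARENT = {
    "dog": "mammal", "cat": "mammal",
    "mammal": "animal", "bird": "animal",
    "animal": "entity", "plant": "entity",
}

def ancestors(term):
    """Chain of ancestors from the term up to the root, term included."""
    chain = [term]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

def conceptual_distance(a, b):
    """Edge-counting distance: hops from a and b to their lowest common ancestor."""
    anc_a = ancestors(a)
    anc_b = ancestors(b)
    set_b = set(anc_b)
    for hops_a, node in enumerate(anc_a):
        if node in set_b:
            return hops_a + anc_b.index(node)
    raise ValueError("no common ancestor")

print(conceptual_distance("dog", "cat"))    # -> 2 (both one hop from "mammal")
print(conceptual_distance("dog", "plant"))  # -> 4 (meet only at "entity")
```

In a KBIR system, a small distance between a query term and an index term signals conceptual closeness even when the terms do not match literally.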
The basic premise of this paper is the fact that science has become a major industry: the knowledge industry. The paper throws some light on the reasons for the transformation of science from a limited, constrained and marginal craft into a major industry. It, then, presents a productivist industrial model of knowledge production, which shows its…
Using structural equation modeling analysis, this study examined the contribution of vocabulary and grammatical knowledge to second language reading comprehension among 190 advanced Chinese English as a foreign language learners. Vocabulary knowledge was measured in both breadth (Vocabulary Levels Test) and depth (Word Associates Test);…
Purpose – This research paper attempts to address the strategic challenges of developing knowledge-based innovation (KBI) in China through the analysis of the triple helix (TH) innovation networks between university, government and industry in China. In so doing, the TH model is adopted as an analytical framework to investigate the format and operations of knowledge networks within university, government and
As one of the main bodies of research and development, Swedish universities, who rank top worldwide in getting public funding, should highlight their responsibility in transferring knowledge into productivity in the construction of national innovation system within the new organizational field ``knowledge-based economy''. Based on the framework of Triple Helix Model, the paper describes and evaluates the dynamic mechanism of
Scherngell T. and Hu Y. Collaborative knowledge production in China: regional evidence from a gravity model approach, Regional Studies. This study investigates collaborative knowledge production in China from a regional perspective. The objective is to illustrate spatial patterns of research collaborations between thirty-one Chinese regions, and to estimate the impact of geographical, technological, and economic factors on the variation of
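A gravity-model estimation of the kind the study applies can be sketched as a log-linear regression (data and coefficients are fully synthetic, chosen for the demo rather than taken from the study):

```python
import numpy as np

# Hypothetical data: collaboration counts between region pairs, with
# origin/destination "mass" (e.g., R&D staff) and geographical distance.
rng = np.random.default_rng(7)
n_pairs = 200
mass_i = rng.uniform(10, 1000, n_pairs)
mass_j = rng.uniform(10, 1000, n_pairs)
dist = rng.uniform(50, 2000, n_pairs)
# Simulated flows following a gravity law with known exponents (0.8, 0.7, -0.6)
flows = 0.5 * mass_i**0.8 * mass_j**0.7 * dist**-0.6 * rng.lognormal(0, 0.1, n_pairs)

# Log-linearized gravity model: ln F = b0 + b1 ln Mi + b2 ln Mj + b3 ln d
X = np.column_stack([np.ones(n_pairs), np.log(mass_i), np.log(mass_j), np.log(dist)])
beta, *_ = np.linalg.lstsq(X, np.log(flows), rcond=None)
print(np.round(beta[1:], 2))  # should land near (0.8, 0.7, -0.6)
```

The fitted signs carry the interpretation: positive mass elasticities mean larger regions collaborate more, and the negative distance coefficient quantifies how geography dampens knowledge flows.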
Two experiments involving 125 college and graduate students examined the interrelationship of subject-matter knowledge, interest, and recall in the field of human immunology and biology and assessed cross-domain performance in physics. Patterns of knowledge, interest, and performance fit well with the premises of the Model of Domain Learning. (SLD)
MIS curricula in business schools are challenged to provide MIS courses that give students a strong practical understanding of the basic technologies, while also providing enough hands-on experience to solve real life problems. As an experimental capstone MIS course, the authors developed a cluster-computing project to expose business students to…
Kitchens, Fred L.; Sharma, Sushil K.; Harris, Thomas
This activity discusses a two-day unit on ecology implemented during the summer of 2004 using the project-based science instructional (PBSI) approach. Through collaborative fieldwork, group discussions, presentations, and reflections, students planned, implemented, and reported their own scientific investigations on the environmental health of their local park in the borough of Queens, New York City. Students' questions included a wide range
With funding from NSF, the Prime the Pipeline Project (P³) is responding to the need to strengthen the science, technology, engineering, and mathematics (STEM) pipeline from high school to college by developing and evaluating the scientific village strategy and the culture it creates. The scientific village, a community of high school…
How big is your project world? Is it big enough to contain other cultures, headquarters, hierarchies, and weird harpoon-like guns? Sure it is. The great American poet Walt Whitman said it best: 'I am large/I contain multitudes.' And so must you, Mr. and M...
Federal involvement in stimulating economic growth through the development and application of technology policy is currently the subject of serious debate. A recession and the recognition that an internationally competitive economy is a prerequisite for the attainment of national goals have fostered a number of technology policy initiatives aimed at improving the economic competitiveness of American industry. This paper suggests that the successful implementation of U.S. technology policy will require the adoption of a knowledge diffusion model, the development of user oriented information products and services, and a more 'activist' approach on the part of sci/tech librarians in the provision of scientific and technical information (STI). These changes will have a dramatic impact on the sci/tech library of the future and the preparation of sci/tech librarians.
Pinelli, Thomas E.; Barclay, Rebecca O.; Hannah, Stan; Lawrence, Barbara; Kennedy, John M.
U.S. aerospace engineering faculty and students were surveyed as part of the NASA/DoD Aerospace Knowledge Diffusion Research Project. Faculty and students were viewed as information processors within a conceptual framework of information seeking behavior. Questionn...
M. P. Holland, T. E. Pinelli, R. O. Barclay, J. M. Kennedy
Decisions made on new build coal-fired plants are driven by several factors - emissions, fuel logistics and electric transmission access all provide constraints. The crucial economic decision whether to build supercritical or subcritical units often depends on assumptions concerning the reliability/availability of each technology, the cost of non-fuel operations including maintenance, the generation efficiencies and the potential for emissions credits at some future value. Modeling the influence of these key factors requires analysis and documentation to assure the assets actually meet the projected financial performance. This article addresses some of the issues related to the trade-offs that have the potential to be driven by the supercritical/subcritical decision. Solomon Associates has been collecting cost, generation and reliability data on coal-fired power generation assets for approximately 10 years using a strict methodology and taxonomy to categorize and compare actual plant operations data. This database provides validated information not only on performance, but also on alternative performance scenarios, which can provide useful insights in the pro forma financial analysis and models of new plants. 1 ref., 1 fig., 3 tabs.
NASA's Superfluid Helium On-Orbit Transfer (SHOOT) project is a Shuttle-based experiment designed to acquire data on the properties of superfluid helium in micro-gravity. Aft Flight Deck Computer Software for the SHOOT experiment is comprised of several monitoring programs which give the astronaut crew visibility into SHOOT systems and a rule based system which will provide process control, diagnosis and error recovery for a helium transfer without ground intervention. Given present Shuttle manifests, this software will become the first expert system to be used in space. The SHOOT Command and Monitoring System (CMS) software will provide a near real time highly interactive interface for the SHOOT principal investigator to control the experiment and to analyze and display its telemetry. The CMS software is targeted for all phases of the SHOOT project: hardware development, pre-flight pad servicing, in-flight operations, and post-flight data analysis.
Castellano, Timothy P.; Raymond, Eric A.; Shapiro, Jeff C.; Robinson, Frank A.; Rosenthal, Donald A.
In this study, four road delivery project models are analyzed by grey relational evaluation. The four models are design-bid-build (DBB), design-build (DB), construction management (CM) and design-build-maintenance (DBM). Evaluating road project delivery models is difficult because the projects differ from road to road, state to state and country to country. Thus, the evaluation data on project delivery systems are poor and lacking. Grey theory is an effective mathematical method: a multidisciplinary and generic theory for dealing with systems characterized by poor or lacking information. Therefore, grey relational analysis and a grey model are employed to compare the efficiency of the four road project delivery models. According to the results, DBM is the best model, DBB is the worst, and DB is better than CM. The results may help public-sector agencies select an adequate model for proceeding with road construction projects.
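Grey relational grading of the four delivery models can be sketched as follows (the criteria scores are invented; only the method, normalization, grey relational coefficients, and grade averaging, mirrors the approach):

```python
import numpy as np

def grey_relational_grades(data, zeta=0.5):
    """Grey relational analysis with larger-is-better criteria.

    data: (alternatives, criteria) matrix of raw scores.
    Returns one grade per alternative; higher means closer to the ideal.
    """
    X = np.asarray(data, dtype=float)
    # Normalize each criterion to [0, 1]
    lo, hi = X.min(axis=0), X.max(axis=0)
    norm = (X - lo) / (hi - lo)
    # Deviation from the reference (ideal) series of all ones
    delta = np.abs(1.0 - norm)
    dmin, dmax = delta.min(), delta.max()
    # Grey relational coefficients with distinguishing coefficient zeta
    coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)
    return coeff.mean(axis=1)

# Hypothetical scores for the four delivery models on three criteria
scores = [
    [6, 5, 6],  # DBB
    [8, 7, 8],  # DB
    [7, 7, 7],  # CM
    [9, 8, 9],  # DBM
]
grades = grey_relational_grades(scores)
labels = ["DBB", "DB", "CM", "DBM"]
ranking = [labels[i] for i in np.argsort(grades)[::-1]]
print(ranking)  # DBM first, DBB last under these made-up scores
```

With these illustrative scores the ranking reproduces the paper's ordering (DBM > DB > CM > DBB); real inputs would of course come from project data.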
The number of competing brands changes with a new product's entry. New product introduction is endemic among consumer packaged goods firms and is an integral component of their marketing strategy. As a new product's entry affects markets, there is a pressing need to develop market response models that can adapt to such changes. In this paper, we develop a dynamic model that captures the underlying evolution of the buying behavior associated with the new product. This extends an application of a dynamic linear model, which is used by a number of time series analyses, by allowing the observed dimension to change at some point in time. Our model copes with the problems that dynamic environments entail: changes in parameters over time and changes in the observed dimension. We formulate the model within the framework of a state space model. We estimate the model using a modified Kalman filter/fixed-interval smoother. We find that a new product's entry (1) decreases brand differentiation for existing brands, as indicated by a decreasing difference between cross-price elasticities; (2) decreases commodity power for existing brands, as indicated by a decreasing trend; and (3) decreases the effect of discounts for existing brands, as indicated by a decrease in the magnitude of own-brand price elasticities. The proposed framework is directly applicable to other fields in which the observed dimension might change, such as economics, bioinformatics, and so forth.
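The filtering machinery underlying such a state space model can be sketched for the simplest dynamic linear model, a local level (the fixed-interval smoother and the dimension-change extension are omitted; all parameters here are illustrative):

```python
import numpy as np

def kalman_filter_local_level(y, q=0.1, r=1.0, m0=0.0, c0=10.0):
    """Kalman filter for a local-level DLM: y_t = mu_t + v_t, mu_t = mu_{t-1} + w_t.

    q: state noise variance, r: observation noise variance,
    m0/c0: prior mean and variance of the initial state.
    Returns the filtered state means.
    """
    means = []
    m, c = m0, c0
    for obs in y:
        # Predict step: the state is a random walk
        a, p = m, c + q
        # Update step: blend prediction and observation via the Kalman gain
        k = p / (p + r)
        m = a + k * (obs - a)
        c = (1 - k) * p
        means.append(m)
    return np.array(means)

# Synthetic data: a slowly drifting level observed through noise
rng = np.random.default_rng(1)
true_level = np.cumsum(rng.normal(0, 0.3, 50)) + 5.0
y = true_level + rng.normal(0, 1.0, 50)
filtered = kalman_filter_local_level(y, q=0.09, r=1.0)
print(f"filter MSE {np.mean((filtered - true_level) ** 2):.2f} "
      f"vs raw MSE {np.mean((y - true_level) ** 2):.2f}")
```

The paper's model adds time-varying regression parameters and an observation vector whose dimension changes at the entry date, but the predict/update recursion is the same.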
Increasingly, people have to communicate knowledge across cultural and language boundaries. Even though recent technologies offer powerful communication facilities, people often feel confronted with barriers which clearly reduce their chances of making their interaction a success. Concrete evidence concerning such problems derives from a number of projects, where generated knowledge often results in dead-end products. In the Alpine Space project SILMAS (Sustainable Instruments for Lake Management in Alpine Space), in which both authors were involved, a special approach (syneris®) was taken to avoid this problem and to manage project knowledge in sustainable form. Under this approach, knowledge input and output are handled interactively: relevant knowledge can be developed continuously, and users can always access the latest state of expertise. Resort to the respective tools and procedures can also assist in closing knowledge gaps and in developing innovative responses to familiar or novel problems. This contribution intends to describe possible ways and means which have been found to increase the chances of success of knowledge communication across cultural boundaries. The process of trans-cultural discussions among experts to find a standardized solution is highlighted, as well as the problem of disseminating expert knowledge to various stakeholders. Finally, lessons learned are made accessible, where a main task lies in the creation of a toolbox of conflict-solving instruments, as a demonstrable result of the project and for the time thereafter. The interactive web-based toolbox enables lake managers to access best-practice instruments in standardized, explicit and cross-linguistic form.
The Brighton and Sussex Community-University Knowledge Exchange Programme's aim (BSCKE) was to fund projects that would lead to an 'exchange of knowledge' through collaborative work between the university and community groups, with a focus on the impact of marginalisation and issues of social exclusion. Our successful submission out- lined a key concern for community psychologists involving the seeming elevated rate
When modeling or redesigning a process, the knowledge-management perspective is seldom used. Using the knowledge categorization developed by van Heusden and Jorna, we propose a knowledge-management perspective to provide a strategy for modeling and redesigning a business process. As an illustration of our approach, we use hospital data on multidisciplinary patients. This specific group of patients requires the involvement of different specialisms for their medical treatment, which leads to greater effort in coordinating care for these patients. In order to increase care efficiency, knowledge that supports the reorganization of care for multidisciplinary patients should be provided. We use the above-mentioned knowledge-management perspective for creating new multidisciplinary units, in which different specialisms coordinate the treatment of specific groups of patients. PMID:16138541
In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and support for the creation of large knowledge models with large numbers of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing and criticizing of knowledge models. Scientists collaborating from remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do this at a special videoconferencing facility.
The aim of this paper is to argue for a number of statements about what is important for a client to do in order to improve quality in new infrastructure projects, with a focus on procurement and organizational issues. The paper synthesizes theoretical and empirical results concerning organizational performance, especially the role of the client for the quality of a project. The theoretical framework used is contract theory and transaction cost theory, where assumptions about rationality and self-interest are made and where incentive problems, asymmetric information, and moral hazard are central concepts. It is argued that choice of procurement type will not be a crucial factor. There is no procurement method that guarantees a better quality than another. We argue that given the right conditions all procurement methods can give good results, and given the wrong conditions, all of them can lead to low quality. What is crucial is how the client organization manages knowledge and the incentives for the members of the organization. This can be summarized as "organizational culture." One way to improve knowledge and create incentives is to use independent second opinions in a systematic way. PMID:24250274
This paper presents an overview of the modeling issues relevant to portraying the construction of actionable knowledge within an effects-based targeting process. At the heart of these issues is the need to consider the various political, military, economi...
This paper reports on a case study of applying the general-purpose and widely accepted methodology CommonKADS to a clinical practice guideline. CommonKADS is focused on obtaining a compact knowledge model. However, guidelines usually contain incomplete and ambiguous knowledge. So, the resulting knowledge model will be incomplete and we will need to detect what parts of the
In this paper, we will focus on the importance of management of knowledge held by the organization's human resources and gained through experience and practice. For this issue, we propose a model for software best practices' integration in a Knowledge Management System (KMS) of a Software Development Community of Practices (SDCoP). This model aims, on the one hand, to integrate
Objective Surgical Process Models (SPMs) are models of surgical interventions. The objectives of this study are to validate acquisition methods for Surgical Process Models and to assess the performance of different observer populations. Design The study examined 180 SPMs of simulated Functional Endoscopic Sinus Surgeries (FESS), recorded with observation software. About 150,000 single measurements in total were analyzed. Measurements Validation metrics were used for assessing the granularity, content accuracy, and temporal accuracy of structures of SPMs. Results Differences between live observations and video observations are not statistically significant. Observations performed by subjects with medical backgrounds gave better results than observations performed by subjects with technical backgrounds. Granularity was reconstructed correctly in 90% of cases, content in 91%, and the mean temporal accuracy was 1.8 s. Conclusion The study shows the validity of video as well as live observations for modeling Surgical Process Models. For routine use, the authors recommend live observations due to their flexibility and effectiveness. If high precision is needed or the SPM parameters are altered during the study, video observations are the preferable approach.
Neumuth, Thomas; Jannin, Pierre; Strauss, Gero; Meixensberger, Juergen; Burgert, Oliver
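The content- and temporal-accuracy figures quoted in the record above suggest simple agreement metrics between an observed work-step sequence and a reference protocol. The sketch below is an illustrative simplification (positional label matching and mean absolute start-time error); the actual study used richer validation metrics.

```python
def content_accuracy(reference, observed):
    """Fraction of work steps whose labels match the reference protocol,
    compared position by position (a deliberate simplification)."""
    matches = sum(r == o for r, o in zip(reference, observed))
    return matches / len(reference)


def mean_temporal_error(ref_times, obs_times):
    """Mean absolute deviation (in seconds) between the reference and
    observed start times of the work steps."""
    return sum(abs(r - o) for r, o in zip(ref_times, obs_times)) / len(ref_times)
```

With hypothetical step labels and timestamps, two out of three matching labels give an accuracy of 2/3, and start-time offsets of 1, 2, and 1 seconds give a mean temporal error of about 1.3 s.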
A new computational knowledge-based model for emulating human performance in decision making tasks is proposed. This model is mainly based on the knowledge acquired through past experience, the knowledge extracted from the environment and the relationships between the concepts that represent these two kinds of knowledge. The proposed model divides the decision making process into two phases. The first phase lies in the estimation of the decision outcomes using a net of concepts. In the second phase, the proposed model uses a value function to score each possible alternative. The design of the model focuses on some psychological and neurophysiological evidence from current research. In order to validate the model, it is compared with other widely used models that implement different theories of decision making under risk and uncertainty. The model comparison is centered on a well defined task, the Iowa Gambling Task, used in several psychological experiments. The comparison applies an evaluation method based on the optimization of each model in order to emulate human performance individually starting both the participant and the model from the same environmentally available information. The results show that the performance of the proposed model is quantitatively better than the other compared models. Besides, using relevant concepts extracted from interviews with the participants increases the performance of the proposed model. PMID:22698633
Iglesias, A; Del Castillo, M D; Serrano, J I; Oliva, J
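The two-phase structure described in the record above (estimate outcomes, then score alternatives with a value function) can be illustrated on an Iowa-Gambling-style task. The record gives no formulas, so everything below is an assumption: a running mean stands in for the concept-net outcome estimate, and the value function penalizes outcome variability via a hypothetical `risk_aversion` parameter.

```python
def expected_outcome(history):
    """Phase 1 stand-in for the concept net: estimate an alternative's
    outcome as the mean of the rewards observed so far."""
    return sum(history) / len(history) if history else 0.0


def value(history, risk_aversion=0.5):
    """Phase 2: score an alternative, penalizing outcome variability
    (standard deviation) by a risk-aversion factor."""
    mean = expected_outcome(history)
    if not history:
        return 0.0
    variance = sum((x - mean) ** 2 for x in history) / len(history)
    return mean - risk_aversion * variance ** 0.5


def choose(decks):
    """Pick the deck (alternative) with the highest value score."""
    return max(decks, key=lambda d: value(decks[d]))


# Hypothetical observed payoffs: deck A is high-reward but punishing,
# deck C is modest but safe -- the classic IGT contrast.
decks = {"A": [100, -1150, 100], "C": [50, 50, -25]}
```

Under these assumptions the risk-penalized scorer prefers the safe deck, qualitatively matching the behavior such models are built to emulate.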
Current science education reform efforts promote inquiry-based learning, a goal that requires appropriate tools and instructional approaches. This study investigated the influence of the beliefs and knowledge of four experienced secondary chemistry teachers in their use of new instructional software that generates models of atoms and molecules based on quantum mechanics. The software, which was developed through a National Science Foundation funded project, Quantum Science Across Disciplines (QSAD), was designed to promote inquiry learning. Qualitative research methods were used for this multiple case study. Data from surveys, interviews, and extended classroom observations revealed a close correlation between a teacher's model of the learner and his or her model of teaching. Combined models of learner and teacher had the greatest influence on their decisions about implementing QSAD software. Teachers who espoused a constructivist model of learning and related models of teaching used the software to promote student investigations and inductive approaches to learning. Other factors that appeared to support the use of inquiry methods included sufficient time for students to investigate phenomena, the extent of the teacher's pedagogical content knowledge, and the amount of training using QSAD software. The Views-On-Science-Technology-Society (VOSTS) instrument was used to compare the informants' beliefs about the epistemology of science to their classroom practices. Data related to the role of teachers' beliefs about scientific knowledge were inconclusive, and VOSTS results were inconsistent with the informants' stated beliefs. All four cases revealed that the teachers acted as agents of the school culture. In schools that promoted development of critical thinking, questioning, and self-direction in students, teachers were more likely to use a variety of instructional methods and emphasize construction of knowledge. 
These findings suggest that educational reform efforts must take into account teachers' belief systems and the prevailing ethos of the school.
This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…
This paper introduces a utilitarian confidence testing statistic called Risk Inclination Model (RIM) which indexes all possible confidence wagering combinations within the confines of a defined symmetrically point-balanced test environment. This paper presents the theoretical underpinnings, a formal derivation, a hypothetical application, and…
Description Logics with concrete domains present an approach to realize a general engineering workbench. They provide a representation language that enables us to describe in a uniform way devices, assemblies, and components along with their structure, constraints on attributes, and physical laws, as well as models of their correct and faulty behavior. Furthermore, sound and complete algorithms can be given for
The increase of technological complexity in surgery has created a need for novel man-machine interaction techniques. Specifically, context-aware systems which automatically adapt themselves to the current circumstances in the OR have great potential in this regard. To create such systems, models of surgical procedures are vital, as they allow analyzing the current situation and assessing the context. For this purpose, we have developed a Surgical Process Model based on Description Logics. It incorporates general medical background knowledge as well as intraoperatively observed situational knowledge. The representation consists of three parts: the Background Knowledge Model, the Preoperative Process Model and the Integrated Intraoperative Process Model. All models depend on each other and create a concise view on the surgery. As a proof of concept, we applied the system to a specific intervention, the laparoscopic distal pancreatectomy.
To facilitate the professional development of service providers, the Virginia project on geriatric alcohol abuse and alcoholism developed and used an informational booklet, brochure, and video in a "train the trainer" model. A core group received extensive training, and then trained colleagues in their local communities. Knowledge gains were documented among both trainers and trainees. Follow-up interviews with agency personnel revealed substantial impact on a broad spectrum of service systems and improvements in interagency coordination. Results are discussed in terms of the educational needs of professional service providers regarding the unique aspects of alcoholism and alcohol abuse in the older population. PMID:10800863
We used four different methods to determine the best means of assessing over 200 preservice elementary teachers' growth in knowledge of models and their use in K-8 classrooms while participating in the Science Capstone course that focused on the unifying themes of models in science. Each assessment method probed a different aspect of models (from…
Everett, Susan A.; Otto, Charlotte A.; Luera, Gail R.
This paper presents an approach to integrate OR experts' knowledge about the modelling of real-world problems into a Decision Support System (DSS). First, the problem of modelling is considered and a short survey of the structure of DSSs is given. Then a model of different phases of interaction between a DSS and its user is discussed. In the framework of
We modelled the distribution of soil properties across the agricultural zone on the Australian continent using data mining and knowledge discovery from databases (DM&KDD) tools. Piecewise linear tree models were built choosing from 19 climate variables, digital elevation model (DEM) and derived terrain attributes, four Landsat multi-spectral scanner (MSS) bands, land use and lithology maps as predictors of topsoil and
Elisabeth N. Bui; Brent L. Henderson; Karin Viergever
There is increasing interest in research in the field of information retrieval that aims to incorporate new dimensions, apart from text-based retrieval, into Web search engines. Geographical Information Retrieval (GIR) aims to index Web resources using a geographic context. The process of identifying the geographic context starts with the detection of different types of geographic references associated with the documents, such as the occurrence of place names. This paper presents a model for detecting geographic references in Web documents based on a set of heuristics. Moreover, new concepts and methods for disambiguating many places with the same name are addressed. Finally, a prototype called GeoSEn was built to validate the effectiveness of the proposed model.
Campelo, Cláudio Elizio Calazans; de Souza Baptista, Cláudio
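The detection and disambiguation steps described in the record above can be sketched with two common heuristics: gazetteer lookup over tokens and bigrams, then sense selection by context mention with a population fallback. The `GAZETTEER` contents and all function names below are invented for illustration; this is not GeoSEn's actual design.

```python
# Hypothetical mini-gazetteer: place name -> list of (region, population).
GAZETTEER = {
    "Paris": [("France", 2_100_000), ("Texas, USA", 25_000)],
    "Campina Grande": [("Brazil", 400_000)],
}


def detect_places(text):
    """Heuristic detection: any token (or two-token phrase) that matches
    a gazetteer entry is treated as a candidate place name."""
    tokens = text.split()
    found = []
    for i, tok in enumerate(tokens):
        for cand in (tok, " ".join(tokens[i:i + 2])):
            cand = cand.strip(".,;")
            if cand in GAZETTEER and cand not in found:
                found.append(cand)
    return found


def disambiguate(name, context=""):
    """For an ambiguous name, prefer a sense whose region appears in the
    surrounding text; otherwise fall back to the most populous sense."""
    senses = GAZETTEER[name]
    for region, _ in senses:
        if region.split(",")[0] in context:
            return region
    return max(senses, key=lambda s: s[1])[0]
```

Real systems layer many more heuristics (trigger words like "city of", coordinates, containment hierarchies) on top of this basic pattern.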
The Fitzroy Valley Numeracy Project (FVNP) was designed to improve numeracy outcomes for Indigenous students by developing a systematic, co-ordinated approach to teaching primary school mathematics. In this study, using early project data, we examine FVNP teachers' self-reported pedagogic content knowledge and classroom practice from initial…
A project cost threshold must be determined in a relatively short time by a project owner as a reference for evaluating competitive bids. In practice, such a decision is mainly based on subjective experience. This work presents a novel systematic procedure for assessing a reasonable project cost threshold. The proposed procedure involves a utility-based multi-criteria evaluation model and a cost
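One plausible reading of a utility-based multi-criteria evaluation model is a weighted-sum utility over project criteria, mapped onto the owner's low and high cost estimates. The sketch below is an assumption-laden illustration, not the paper's actual procedure; all criterion names and parameters are hypothetical.

```python
def utility_score(scores, weights):
    """Weighted-sum multi-criteria utility. Both arguments are dicts
    keyed by criterion; scores lie in [0, 1] and weights sum to 1."""
    return sum(weights[c] * scores[c] for c in weights)


def cost_threshold(low, high, scores, weights):
    """Map the aggregate utility (0..1) onto a cost threshold between
    the owner's low and high estimates for the project."""
    u = utility_score(scores, weights)
    return low + u * (high - low)
```

For example, with two criteria both scored at 0.5, the threshold lands at the midpoint of the estimate range.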
The 20 essays in this collection are based on a project undertaken by the California Conservation Corps (CCC) and the Model Literacy Project in 1983-85. (The goal of the project was to institute changes within the CCC to enhance the literacy of corpsmembers.) Essays describe innovative approaches to literacy education, analyze bureaucratic…
The paper deals with the identification of key construction management factors and their relative significance for the effectiveness of construction project management. The artificial neural network approach allows a model of construction project management effectiveness to be built and the key determinants to be identified from a host of possible management factors that influence project effectiveness in terms of budget
The Physiome Projects are a diverse set of scientifically independent projects addressing integrative systems physiology and biology, conducted by individual investigators and teams from different countries. The emphasis in these projects is on medically related physiology and pharmacology. They gather modeling work, information processing methods and tools, databases, and other resources and make them available to a large research community.
Clustering high dimensional data is a big challenge in data mining due to the curse of dimensionality. To solve this problem, projective clustering has been defined as an extension of traditional clustering that seeks to find projected clusters in subsets of dimensions of a data space. In this paper, the problem of modeling projected clusters is first discussed, and an
A new synthetic knowledge representation model that integrates the attribute grammar model with the semantic network model was presented. The model mainly uses symbols of attribute grammar to establish a set of syntax and semantic rules suitable for a semantic network. Based on the model, the paper introduces a formal method defining data flow diagrams (DFD) and also simply explains
In the "i2010 eGovernment Action Plan" it is stated that: "Member States have committed themselves to inclusive eGovernment objectives to ensure that by 2010 all citizens [...] become major beneficiaries of eGovernment, and European public administrations deliver public information and services that are more easily accessible and increasingly trusted by the public, through innovative use of ICT, increasing awareness of the benefits of eGovernment and improved skills and support for all users" (Commission of the European Communities 2006). For example, in the latest study on e-Government in Switzerland conducted by the University of St. Gallen, it was stated for the first time that measures for e-Government quality improvement are change (42% of the Swiss cantons, 19% of the Swiss municipalities) and benchmarking (business) activities/processes (41% of the Swiss cantons, 50% of the Swiss municipalities). But in the same study, design and IT-supported processes are considered a huge challenge (Schedler et al. 2007a, b). Thus, what Becker et al. already described still holds true: Although the benefit of having formal models of business processes is well known in public administrations, too few processes have been modelled and fewer still have been automated (Becker et al. 2003).
Feldkamp, Daniela; Hinkelmann, Knut; Thönssen, Barbara
The IPCC AR4 not only provided conclusive evidence about anticipated global warming at century scales, but also indicated with a high level of certainty that the warming is caused by anthropogenic emissions. However, an outstanding knowledge-gap is to develop credible projections of climate extremes and their impacts. Climate extremes are defined in this context as extreme weather and hydrological events,
Engineers are an extraordinarily diverse group of professionals, but an attribute common to all engineers is their use of information. Engineering can be conceptualized as an information processing system that must deal with work-related uncertainty through patterns of technical communications. Throughout the process, data, information, and tacit knowledge are being acquired, produced, transferred, and utilized. While acknowledging that other models exist, we have chosen to view the information-seeking behavior of engineers within a conceptual framework of the engineer as an information processor. This article uses the chosen framework to discuss information-seeking behavior of engineers, reviewing selected literature and empirical studies from library and information science, management, communications, and sociology. The article concludes by proposing a research agenda designed to extend our current, limited knowledge of the way engineers process information.
Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.
In this article, the economic evaluation of information system projects using present value is analyzed based on triangular fuzzy numbers. Information system projects usually have numerous uncertainties and several conditions of risk that make their economic evaluation a challenging task. Each year, several information system projects are cancelled before completion as a result of budget overruns at a cost of several billions of dollars to industry. Although engineering economic analysis offers tools and techniques for evaluating risky projects, the tools are not enough to place information system projects on a safe budget/selection track. There is a need for an integrative economic analysis model that will account for the uncertainties in estimating project costs, benefits, and useful lives of uncertain and risky projects. In this study, we propose an approximate method of computing project present value using the concept of fuzzy modeling with special reference to information system projects. This proposed model has the potential of enhancing the project selection process by capturing a better economic picture of the project alternatives. The proposed methodology can also be used for other real-life projects with high degree of uncertainty and risk.
Omitaomu, Olufemi A [ORNL]; Badiru, Adedeji B [ORNL]
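The fuzzy present-value idea in the record above computes componentwise on triangular fuzzy numbers, since dividing by a crisp discount factor preserves the triangular shape. The sketch below assumes the common (pessimistic, most likely, optimistic) three-tuple convention; it illustrates the general technique, not the paper's specific model.

```python
def tfn_discount(tfn, rate, year):
    """Discount a triangular fuzzy cash flow (pessimistic, most likely,
    optimistic) componentwise by the crisp factor (1 + r)^t."""
    f = (1 + rate) ** year
    return tuple(x / f for x in tfn)


def tfn_add(a, b):
    """Componentwise addition of two triangular fuzzy numbers."""
    return tuple(x + y for x, y in zip(a, b))


def fuzzy_npv(initial_cost, cash_flows, rate):
    """Fuzzy present value: sum of discounted triangular yearly cash
    flows minus the (crisp) initial investment."""
    total = (-initial_cost,) * 3
    for year, tfn in enumerate(cash_flows, start=1):
        total = tfn_add(total, tfn_discount(tfn, rate, year))
    return total
```

A single year-one cash flow of (55, 110, 165) at a 10% rate discounts to (50, 100, 150); against a crisp initial cost of 100 the fuzzy NPV is (-50, 0, 50), capturing the project's downside and upside in one object.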
The Navy's Next Generation Computer Resources (NGCR) program set up a Project Support Environment Standards Working Group (PSESWG) to help in the task of establishing interface standards that will allow the U.S. Navy to more easily and effectively assemble software-intensive Project Support Environments (PSEs) from commercial sources. A major focus of PSESWG is the development of a service-based
Alan W. Brown; David J. Carney; Peter H. Feiler; Patricia A. Oberndorf; Marvin V. Zelkowitz
Evaluating the energy of a protein molecule is one of the most computationally costly operations in many protein structure modeling applications. In this paper, we present an efficient implementation of knowledge-based energy functions by taking advantage of the recent Graphics Processing Unit (GPU) architectures. We use DFIRE, a knowledge-based all-atom potential, as an example to demonstrate our GPU implementations on
Research in the computerization of Clinical Guidelines (CG) has often opposed document-based approaches to knowledge-based ones. In this paper, we suggest that both approaches can be used simultaneously to assess the contents of textual Clinical Guidelines. In this first experiment, we investigate the mapping between a document model, which has been marked up to structure its recommendations, and a knowledge structure
This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…
In this paper an estimation cost model for risk management projects, called SECOMO, is presented. This model helps managers reason about the cost and schedule implications of network security decisions that security teams may need to make. It aims to achieve several objectives including: (1) providing accurate cost and scheduling estimates for current security projects, and (2) providing a normative
Jihène Krichène; Noureddine Boudriga; Sihem Guemara El Fatmi
The SEG Advanced Modeling Project (SEAM) is a consortium being run by the Society of Exploration Geophysicists whose goal is to use numerical modeling to develop geophysical datasets that mimic those used for exploration and characterization of petroleum resources. Phase I of the project is underway and involves the calculation of a large 3D seismic exploration dataset in deepwater using
The U.S. aerospace industry is experiencing profound changes created by a combination of domestic actions and circumstances such as airline deregulation. Other changes result from external trends such as emerging foreign competition. These circumstances intensify the need to understand the production, transfer, and utilization of knowledge as a precursor to the rapid diffusion of technology. This article presents a conceptual framework for understanding the diffusion of aerospace knowledge. The framework focuses on the information channels and members of the social system associated with the aerospace knowledge diffusion process, placing particular emphasis on aerospace librarians as information intermediaries.
Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.
The United States aerospace industry is experiencing profound changes created by a combination of domestic actions and circumstances such as airline deregulation. Other changes result from external trends such as emerging foreign competition. These circumstances intensify the need to understand the production, transfer, and utilization of knowledge as a precursor to the rapid diffusion of technology. A conceptual framework is given for understanding the diffusion of aerospace knowledge. The framework focuses on the information channels and members of the social system associated with the aerospace knowledge diffusion process, placing particular emphasis on aerospace librarians as information intermediaries.
Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.
Lake Superior State University, a comprehensive rural public university with approximately 10% Native American enrollment, located in Michigan's eastern Upper Peninsula, U.S.A., has redesigned its undergraduate geology major by developing an entire curriculum around a project-centered integration of geoscience sub-disciplines. Our model, adapted from modern educational theory, advocates sub-discipline integration by implementing problem-based learning through coursework that develops students' intellectual skills and engages them in using complex reasoning in real-world contexts. Students in this new curriculum will actively discover how to learn about a new geologic province, what questions to ask in approaching problems, where and how to find answers, and how to apply knowledge to solving problems. To accomplish our goals, we redesigned our pedagogy for all courses by creating active learning environments including cooperative learning, jigsaw methodologies, debates, investigation-oriented laboratories, use of case studies, writing- and communication-intensive exercises, and research experiences. Fundamental sub-discipline concepts were identified by our national survey and are presented in the context of sequentially ordered problems that reflect increasing geological complexity. All courses above first year incorporate significant field experience. Our lower division courses include a two-semester sequence of physical and historical geology in which physical processes are discussed in the context of their historical extension and one semester of structure/tectonics and mineralogy/petrology. The lower division culminates with a three-week introductory field geology course. Our upper division courses include hydrologic systems, environmental systems, geochemical systems, tectonic systems, geophysical systems, clastic systems, carbonate systems, two seminar courses, and advanced field geology.
The two field courses, offered in different geologic provinces, provide the opportunity for students to gather the field data that are an integral part of upper division course projects. These courses require an in-depth understanding of geologic principles and promote sophisticated integration and application of sub-disciplinary concepts.
With the rising incidence of HIV/AIDS in China, nurses will increasingly be caring for patients with HIV/AIDS. Thus, it is necessary that they have enough knowledge to reduce the risk of occupationally acquired HIV infection and that their attitudes toward caring for HIV/AIDS patients improve. The objective of this study is to explore the relationship between student nurses' HIV/AIDS knowledge and their attitude using a structural equation model (SEM). A cross-sectional survey was conducted in January 2008 among 528 student nurses at the technical secondary school of the China Medical University. An SEM is proposed to determine the direction and magnitude of the interdependent effects between the latent factors. The SEM was built using LISREL version 8.5. The measurement properties of the latent factors underlying the questionnaire were assessed with a confirmatory factor analysis (CFA). Our results are as follows: HIV/AIDS knowledge and attitude may be measured by seven underlying constructs, namely, preventive knowledge, knowledge of transmission routes, specialty knowledge, knowledge of nontransmission routes, positive attitude toward HIV/AIDS, negative attitude toward HIV/AIDS, and occupational attitude. The SEM fits the data well. The interdependent relationships between these constructs identified the factors of preventive knowledge, specialty knowledge, and attitude toward HIV/AIDS as having both direct and indirect effects on occupational attitude. In conclusion, our results represent an initial effort to assess the relationship between student nurses' HIV/AIDS knowledge and their attitude toward the disease. CFA and SEM analysis have demonstrated their usefulness in evaluating multifactor complex constructs. PMID:20113151
The future is unknown and uncertain, and is confounded by reflexivity and beliefs based on different epistemologies, or knowledge systems. Indeed, because the future is yet to happen, there is no 'true' state of the future and as such, alternative models may collectively provide the most useful representation. Scenarios, known to be effective for offering contrasting exploratory models of how
We present mathematical learning models--predictions of student's knowledge vs amount of instruction--that are based on assumptions motivated by various theories of learning: tabula rasa, constructivist, and tutoring. These models predict the improvement (on the post-test) as a function of the pretest score due to intervening instruction and also…
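The learning-model families named in the record above can be illustrated with two toy functional forms for post-test score as a function of pretest score. The exact formulas below are our assumptions (the record does not give them): a tabula rasa model in which instruction adds a fixed increment regardless of prior knowledge, and a constructivist/normalized-gain form in which students close a fixed fraction of the gap to a perfect score.

```python
def tabula_rasa(pre, dose):
    """Tabula rasa: instruction adds a fixed increment independent of
    prior knowledge, capped at the maximum score of 1.0."""
    return min(1.0, pre + dose)


def constructivist(pre, gain):
    """Normalized-gain form: students close a fixed fraction `gain`
    of the gap between their pretest score and a perfect score."""
    return pre + gain * (1.0 - pre)
```

The two models make opposite predictions for high-pretest students: the additive model saturates at the ceiling, while the gain model predicts smaller absolute improvement the more a student already knows, which is the kind of contrast such model fits are designed to discriminate.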
Purpose – This paper aims to focus on the diffusion process of Knowledge management systems (KMSs). Specifically, to identify the sequence of stages of the process. Design/methodology/approach – The paper presents a six-stage model of the KMS diffusion process. It then provides an empirical test of the sequence of steps in the KMS diffusion process in Australia. Structural equation modelling
The methods of artificial intelligence are widely used in soft computing technology due to their remarkable prediction accuracy. However, artificial intelligence models are trained using large amounts of data obtained from the operation of the off-road vehicle. In contrast, fuzzy knowledge-based models are developed by using expert experience of traction in order to maintain the vehicle traction as required
NE-KAMS knowledge base will assist computational analysts, physics model developers, experimentalists, nuclear reactor designers, and federal regulators by: (1) Establishing accepted standards, requirements and best practices for V&V and UQ of computational models and simulations, (2) Establishing accepted standards and procedures for qualifying and classifying experimental and numerical benchmark data, (3) Providing readily accessible databases for nuclear energy related experimental
The present study investigated beginning kindergarten student teachers' mental models of attachment as a part of their practical knowledge about caregiving. Mental models of attachment (i.e. how students construct their own attachment-related childhood experiences and relationships from their current perspective) were assessed with an Adult…
This paper presents several methods to automatically detecting students' mental models in MetaTutor, an intelligent tutoring system that teaches students self-regulatory processes during learning of complex science topics. In particular, we focus on detecting students' mental models based on student-generated paragraphs during prior knowledge…
The paper describes a systematic approach which combines model identification techniques that are appropriately tailored to accommodate a priori knowledge of plant behaviour. The plant under consideration is a high temperature gas-fired multi-zone industrial furnace. The aim is to obtain a model which replicates the nonlinear temperature control loop to be used for improved control system design. The approach is
Benoit Vinsonneau; David P. Goodall; K. J. Burnham; D. Brie
The general operation of KATE, an artificial intelligence controller, is outlined. A shuttle environmental control system (ECS) demonstration system for KATE is explained. The knowledge base model for this system is derived. An experimental test procedure is given to verify parameters in the model.
In this article Donald Philip describes Knowledge Building, a pedagogy based on the way research organizations function. The global economy, Philip argues, is driving a shift from older, industrial models to the model of the business as a learning organization. The cognitive patterns of today's Net Generation students, formed by lifetime exposure…
In the automotive and the aerospace industry large amounts of expensively gathered experimental data are stored in huge databases. The real worth of these databases lies not only in easy data access, but also in the additional possibility of extracting the engineering knowledge implicitly contained in these data. As analytical modeling techniques in engineering are usually limited in model complexity,
The process of adopting research findings in the clinical setting is challenging, regardless of the area of practice. One strategy to facilitate this process is the use of knowledge brokering. Knowledge brokers (KBs) are individuals who work to bridge the gap between researchers and knowledge users. In the health care setting, KBs work closely with clinicians to facilitate enhanced uptake of research findings into clinical practice. They also work with researchers to ensure research findings are translatable and meaningful to clinical practice. The present article discusses a KB’s role in a respiratory care setting. Working closely with both researchers and clinicians, the KB has led teams in the process of conceptualizing, developing, testing, disseminating and evaluating several projects related to respiratory care, including projects related to mobility in critical care settings and acute exacerbations of chronic obstructive pulmonary disease; inspiratory muscle training; and the use of incentive spirometry in postsurgical populations. The KB role has provided an important communication link between researcher and knowledge user that has facilitated evidence-informed practice to improve patient care.
Background Bayesian Network (BN) is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov Chain Monte Carlo sampling algorithm is adopted, which samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
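The edge-reservoir idea above can be sketched in a few lines: each candidate edge is duplicated in proportion to its prior likelihood, so an MCMC proposal step that draws uniformly from the reservoir favors well-supported edges without forbidding any. The gene names, likelihood values, and scale factor below are illustrative assumptions, not values from the study.

```python
import random
from collections import Counter

def build_edge_reservoir(edge_likelihoods, scale=100):
    """Create a reservoir in which each candidate edge appears with a
    copy number proportional to its prior likelihood of linkage."""
    reservoir = []
    for edge, likelihood in edge_likelihoods.items():
        reservoir.extend([edge] * max(1, round(likelihood * scale)))
    return reservoir

def propose_edge(reservoir, rng):
    """Draw one candidate edge for an MCMC move; high-prior edges are
    proposed more often, biasing the search toward well-supported links."""
    return rng.choice(reservoir)

# Hypothetical prior likelihoods, e.g. from a Naive Bayes combination of
# PubMed co-citation and GO similarity evidence.
priors = {("geneA", "geneB"): 0.9, ("geneA", "geneC"): 0.1}
reservoir = build_edge_reservoir(priors)
rng = random.Random(0)
draws = Counter(propose_edge(reservoir, rng) for _ in range(1000))
```

Drawing uniformly from the reservoir makes the proposal distribution proportional to the prior, while still leaving low-prior edges reachable.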
This paper explores three factors-public policy, the Japanese (national) innovation system, and knowledge-that influence technological innovation in Japan. To establish a context for the paper, we examine Japanese culture and the U.S. and Japanese patent systems in the background section. A brief history of the Japanese aircraft industry as a source of knowledge and technology for other industries is presented. Japanese and U.S. alliances and linkages in three sectors-biotechnology, semiconductors, and large commercial aircraft (LCA)-and the importation, absorption, and diffusion of knowledge and technology are examined next. The paper closes with implications for diffusing knowledge and technology, U.S. public policy, and LCA.
Pinelli, Thomas E.; Barclay, Rebecca O.; Kotler, Mindy L.
This document directly reviews the current Defense Threat Reduction Agency (DTRA) PRDA contracts and describes how they can best be integrated with the DOE CTBT R&D Knowledge Base. Contract descriptions and numbers listed below are based on the DOE CTBT R&D Web Site - http://www.ctbt.rnd.doe.gov. More detailed information on the nature of each contract can be found through this web site. In general, the location related PRDA contracts provide products over a set of categories. These categories can be divided into five areas, namely: Contextual map information; Reference event data; Velocity models; Phase detection/picking algorithms; and Location techniques.
Schultz, C.A.; Bhattacharyya, J.; Flanaga, M.; Goldstein, P.; Myers, S.; Swenson, J.
The implications of a changing climate have a profound impact on human life, society, and policy making. The need for accurate climate prediction becomes increasingly important as we better understand these implications. Currently, the most widely used climate prediction relies on the synthesis of climate model simulations organized by the Coupled Model Intercomparison Project (CMIP); these simulations are ensemble-averaged to construct projections for the 21st century climate. However, a significant degree of bias and variability in the model simulations for the 20th century climate is well-known at both global and regional scales. Based on that insight, this study provides an alternative approach for constructing climate projections that incorporates knowledge of model bias. This approach is demonstrated to be a viable alternative which can be easily implemented by water resource managers for potentially more accurate projections. Tests of the new approach are provided on a global scale with an emphasis on semiarid regional studies for their particular vulnerability to water resource changes, using both the former CMIP Phase 3 (CMIP3) and current Phase 5 (CMIP5) model archives. This investigation is accompanied by a detailed analysis of the dynamical processes and water budget to understand the behaviors and sources of model biases. Sensitivity studies of selected CMIP5 models are also performed with an atmospheric component model by testing the relationship between climate change forcings and model simulated response. The information derived from each study is used to determine the progressive quality of coupled climate models in simulating the global water cycle by rigorously investigating sources of model bias related to the moisture budget. As such, the conclusions of this project are highly relevant to model development and potentially may be used to further improve climate projections.
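One plausible reading of "incorporating knowledge of model bias" is a simple delta correction: subtract each model's 20th-century mean bias (relative to observations) from its 21st-century projection before ensemble averaging. The sketch below illustrates that idea with invented numbers; it is not the study's actual method.

```python
def bias_corrected_ensemble(model_hist, model_future, observed_hist):
    """Subtract each model's historical mean bias from its projection,
    then average the bias-corrected projections across the ensemble."""
    obs_mean = sum(observed_hist) / len(observed_hist)
    corrected = []
    for hist, future in zip(model_hist, model_future):
        bias = sum(hist) / len(hist) - obs_mean
        corrected.append([x - bias for x in future])
    steps = len(corrected[0])
    return [sum(run[t] for run in corrected) / len(corrected)
            for t in range(steps)]

# Two hypothetical models: one 1-degree warm-biased, one 1-degree cold-biased.
hist = [[15.0, 15.2], [13.0, 13.2]]     # simulated 20th-century temperatures
future = [[16.0, 16.5], [14.0, 14.5]]   # raw 21st-century projections
obs = [14.0, 14.2]                      # observed 20th-century temperatures
projection = bias_corrected_ensemble(hist, future, obs)
```

After correction both toy models agree, so the ensemble mean reflects the projected change rather than the offsetting biases.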
Abstract Recently, enterprise systems have been extensively adopted to boost enterprise competitiveness. The development and implementation of enterprise systems is a knowledge intensive procedure, being related to enterprise processes and involving information, system and software engineering technologies. Consequently, knowledge management is required to enhance the effectiveness of enterprise system development and implementation, thus helping to increase industrial competitiveness. This study
Cheng-ter Ho; Yuh-min Chen; Yuh-jen Chen; Chin-bin Wang
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model predictions with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.
Background: Effective cancer pain management requires accurate knowledge, attitudes, and assessment skills. The purpose of this study was to obtain information about the knowledge and attitudes of nurses concerning cancer pain management, using the Health Belief Model (HBM) as a conceptual framework. Materials and Methods: The study was a descriptive survey and included 98 randomly selected nurses from Alzahra hospital, Isfahan, Iran. A self-administered questionnaire designed on the basis of the HBM was used to collect the data. Knowledge, attitudes, and HBM constructs regarding cancer pain were the main research variables. The obtained data were analyzed by SPSS (version 11.5) using descriptive statistics, independent t-test, and Pearson correlation at the significance level of α=0.05. Results: Ninety-eight nurses aged 38.7 ± 7.04 years were studied in this survey. On the 10 pain knowledge questions assessed, the mean percentage of correctly answered questions was 61.2 (SD=16.5), with a range of 30–100. There was a direct correlation between the knowledge and attitude of nurses and the HBM constructs, except for perceived barriers and perceived threat. Among the HBM constructs, the highest score was related to self-efficacy, with a mean score of 87.2 (SD=16.4). Conclusions: The findings support the concern of inadequate knowledge and attitudes in relation to cancer pain management. We believe that basic and continuing education programs may improve the knowledge level of nurses about pain management.
We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new
Chee Keng Yap; Henning Biermann; Aaron Hertzmann; Chen Li; Jon Meyer; Hsing-Kuo Pao; Salvatore Paxia
The consequences of an anticipated increase in atmospheric CO2 concentrations from the combustion of fossil fuels and other anthropogenic activities cannot be adequately evaluated without realistic estimates of future CO2 levels. Historic fossil fuel usage to the present, growing at a rate of 4.5% per year until 1973 and at a slower rate of 1.9% after 1973, is projected at
J. R. Trabalka; W. R. Emanuel; R. H. Gardner; D. E. Reichle; J. Edmonds; J. Reilly; B. Moore
As environmental concepts, the ozone layer and ozone hole are important to understand because they can profoundly influence our health. In this paper, we examined: (a) children's and adults' knowledge of the ozone layer and its depletion, and whether this knowledge increases with age; and (b) how the 'ozone layer' and 'ozone hole' might be structured as scientific concepts. We generated a standardized set of questions and used it to interview 24 kindergarten students, 48 Grade 3 students, 24 Grade 5 students, and 24 adults in university, in Canada. An analysis of participants' responses revealed that adults have more knowledge than children about the ozone layer and ozone hole, but both adults and children exhibit little knowledge about protecting themselves from the ozone hole. Moreover, only some participants exhibited 'mental models' in their conceptual understanding of the ozone layer and ozone hole. The implications of these results for health professionals, educators, and scientists are discussed.
In this collaborative research project between Pennsylvania State University, Colorado State University and Florida State University, we mainly focused on developing multi-resolution algorithms which are suitable to regional ocean modeling. We developed h...
We discuss causal structure learning based on linear structural equation models. Conventional learning methods most often assume Gaussianity and create many indistinguishable models. Therefore, in many cases it is difficult to obtain much information on the structure. Recently, a non-Gaussian learning method called LiNGAM has been proposed to identify the model structure without using prior knowledge on the structure. However,
We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan. But the method is generally applicable in the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach for texture mapping.
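The top-down propagation of statistical parameters described above might be sketched as follows: city-level statistics are perturbed to give per-block statistics, which in turn drive the individual buildings within each block. The distributions and parameter values are invented for illustration, not taken from the Manhattan model.

```python
import random

def generate_city(rng, n_blocks, mean_height, sigma):
    """Top-down parameter propagation: city-level height statistics are
    perturbed per block, and each block's statistics then drive the
    heights of its individual buildings."""
    city = []
    for _ in range(n_blocks):
        block_mean = rng.gauss(mean_height, sigma)       # block inherits city stats
        n_buildings = rng.randint(4, 8)
        buildings = [max(1.0, rng.gauss(block_mean, sigma / 2))
                     for _ in range(n_buildings)]        # buildings inherit block stats
        city.append(buildings)
    return city

city = generate_city(random.Random(1), n_blocks=10, mean_height=50.0, sigma=10.0)
```

The same pattern extends to further levels (district, block, building, facade), with each level drawing its parameters from the level above.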
The present paper examines the tie between knowledge and behavior in a noun generalization context. An experiment directly comparing noun generalizations of children at the same point in development in forced choice and yes/no tasks reveals task-specific differences in the way children's knowledge of nominal categories is brought to bear in a moment. To understand the cognitive system that produced these differences, the real-time decision processes in these tasks were instantiated in a dynamic field model. The model captures both qualitative and quantitative differences in performance across tasks and reveals constraints on the nature of children's accumulated knowledge. Additional simulations of developmental change in the yes/no task between 2 and 4 years of age illustrate how changes in children's representations translate into developmental changes in behavior. Together, the empirical data and model demonstrate the dynamic nature of knowledge and are consistent with the perspective that knowledge cannot be separated from the task-specific processes that create behavior in the moment.
Samuelson, Larissa K.; Schutte, Anne R.; Horst, Jessica S.
To quickly find relevant information from huge amounts of data is a very challenging issue for intelligence analysts. Most employ their prior domain knowledge to improve their process of finding relevant information. In this paper, we explore the influences of a user's prior domain knowledge on the effectiveness of an information seeking task by using seed user models in an enhanced information retrieval system. In our approach, a user model is created to capture a user's intent in an information seeking task. The captured user intent is then integrated with the attributes describing an information retrieval system in a decision theoretic framework. Our test bed consists of two benchmark collections from the information retrieval community: MEDLINE and CACM. We divide each query set from a collection into two subsets: training set and testing set. We use three different approaches to selecting the queries for the training set: (1) the queries generating large domain knowledge, (2) the queries relating to many other queries, and (3) a mixture of (1) and (2). Each seed user model is created by running our enhanced information retrieval system through such a training set. We assess the effects of having more domain knowledge, or more relevant domain knowledge, or a mixture of both on the effectiveness of a user in an information seeking task.
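In its simplest form, a seed user model of the kind described might be a bag of weighted terms accumulated from the training queries, later used to score documents. The toy sketch below is an illustrative stand-in, not the paper's decision-theoretic framework; the scoring function and weights are assumptions.

```python
from collections import defaultdict

def build_seed_user_model(training_queries):
    """Accumulate term weights from training queries as a crude proxy
    for a user's prior domain knowledge."""
    model = defaultdict(float)
    for query in training_queries:
        for term in query.lower().split():
            model[term] += 1.0
    return model

def score(doc, user_model):
    """Score a document by its overlap with the seed user model's terms."""
    return sum(user_model.get(term, 0.0) for term in doc.lower().split())

seed = build_seed_user_model(["heart disease", "heart surgery"])
```

Splitting the query set into training and testing subsets, as in the paper, would then mean building `seed` from the training queries and evaluating retrieval effectiveness on the held-out ones.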
The problem of acquiring a simple but sufficiently accurate model of a dynamic system is made more difficult when the dynamic system of interest is a multibody system composed of several components. A low-order system model may be created by reducing the order of the component models and making use of various available multibody dynamics programs to assemble them into a system model. The difficulty is in choosing the reduced-order component models to meet system-level requirements. The projection and assembly method, proposed originally by Eke, solves this difficulty by forming the full-order system model, performing model reduction at the system level using system-level requirements, and then projecting the desired modes onto the components for component-level model reduction. The projection and assembly method is analyzed to show the conditions under which the desired modes are captured exactly, to the numerical precision of the algorithm.
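The system-level reduction step can be illustrated with a generic modal truncation: solve the full system's eigenproblem, retain the lowest-frequency modes, and project the mass and stiffness matrices onto them. This is a textbook sketch of that one step under a toy 2-DOF example, not Eke's projection and assembly algorithm itself.

```python
import numpy as np

def reduce_by_projection(M, K, n_modes):
    """Project mass (M) and stiffness (K) matrices onto the
    n_modes lowest-frequency mode shapes of the full system."""
    # Generalized eigenproblem K v = w M v, solved via M^{-1} K
    w, V = np.linalg.eig(np.linalg.solve(M, K))
    order = np.argsort(w.real)
    T = V[:, order[:n_modes]].real            # retained mode shapes
    return T.T @ M @ T, T.T @ K @ T, T        # reduced M, reduced K, basis

# Toy 2-DOF spring-mass chain, reduced to its single lowest mode.
M_full = np.eye(2)
K_full = np.array([[2.0, -1.0], [-1.0, 2.0]])
M_red, K_red, modes = reduce_by_projection(M_full, K_full, 1)
```

The retained mode's natural frequency survives the reduction exactly (here the lowest eigenvalue is 1), which is the sense in which desired modes can be "captured exactly".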
This paper presents a hybrid neuro-fuzzy network/a priori knowledge model for temperature control of a gas water heater system. The hybrid model consists of a cascade connection of two blocks: an approximate first principles model (FPM) and an unknown block. The first principles model is constructed based on the balance equations of the system and on a priori knowledge. The unknown
Jose Antonio Vieira; Fernando Morgado Dias; Alexandre Manuel Mota
Knowledge utilization (KU) is an essential component of today's nursing practice and healthcare system. Despite advances in knowledge generation, the gap in knowledge transfer from research to practice continues. KU models have moved beyond factors affecting the individual nurse to a broader perspective that includes the practice environment and the socio-political context. This paper proposes one such theoretical model, the Joint Venture Model of Knowledge Utilization (JVMKU). Key components of the JVMKU that emerged from an extensive multidisciplinary review of the literature include leadership, emotional intelligence, person, message, empowered workplace and the socio-political environment. The model has a broad and practical application and is not specific to one type of KU or one population. This paper provides a description of the JVMKU, its development and suggested uses at both local and organizational levels. Nurses in both leadership and point-of-care positions will recognize the concepts identified and will be able to apply this model for KU in their own workplace for assessment of areas requiring strengthening and support. PMID:16761801
This paper explores three factors-public policy, the Japanese (national) innovation system, and knowledge-that influence technological innovation in Japan. To establish a context for the paper, we examine Japanese culture and the U.S. and Japanese patent ...
We implement projected Hartree-Fock in a shell-model basis and compare against exact numerical results from full-space diagonalization. We consider the accuracy of projected Hartree-Fock for the excited state spectrum, the moment of inertia, and also odd-even staggering.
In the process of information project implementation in manufacturing enterprises there are many risks; setting up a risk management mechanism is necessary to guarantee successful ERP implementation. Firstly, based on an analysis of ERP project implementation in China, the paper introduces the risk management methodology system and the related model, which includes three main steps,
Society relies on Earth system models (ESMs) to project future climate and carbon (C) cycle feedbacks. However, the soil C response to climate change is highly uncertain in these models and they omit key biogeochemical mechanisms. Specifically, the traditional approach in ESMs lacks direct microbial control over soil C dynamics. Thus, we tested a new model that explicitly represents microbial mechanisms of soil C cycling on the global scale. Compared with traditional models, the microbial model simulates soil C pools that more closely match contemporary observations. It also projects a much wider range of soil C responses to climate change over the twenty-first century. Global soils accumulate C if microbial growth efficiency declines with warming in the microbial model. If growth efficiency adapts to warming, the microbial model projects large soil C losses. By comparison, traditional models project modest soil C losses with global warming. Microbes also change the soil response to increased C inputs, as might occur with CO2 or nutrient fertilization. In the microbial model, microbes consume these additional inputs, whereas in traditional models, additional inputs lead to C storage. Our results indicate that ESMs should simulate microbial physiology to more accurately project climate change feedbacks.
Wieder, William R.; Bonan, Gordon B.; Allison, Steven D.
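The contrast drawn in the record above, traditional first-order soil C decay versus explicit microbial control, can be sketched with two toy difference equations. The Michaelis-Menten form and every parameter value below are illustrative assumptions, not the published model.

```python
def traditional_soil_c(c0, inputs, k, dt, steps):
    """Traditional ESM formulation: first-order decay, dC/dt = I - k*C."""
    c = c0
    for _ in range(steps):
        c += dt * (inputs - k * c)
    return c

def microbial_soil_c(c0, b0, inputs, vmax, km, eff, death, dt, steps):
    """Explicit microbial control: decomposition follows Michaelis-Menten
    kinetics scaled by microbial biomass B, with growth efficiency eff."""
    c, b = c0, b0
    for _ in range(steps):
        decomp = vmax * b * c / (km + c)
        c += dt * (inputs - decomp + death * b)   # dead microbes return to soil C
        b += dt * (eff * decomp - death * b)
    return c, b

c_trad = traditional_soil_c(0.0, inputs=1.0, k=0.1, dt=0.1, steps=10000)
c_mic, b_mic = microbial_soil_c(100.0, 2.0, inputs=1.0, vmax=0.5, km=100.0,
                                eff=0.3, death=0.05, dt=0.01, steps=1000)
```

In the first-order model, extra inputs simply raise the equilibrium pool I/k; in the microbial sketch they also feed microbial growth, which accelerates decomposition, the qualitative difference the abstract describes.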
The design of a variable structure model reference robust control without prior knowledge of the high-frequency gain sign is presented. Based on an appropriate monitoring function, a switching scheme for some control signals is proposed. It is shown that after a finite number of switchings, the tracking error converges to zero at least exponentially for plants with relative degree
Lin Yan; Liu Hsu; Ramon R. Costa; Fernando C. Lizarralde
Purpose: This paper aims to review the concepts and constructs of some common models and frameworks advocated for knowledge management (KM) and organisational learning (OL) in literature. It sets forth a critical enquiry towards the integration of KM and OL practices and their relationship with the concepts of the learning organisation (LO) and…
In this study, a comprehensive educational effectiveness model is tested in relation to student's civic knowledge. Multilevel analysis was applied on the dataset of the IEA Civic Education Study (CIVED; Torney-Purta, Lehmann, Oswald, & Schulz, 2001), which was conducted among junior secondary-school students (age 14), their schools, and their…
Isac, Maria Magdalena; Maslowski, Ralf; van der Werf, Greetje
This paper proposes a time series knowledge mining framework, designed to favor the synergy between subsequence time series clustering and predictive tools such as Hidden Markov Models. Many tasks in temporal data mining rely heavily on the choice of the representation scheme and the dissimilarity measure. The first part is concerned with a detailed representation taxonomy for numeric and symbolic time
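Two of the ingredients mentioned, extracting subsequences for clustering and choosing a representation scheme, can be sketched as follows. The threshold-based symbolization is a simplified stand-in for schemes such as SAX, and all parameters are arbitrary illustrations rather than the framework's actual choices.

```python
def subsequences(series, width, step):
    """Slide a window over the series to extract the subsequences that a
    clustering stage would then group into recurring temporal patterns."""
    return [series[i:i + width]
            for i in range(0, len(series) - width + 1, step)]

def symbolize(series, thresholds):
    """Map numeric values to symbols ('a', 'b', ...) by threshold rank,
    one simple representation for handing the series to discrete models
    such as Hidden Markov Models."""
    symbols = []
    for x in series:
        rank = sum(x > t for t in thresholds)   # how many thresholds x exceeds
        symbols.append(chr(ord("a") + rank))
    return "".join(symbols)
```

The choice of window width, step, and thresholds is exactly the representation/dissimilarity decision the abstract says such frameworks depend on.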
Emerging from a human constructivist view of learning and a punctuated model of conceptual change, these studies explored differences in the structural complexity and content validity of knowledge about prehistoric life depicted in concept maps by learners ranging in age from approximately 10 to 20 years. Study 1 (cross-age) explored the…
This paper presents analysis and implementation of Exact Model Knowledge (EMK) and Direct Adaptive control schemes on the 4th order ball and beam system in which the dynamics of the ball position and the dynamics of the beam angle are cascaded. For the controller analysis the error dynamic equations for ball position and the beam angle are derived for both
Turker Turker; H. Gorgun; E. Zergeroglu; G. Cansever
In the context of the global knowledge economy, the three major players--university, industry, and government--are becoming increasingly interdependent. As more intensified interactions and relationships of increasing complexity among the institutions evolve, the Triple Helix model attempts to describe not only interactions among university,…
This qualitative study investigates the interaction between local and imported knowledges in a specific case of transnational importation; the whole-sale importation of the American medical learning disabilities (LDs) model in Kuwait. A discourse analysis of the narratives of local educators at the only school for LDs in the country reveals a…
In this article, the author presents his comments on Hisham Ghassib's article entitled "Where Does Creativity Fit into the Productivist Industrial Model of Knowledge Production?" Ghassib (2010) describes historical transformations of science from a marginal and non-autonomous activity which had been constrained by traditions to a self-autonomous,…
Improved assessment of flow, sediment, and nutrient losses from watersheds with computer simulation models is needed in order to identify and control nonpoint source pollution. One model, currently under consideration by the U.S. Environmental Protection Agency for watershed assessments, is the Soil and Water Assessment Tool (SWAT). In this report, the authors describe an application of SWAT for the Sny
In the automotive and aerospace industries, large amounts of expensively gathered experimental data are stored in huge databases. The real worth of these databases lies not only in easy data access but also in the additional possibility of extracting the engineering knowledge implicitly contained in these data. As analytical modeling techniques in engineering are usually limited in model complexity, data-driven techniques gain more and more importance in this kind of modeling. Using additional engineering knowledge such as dimensional information, the data-driven modeling process has great potential for saving modeling as well as experimental effort and may therefore help to generate financial benefit. In a technical context, knowledge is often represented as numerical attribute-value pairs with corresponding measurement units. The database fields form the so-called relevance list, which is the only information needed to find the set of dimensionless parameters for the problem. The Pi-Theorem of Buckingham guarantees that for each complete relevance list a set of dimensionless groups exists. The number of these dimensionless parameters is less than the number of dimensional parameters in the dimensional formulation, so a dimensionality reduction can easily be accomplished. Additionally, dimensional analysis allows a hierarchical modeling technique, first creating models of subsystems and then aggregating them consecutively into the overall model using coupling numbers. This paper gives a brief introduction to dimensional analysis and then shows the procedure of hierarchical modeling, its implications, and its application to knowledge discovery in scientific data. The proposed method is illustrated in a simplified example from the aerospace industry.
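The dimensionality reduction guaranteed by Buckingham's Pi-Theorem follows directly from linear algebra: the number of independent dimensionless groups equals the number of variables minus the rank of the dimensional matrix. A sketch, using the classic pipe-flow relevance list as an example (the example is generic textbook material, not the paper's aerospace case study):

```python
import numpy as np

def num_dimensionless_groups(dimensional_matrix):
    """Buckingham Pi-Theorem: with n variables whose exponents in the base
    units form the columns of the dimensional matrix, there are
    n - rank(matrix) independent dimensionless groups."""
    A = np.asarray(dimensional_matrix, dtype=float)
    return A.shape[1] - np.linalg.matrix_rank(A)

# Pipe-flow relevance list: pressure drop per length, diameter, velocity,
# density, viscosity, as exponents of the base units (M, L, T).
pipe_flow = [
    # dp/dx   D    v   rho   mu
    [   1,    0,   0,   1,    1],   # mass M
    [  -2,    1,   1,  -3,   -1],   # length L
    [  -2,    0,  -1,   0,   -1],   # time T
]
groups = num_dimensionless_groups(pipe_flow)
```

Five dimensional variables collapse to two dimensionless groups (the Reynolds number and a friction factor), which is the reduction the data-driven modeling step then exploits.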
This report documents a reference model that describes the full scope of functionality that is expected of a Project Support Environment (PSE). The scope includes support for System Engineering, Software Engineering, and Life-Cycle Process Engineering as ...
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of...
B. A. Napier; J. C. Simpson; P. W. Eslinger; J. V. Ramsdell; M. E. Thiede
Federally-funded research and development (R&D) represents a significant annual investment (approximately $79 billion in fiscal year 1996) on the part of U.S. taxpayers. Based on the results of a 10-year study of knowledge diffusion in U.S. aerospace industry, the authors take the position that U.S. competitiveness will be enhanced if knowledge management strategies, employed within a capability-enhancing U.S. technology policy framework, are applied to diffusing the results of federally-funded R&D. In making their case, the authors stress the importance of knowledge as the source of competitive advantage in today's global economy. Next, they offer a practice-based definition of knowledge management and discuss three current approaches to knowledge management implementation-mechanistic, "the learning organization," and systemic. The authors then examine three weaknesses in existing U.S. public policy and policy implementation-the dominance of knowledge creation, the need for diffusion-oriented technology policy, and the prevalence of a dissemination model- that affect diffusion of the results of federally-funded R&D. To address these shortcomings, they propose the development of a knowledge management framework for diffusing the results of federally-funded R&D. The article closes with a discussion of some issues and challenges associated with implementing a knowledge management framework for diffusing the results of federally-funded R&D.
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.
As fuzzy rules can be properly and effectively extracted from the data even if the data are gained from testing, and inconsistent information can be conveniently treated by the established fuzzy system model, the system properties can be objectively characterized by fuzzy systems. This paper discusses the descriptive method and the consistency of the knowledge model of fuzzy systems, and draws a concept and
Huimin Zhao; Mingyan Ding; Wu Deng; Xiumei Li; Wen Li
The National League of Cities' American Institute for Municipal Research, Education and Training undertook to provide consulting services to the Council of American Building Officials (CABO) in support of the Model Solar Energy Document. The tasks undertaken are discussed.
Focuses on the construction and use of a schoolyard model of the Morro Bay watershed in California. Describes the design and use of materials that include styrofoam insulation, crushed granite, cement, and stucco. (DDR)
Purpose: The current paper aims at contributing to the understanding of interorganizational knowledge integration by highlighting the role of individuals' understandings of the task and how they shape knowledge integrating behaviours. Design/methodology/approach: The paper presents a framework of knowledge integration as heedful interrelating.…
This presentation describes the design and implementation of a knowledge based physiologic modeling system (KBPMS) and a preliminary evaluation of its use as a learning resource within the context of an experimental medical curriculum -- the Harvard New Pathway. KBPMS possesses combined numeric and qualitative simulation capabilities and can provide explanations of its knowledge and behaviour. It has been implemented on a microcomputer with a user interface incorporating interactive graphics. The preliminary evaluation of KBPMS is based on anecdotal data which suggests that the system might have pedagogic potential. Much work remains to be done in enhancing and further evaluating KBPMS.
Sixty-three college students worked with a computer simulation in which gradually increasing model complexity (model progression), model progression plus small assignments, or a control condition without either were used. Definition knowledge increased for all three conditions, but intuitive knowledge gain was greater for the two experimental…
Swaak, Janine; Van Joolingen, Wouter R.; de Jong, Ton
This paper shares the findings of NASA's Integrated Learning and Development Program (ILDP) in its effort to reinvigorate the HANDS-ON practice of space systems engineering and project/program management through focused coursework, training opportunities, on-the-job learning and special assignments. Prior to March 2005, NASA responsibility for technical workforce development (the program/project manager, systems engineering, discipline engineering and associated communities) was executed by two parallel organizations. In March 2005 these organizations merged. The resulting program-ILDP-was chartered to implement an integrated competency-based development model capable of enhancing NASA's technical workforce performance as they face the complex challenges of Earth science, space science, aeronautics and human spaceflight missions. Results developed in collaboration with NASA Field Centers are reported. This work led to definition of the agency's first integrated technical workforce development model, known as the Requisite Occupation Competence and Knowledge (the ROCK). Critical processes and products are presented, including: 'validation' techniques to guide model development, the Design-A-CUrriculuM (DACUM) process, and creation of the agency's first systems engineering body-of-knowledge. Findings were validated via nine focus groups from industry and government and with over 17 space-related organizations, at an estimated cost exceeding $300,000 (US). Masters-level programs and training programs have evolved to address the needs of these practitioner communities based upon these results. The ROCK reintroduced rigor and depth to practitioners' development in these critical disciplines, enabling them to take mission concepts from imagination to reality.
The goal of this study was to assess the longitudinal effectiveness and impact of study abroad programs on teachers' content knowledge and professional perspectives. The study focused on a recent Fulbright-Hays Group Project Abroad to Botswana (summer 2011) and compares results with an earlier Fulbright-Hays program to Singapore and Malaysia…
The Lake Model Intercomparison Project (LakeMIP) is an international project initiated by participants of the workshop "Parameterization of Lakes in Numerical Weather Prediction and Climate Modelling" held in September 2008 in St. Petersburg (Zelenogorsk), Russia. LakeMIP offers an opportunity for a comprehensive evaluation and validation of many lake-model formulations and relevant physical parameterizations over a wide range of lake environments, and is expected to provide valuable information for further developments as well as guidance for their application within climate and NWP models. LakeMIP emphasizes the crucial physical mechanisms and their parameterizations, such as vertical turbulent mixing (both shear- and buoyancy-driven) in lakes, solar radiation absorption in the water column, heat exchange at the water-sediment interface, the effects of ice and snow cover, and others. Heat, moisture and momentum fluxes at the water-air interface are of particular interest, given their importance for the coupling with atmospheric models. Special attention in the project is paid to the computational efficiency of lake models, since computational resources for additional routines in climate and NWP models are significantly limited. Currently, seven one-dimensional lake models participate in the LakeMIP project. The first phase of the project includes testing different lake-model formulations, using observed meteorological data to drive them in off-line simulations (i.e. detached from atmospheric models). Lake model outputs (water temperature profiles, surface temperature, heat fluxes, ice thickness, etc.) are compared with one another and with available observations, revealing their capabilities and limitations. Lakes representing different climate conditions and mixing regimes were chosen for simulation in order to encompass the variety of hydrological and thermodynamic regimes.
Two lakes have been considered so far: Sparkling Lake in Wisconsin, USA and Kossenblatter See in Germany. Other lakes are being considered for further experiments, including Toolik Lake in Alaska, USA and the Laurentian Great Lakes. Further steps will consider surface parameterisation schemes for heat flux and momentum transfer, as well as different representations of ice and snow cover. Atmospheric models coupled with one-dimensional lake models, as well as with 3D lake models, will also be involved in LakeMIP. This voluntary project is open to all researchers interested in testing their lake models in standardized conditions and in comparing their performance with other models.
Building Information Modelling (BIM) is an information technology (IT) enabled approach to managing design data in the AEC/FM (Architecture, Engineering and Construction/Facilities Management) industry. BIM enables improved inter-disciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. Despite the apparent benefits the adoption of BIM in
Kerry London; Vishal Singh; Claudelle Taylor; Ning Gu
An efficient three-dimensional, time-dependent prognostic model of the Gulf of Mexico has been developed. The model is driven by winds and surface heat flux derived from climatological atmospheric surface data, the result of an intensive data analysis study. Mean velocity, temperature, salinity, turbulence kinetic energy and turbulence macroscale are the prognostic variables. Lateral boundary conditions for temperature and salinity, and geostrophically derived velocity at the Straits of Yucatan and Florida, are obtained from climatological ocean data. An analytical second-moment turbulence closure scheme embedded within the model provides realistic surface mixed-layer dynamics. Free-surface elevation distributions are calculated with an algorithm that treats the external (tidal) mode separately from the internal mode. The external mode, an essentially two-dimensional calculation, requires a short integration time step, whereas the more costly three-dimensional internal mode can be executed with a long time step. The result is a fully three-dimensional code that includes a free surface at no sacrifice in computer cost compared to rigid-lid models.
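The external/internal mode splitting described above can be sketched as follows. This is a toy illustration only: the function names and linear decay "dynamics" are invented stand-ins, not the model's actual equations; what it shows is the loop structure in which many cheap 2-D external substeps fit inside each long, costly 3-D internal step.

```python
import numpy as np

# Toy sketch of split-explicit time stepping: the fast 2-D external (tidal)
# mode is advanced with many short substeps inside each long step of the
# slow, costly 3-D internal mode. Tendencies are placeholder decay terms.

def step_external(eta, n_sub, dt_ext):
    """Advance free-surface elevation with the short external time step."""
    for _ in range(n_sub):
        eta = eta + dt_ext * (-1e-3 * eta)   # placeholder 2-D tendency
    return eta

def step_internal(T, dt_int):
    """Advance a 3-D prognostic field (e.g. temperature) with the long step."""
    return T + dt_int * (-1e-5 * T)          # placeholder 3-D tendency

dt_int = 600.0                  # long internal-mode step [s]
n_sub = 30                      # external substeps per internal step
dt_ext = dt_int / n_sub         # short external-mode step [s]

eta = np.ones((4, 4))           # free-surface elevation on a small 2-D grid
T = 20.0 * np.ones((4, 4, 8))   # temperature on a small 3-D grid

for _ in range(10):             # ten internal steps
    eta = step_external(eta, n_sub, dt_ext)
    T = step_internal(T, dt_int)
```

The cost saving comes from the fact that the short step is applied only to the two-dimensional field, while the expensive three-dimensional update uses the long step.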
The sewage sludge pathogen transport model predicts the number of Salmonella, Ascaris, and polioviruses which might be expected to occur at various points in the environment along 13 defined pathways. These pathways describe the use of dried or liquid, raw or anaerobically digest...
Motivate, Innovate, Celebrate: an innovative shared governance model was implemented across the London Health Sciences Centre (LHSC) through the establishment of continuous quality improvement (CQI) councils. The model leverages agent-specific knowledge at the point of care and provides a structure aimed at building human resources capacity and sustaining enhancements to quality and safe care delivery. Interprofessional and cross-functional teams work through the CQI councils to identify, formulate, execute and evaluate CQI initiatives. In addition to a structure that facilitates collaboration, accountability and ownership, a corporate CQI Steering Committee provides the forum for scaling up and spreading this model. Point-of-care staff, clinical management and educators were trained in LEAN methodology and patient experience-based design to ensure sufficient knowledge and resources to support the implementation. PMID:24860947
Numerical simulation is a standard practice used to support designing, operating, and monitoring CO2 injection projects. Although a variety of computational tools have been developed that support the numerical simulation process, many are single-purpose or platform specific and have a prescribed workflow that may or may not be suitable for a particular project. We are developing an open-source, flexible framework named Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for various types of projects in a number of scientific domains. The Geologic Sequestration Software Suite (GS3) is a version of this framework with features and tools specifically tailored for geologic sequestration studies. Because of its general nature, GS3 is being employed in a variety of ways on projects with differing goals. GS3 is being used to support the Sim-SEQ international model comparison study, by providing a collaborative framework for the modeling teams and providing tools for model comparison. Another customized deployment of GS3 has been made to support the permit application process. In this case, GS3 is being used to manage data in support of conceptual model development and provide documentation and provenance for numerical simulations. An additional customized deployment of GS3 is being created for use by the United States Environmental Protection Agency (US-EPA) to aid in the CO2 injection permit application review process in one of its regions. These use cases demonstrate GS3's flexibility, utility, and broad applicability.
The problem addressed is that of obtaining reduced-order component models for use in simulating the dynamics of a multibody system. In certain cases, nonlinear system models may be constructed using linear dynamic models for each component, but allowing large angle motion between components. Without some form of model reduction, system models constructed in this manner may be too large for use in control system design and simulation trades. This paper analyzes one method of component model reduction that allows systems level requirements (e.g., capturing the effect of body 1 reaction wheel noise on body 2 camera pointing) to aid in the selection of the reduced-order component models. Briefly stated, important modes are selected at the system level and projected onto the components, and reduced-order components are then assembled into a reduced-order system model that retains the projected modes.
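The projection step in the record above can be sketched numerically. The matrices and partitioning below are invented for illustration (not the paper's actual system): "important" modes are selected at the system level, restricted to one component's degrees of freedom, and the orthonormalized restriction serves as that component's reduced basis.

```python
import numpy as np

# Minimal sketch of projection-based component reduction (illustrative only).
rng = np.random.default_rng(0)

# A symmetric "system" matrix assembled from two components
# (component 1 owns DOFs 0-3, component 2 owns DOFs 4-7).
M = rng.standard_normal((8, 8))
A = M + M.T

# Select "important" system-level modes; here, the three lowest-eigenvalue
# eigenvectors stand in for modes chosen from system requirements.
w, V = np.linalg.eigh(A)
V_keep = V[:, :3]

# Project the retained system modes onto component 1's DOFs and
# orthonormalize to obtain a reduced basis for that component.
P1, _ = np.linalg.qr(V_keep[:4, :])

# Reduced-order component operator retains the projected modes' content.
A1 = A[:4, :4]
A1_red = P1.T @ A1 @ P1
print(A1_red.shape)  # (3, 3)
```

Assembling such reduced components yields a reduced-order system model that retains the system-level modes of interest, which is the idea the paper analyzes.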
Background: The ever-growing wealth of biological information available through multiple comprehensive database repositories can be leveraged for advanced analysis of data. We have now extensively revised and updated the multi-purpose software tool Biofilter, which allows researchers to annotate and/or filter data as well as generate gene-gene interaction models based on existing biological knowledge. Biofilter now has the Library of Knowledge Integration (LOKI) for accessing and integrating existing comprehensive database information, including more flexibility in how ambiguity of gene identifiers is handled. We have also updated the way importance scores for interaction models are generated. In addition, Biofilter 2.0 now works with a range of types and formats of data, including single nucleotide polymorphism (SNP) identifiers, rare variant identifiers, base pair positions, gene symbols, genetic regions, and copy number variant (CNV) location information. Results: Biofilter provides a convenient single interface for accessing multiple publicly available human genetic data sources that have been compiled in the supporting database of LOKI. Information within LOKI includes genomic locations of SNPs and genes, as well as known relationships among genes and proteins such as interaction pairs, pathways and ontological categories. Via Biofilter 2.0 researchers can:
• Annotate genomic location or region based data, such as results from association studies or CNV analyses, with relevant biological knowledge for deeper interpretation;
• Filter genomic location or region based data on biological criteria, such as filtering a series of SNPs to retain only SNPs present in specific genes within specific pathways of interest;
• Generate predictive models for gene-gene, SNP-SNP, or CNV-CNV interactions based on biological information, with priority for models to be tested based on biological relevance, thus narrowing the search space and reducing multiple hypothesis testing.
Conclusions: Biofilter is a software tool that provides a flexible way to use the ever-expanding expert biological knowledge that exists to direct filtering, annotation, and complex predictive model development for elucidating the etiology of complex phenotypic outcomes.
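The "filter" operation described in the Biofilter record can be illustrated with a toy sketch. The SNP-to-gene and pathway assignments below are invented, not drawn from LOKI, and the function is our own illustration rather than Biofilter's API.

```python
# Hypothetical illustration of pathway-based SNP filtering: keep only SNPs
# that fall in genes belonging to a pathway of interest. All mappings are
# invented for this example.
snp_to_gene = {
    "rs1": "BRCA1", "rs2": "TP53", "rs3": "GAPDH", "rs4": "ATM",
}
pathway_genes = {
    "DNA repair": {"BRCA1", "ATM"},
    "apoptosis": {"TP53"},
}

def filter_snps(snps, pathway):
    """Retain only SNPs located in genes of the given pathway."""
    genes = pathway_genes[pathway]
    return [s for s in snps if snp_to_gene.get(s) in genes]

print(filter_snps(["rs1", "rs2", "rs3", "rs4"], "DNA repair"))  # ['rs1', 'rs4']
```

Restricting the SNP set this way is what narrows the search space before interaction models are generated, reducing the multiple-testing burden.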
Photograph of model of projected new hospital building and new landscaping for area north of building 500. Model displayed on the mezzanine level of building 500. - Fitzsimons General Hospital, Bounded by East Colfax to south, Peoria Street to west, Denver City/County & Adams County Line to north, & U.S. Route 255 to east, Aurora, Adams County, CO
A "five-claw" model, based on the idea of the risk profile in NASA, the common risk sources of equipment development in GJB 5852-2006 and the "nine-new" analysis method in CASC, is put forward. The "five-claw" risk assessment model uses design & test, manufacturing, basic parts & general parts to describe the risk of aerospace projects. To assess the extent of a risk described by this new model, a semi-quantitative method is put forward; it avoids the lack of objectivity of qualitative methods and the difficulty of obtaining data for quantitative methods. Although the model and method presented in this paper may not be fully practical in major aerospace projects, they provide a new idea for risk assessment during the development of major aerospace projects.
We present mathematical learning models (predictions of a student's knowledge versus amount of instruction) that are based on assumptions motivated by various theories of learning: tabula rasa, constructivist, and tutoring. These models predict the improvement (on the post-test) as a function of the pretest score due to intervening instruction and also depend on the type of instruction. We introduce a connectedness model whose connectedness parameter measures the degree to which the rate of learning is proportional to prior knowledge. Over a wide range of pretest scores on standard tests of introductory physics concepts, it fits high-quality data nearly within error. We suggest that data from MIT have low connectedness (indicating memory-based learning) because the test used the same context and representation as the instruction, and that more connected data from the University of Minnesota resulted from instruction in a different representation from the test.
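One simple reading of the connectedness idea can be sketched numerically. This is our own toy interpretation, not necessarily the authors' exact equations: the learning rate interpolates between being independent of prior knowledge (c = 0, tabula-rasa-like) and proportional to prior knowledge (c = 1), with a (1 - K) ceiling so scores stay below 1; the gain parameter g and the functional form are assumptions.

```python
# Toy connectedness model: post-test score as a function of pretest score.
# dK/dt = g * [(1 - c) + c * K] * (1 - K), integrated over one unit of
# instruction. c is the connectedness parameter; g is an assumed gain.
def posttest(pre, c, g=2.0, steps=100):
    K = pre
    dt = 1.0 / steps
    for _ in range(steps):
        K = K + dt * g * ((1 - c) + c * K) * (1 - K)
    return K

# Post-test predictions for several pretest scores, without and with
# connectedness (c = 0 vs c = 1).
for pre in (0.2, 0.5, 0.8):
    print(round(posttest(pre, c=0.0), 3), round(posttest(pre, c=1.0), 3))
```

Under this sketch, low connectedness lifts all students by a similar amount, while high connectedness makes the gain depend strongly on the pretest score, which is the qualitative behavior the abstract describes.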
The Phase-0 CSI Evolutionary Model (CEM) is a testbed for the study of space platform global line-of-sight (LOS) pointing. Now that the tests have been completed, a summary of hardware and closed-loop test experiences is necessary to ensure a timely dissemination of the knowledge gained. The testbed is described and modeling experiences are presented, followed by a summary of the research performed by various investigators. Some early lessons on implementing the closed-loop controllers are described, with particular emphasis on real-time computing requirements. A summary of closed-loop studies and a synopsis of test results are presented. Plans for evolving the CEM from phase 0 to phases 1 and 2 are also described. Subsequently, a summary of knowledge gained from the design and testing of the Phase-0 CEM is made.
Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.
Building knowledge-based telemedicine systems that deliver high-quality services is still a challenge. The availability and capability of different human, communication and material resources play an important role in the telemedical task management process, especially in emergency scenarios. In this paper we propose a knowledge model enabling intelligent, ubiquitous telemedicine task management. The objective of this model is to support the quality of telemedical services delivered by web-based telemedicine applications. The methodology is based on a telemedicine task ontology representing the concepts and their interrelations, and on a set of rules to be applied by a reasoner for decision making. This architecture is designed to optimize the message exchange among the different actors in telemedicine systems, consequently providing more rapid and reliable telemedicine assistance. PMID:19745351
SHARP is a European INTERREG IVc Program. It focuses on the exchange of innovative technologies to protect groundwater resources for future generations by considering climate change and the different geological and geographical conditions. Regions involved are Austria, United Kingdom, Poland, Italy, Macedonia, Malta, Greece and Germany. They will exchange practical know-how and also determine know-how demands concerning SHARP’s key contents: general groundwater management tools, artificial groundwater recharge technologies, groundwater monitoring systems, strategic use of groundwater resources for drinking water, irrigation and industry, techniques to preserve water quality and quantity, drinking water safety plans, risk management tools and water balance models. SHARP outputs and results will influence regional policy in the frame of sustainable groundwater management to preserve and improve the quality and quantity of groundwater reservoirs for future generations. The main focus of the Saxon State Office for Environment, Agriculture and Landscape in this project is the enhancement and purposive use of water balance models. Since 1992, scientists have compared different existing water balance models on different scales, coupled with groundwater models. For example, in the KLIWEP project (Assessment of Impacts of Climate Change Projections on Water and Matter Balance for the Catchment of the River Parthe in Saxony) the coupled model WaSiM-ETH - PCGEOFIM® was used to study the impact of climate change on water balance and water supplies. The project KliWES (Assessment of the Impacts of Climate Change Projections on Water and Matter Balance for Catchment Areas in Saxony), still running, comprises studies of the fundamental effects of climate change on catchments in Saxony.
The project objective is to assess Saxon catchments according to the vulnerability of their water resources to climate change projections, in order to derive region-specific recommendations for management actions. The model comparisons within reference areas showed significant differences in outcome: the values of water balance components calculated with different models can differ by a multiple of their value. The SHARP project was prepared in several previous projects that tested suitable water balance models, and it is now able to assist the knowledge transfer.
The Nordic protein project demonstrates a model for the process of achieving analytical quality. Goal: based on use of common reference intervals leading to the quality specifications. Creation of quality: through common high quality calibrator (with IFCC-values) (external factor) and individual trouble-shooting and guidelines (internal factor). Control of quality: with specially designed set of control samples and problem-related evaluation of control data. Establishing common reference intervals: through associated projects. PMID:8465144
Water problems are often bigger than the technical and data challenges associated with representing a water system using a model. Controversy and complexity are inherent when water is to be allocated among different uses, making it difficult to maintain coherent and productive discussions on addressing water problems. Quantification of a water supply system through models has proven helpful for improving understanding and for exploring and developing adaptable solutions to water problems. However, models often become too large and complex, and become hostages of endless discussions of their assumptions, algorithms and limitations. Data management, organization and documentation keep a model flexible and useful over time. The UC Davis HOBBES project is a new approach, building models from the bottom up. Reversing traditional model development, where data are arranged around a model algorithm, in HOBBES the data structure, organization and documentation are established first, followed by application of simulation or optimization modeling algorithms for the particular problem at hand. The HOBBES project establishes standards for storing, documenting and sharing datasets on the California water system. This allows models to be developed and modified more easily and transparently, with greater comparability. Elements in the database have a spatial definition and can aggregate several infrastructural elements into detailed to coarse representations of the water system. Elements in the database represent reservoirs, groundwater basins, pumping stations, hydropower and water treatment facilities, demand areas and conveyance infrastructure statewide. These elements also host time series, economic and other information from hydrologic, economic, climate and other models. This presentation provides an overview of the HOBBES project, its applications and prospects for California and elsewhere.
Medellin-Azuara, J.; Sandoval Solis, S.; Lund, J. R.; Chu, W.
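The "data first" idea in the HOBBES record can be sketched with a toy element record. All field names and values below are invented for illustration, not HOBBES's actual schema: each element carries its own spatial definition, documentation and time series, and a model algorithm is applied on top of the documented data afterwards.

```python
# Hypothetical sketch of a data-first infrastructure element (invented schema).
reservoir = {
    "type": "reservoir",
    "name": "Example Reservoir",            # invented
    "coordinates": [-121.5, 38.6],          # spatial definition
    "capacity_taf": 1000,                   # thousand acre-feet
    "sources": ["example agency dataset"],  # documentation / provenance
    "timeseries": {"inflow_taf": [50, 80, 120]},
}

def total_inflow(element):
    """A model 'algorithm' applied on top of the documented data."""
    return sum(element["timeseries"]["inflow_taf"])

print(total_inflow(reservoir))  # → 250
```

Because the dataset, not the algorithm, is the stable artifact, different simulation or optimization routines can be pointed at the same documented elements, which is what makes the models comparable and transparent.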
Capability Maturity Model Integration (CMMI) is one of the well-known models that provide best practices for software quality improvement. Many articles praise the benefits of CMMI adoption, such as enhanced knowledge management of software development, improved software quality and increased efficiency of software development. However, these intangibles, especially those relating to knowledge management, have not been investigated yet. To build
This paper utilizes Moran's I index to investigate the spatial correlation of regional technology innovation in China. Referring to the knowledge production function (KPF) of Griliches-Jaffe and spatial lag econometric models based on panel data, the effects of four domestic factors are examined through the panel data of 29 provinces (cities/municipalities) for the period 2000~2008. The study has
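Global Moran's I, the statistic used in the study above, is straightforward to compute. The toy attribute values and binary contiguity weight matrix below are invented for illustration; smoothly varying values give a positive I, alternating values a negative one.

```python
import numpy as np

# Global Moran's I: I = n * sum_ij w_ij z_i z_j / (sum_ij w_ij * sum_i z_i^2),
# where z are deviations from the mean and W is a spatial weight matrix.
def morans_i(x, W):
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = x - x.mean()
    num = n * np.sum(W * np.outer(z, z))
    den = W.sum() * np.sum(z ** 2)
    return num / den

# Four regions on a line; each neighbors the adjacent ones (binary contiguity).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

x = [1.0, 2.0, 3.0, 4.0]  # smoothly increasing: positive spatial correlation
print(round(morans_i(x, W), 3))  # → 0.333
```

In practice the weight matrix would encode province adjacency or distance, and the attribute would be an innovation measure such as patent counts.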
Purpose – This paper aims to review the concepts and constructs of some common models and frameworks advocated for knowledge management (KM) and organisational learning (OL) in literature. It sets forth a critical enquiry towards the integration of KM and OL practices and their relationship with the concepts of the learning organisation (LO) and chaordic organisation/enterprise (CO/CE). Design/methodology/approach – A
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has therefore become an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
This paper contributes to the area of software engineering for Semantic Web development. We describe how to apply MAS-CommonKADS, an agent-oriented extension of CommonKADS, to the development of the ITtalks Web Portal. Domain-specific knowledge is modelled by reusing well-known ontologies such as FOAF and RDFiCal. We also describe how to specify CommonKADS problem-solving methods as web services, expressed using
Knowledge about ecological concepts, about wildlife and about endangered and threatened species was measured using over 1,300 eighth graders in Broward County, Florida. Knowledge scores were correlated with attitudes, non-consumptive attitude orientations, demographic characteristics, level of animal activities, and other variables. Study results…
We describe a collection of MATLAB functions for model reduction of linear, time-invariant systems. All MATLAB functions described here employ, in one way or another, spectral projection methods such as the sign function. Included are implementations of modal and balanced truncation as well as selected balancing-related model reduction algorithms. Several of the balancing-related model reduction functions are not yet
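Balanced truncation, one of the methods the collection implements, can be sketched compactly. The square-root variant below is a standard formulation (the MATLAB functions themselves use sign-function-based spectral projections rather than the Lyapunov solver called here), and the test system is invented: a stable diagonal system whose fast modes contribute little to the input-output map.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# Square-root balanced truncation for a stable LTI system
# x' = A x + B u,  y = C x.
def balanced_truncation(A, B, C, r):
    # Gramians: A Wc + Wc A' + B B' = 0 and A' Wo + Wo A + C' C = 0.
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    S = cholesky(Wc, lower=True)          # Wc = S S'
    R = cholesky(Wo, lower=True)          # Wo = R R'
    U, hsv, Vt = svd(R.T @ S)             # hsv: Hankel singular values
    U1, V1 = U[:, :r], Vt[:r, :].T
    Sig = np.diag(hsv[:r] ** -0.5)
    T = S @ V1 @ Sig                      # balancing projection matrices
    Ti = Sig @ U1.T @ R.T
    return Ti @ A @ T, Ti @ B, C @ T, hsv

# Invented example: two slow (dominant) and two fast (negligible) modes.
A = np.diag([-1.0, -2.0, -50.0, -60.0])
B = np.ones((4, 1))
C = np.ones((1, 4))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
```

The discarded Hankel singular values bound the approximation error, so truncating the two fast modes changes the transfer function only slightly.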
The Life-Involvement Model (LIM) Project consisted of four major thrusts: a school-based operational program; a university-based, but field-centered, preservice teacher education program; a school-based in-service teacher education program; and curricular and instructional developmental work designed to support the above three operational programs…
This chapter provides an overview of the Igliniit project, an International Polar Year (IPY) project that took place in Clyde River, Nunavut, from 2006 to 2010. As part of the larger IPY projects, SIKU and ISIUOP, the Igliniit project brought Inuit hunters and geomatics engineering students together to design, build, and test a tool to assist hunters in documenting their
This study sought to understand the perceptions of American Indian educators as they made their way through a pre-service school administrator preparation program at a large, public research university. The Model of American Indian School Administrators, or "Project MAISA", prepares American Indian/Alaska Native teachers to obtain Master's degrees…
Christman, Dana; Guillory, Raphael; Fairbanks, Anthony; Gonzalez, Maria Luisa
This paper reports on the work of a European Commission DG Education and Culture co-financed project PBP-VC, Promoting Best Practice in Virtual Campuses, which is aimed at providing a deeper understanding of the key issues and critical success factors underlying the implementation of virtual campuses. The paper outlines a tentative model of issues…
The SEG Advanced Modeling Project (SEAM) is a consortium run by the Society of Exploration Geophysicists whose goal is to use numerical modeling to develop geophysical datasets that mimic those used for exploration and characterization of petroleum resources. Phase I of the project is underway and involves the calculation of a large 3D seismic exploration dataset in deep water using acoustic-wave modeling. This phase has several aspects: (1) development of a realistic geological and geophysical model of a region that contains a salt body with subsalt petroleum reservoirs. The salt structure is derived from a well-characterized salt body in the Gulf of Mexico. Surrounding sediments and structures within the sediments are based on geological and seismological information representative of typical Gulf of Mexico geology. The initial model will be acoustic with variable density and compressional-wave velocities; a subsequent model will be elastic and possibly contain anisotropy. EM and gravity models are also being considered. (2) Characterization and development of a suitable numerical modeling scheme. The scheme must be capable of reproducing reliable amplitude and traveltime information for wave propagation in the realistic model for frequencies up to approximately 20 Hz and propagation distances of several hundred wavelengths. (3) Development of an acquisition scheme that includes shotpoints and locations of receivers both on the surface and within the Earth's subsurface. (4) Numerical modeling for tens of thousands of shot points located on the Earth's surface. (5) Storage of data and its distribution to industrial, academic and government laboratory investigators who will use the data for a variety of individually developed research projects.
The overall goal of Phase I of the project is to provide a large, relatively high-frequency, high-fidelity dataset for a realistic petroleum resource located in deep water that researchers can use to develop improved approaches for analyzing seismic data for petroleum exploration and reservoir characterization.
Including the impacts of climate change in decision making and planning processes is a challenge facing many regional governments, including the New South Wales (NSW) and Australian Capital Territory (ACT) governments in Australia. NARCliM (NSW/ACT Regional Climate Modelling project) is a regional climate modelling project that aims to provide a comprehensive and consistent set of climate projections that can be used by all relevant government departments when considering climate change. To maximise end-user engagement and ensure outputs are relevant to the planning process, a series of stakeholder workshops were run to define key aspects of the model experiment, including spatial resolution, time slices, and output variables. As with all such experiments, practical considerations limit the number of ensemble members that can be simulated, so choices must be made concerning which Global Climate Models (GCMs) to downscale from and which Regional Climate Models (RCMs) to downscale with. Here a methodology for making these choices is proposed that aims to sample the uncertainty in both GCMs and RCMs, as well as spanning the range of future climate projections present in the full GCM ensemble. The created ensemble provides a more robust view of future regional climate changes.
Evans, J. P.; Ji, F.; Lee, C.; Smith, P.; Argüeso, D.; Fita, L.
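One way to make a subset of GCMs "span the range" of the full ensemble, in the spirit of the NARCliM selection goal, can be sketched as follows. The model names, projected warming values, and quantile-picking rule below are all invented for illustration; they are not NARCliM's actual models or its exact selection procedure.

```python
import numpy as np

# Hypothetical projected regional warming [K] for six GCMs (invented numbers).
gcm_dT = {
    "gcm_a": 1.1, "gcm_b": 1.7, "gcm_c": 2.4,
    "gcm_d": 3.0, "gcm_e": 2.1, "gcm_f": 1.4,
}

def span_subset(changes, k):
    """Pick k models at evenly spaced ranks of the projected change,
    so the subset covers the full ensemble's range."""
    names = sorted(changes, key=changes.get)           # rank by warming
    idx = np.linspace(0, len(names) - 1, k).round().astype(int)
    return [names[i] for i in idx]

print(span_subset(gcm_dT, 3))  # lowest-, mid- and highest-warming models
```

A real selection would also screen models for skill over the region and sample more than one variable, but the principle of covering the projection spread is the same.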
UNESCO became the umbrella organization for SESAME at its Executive Board's 164th session, May 2002. The following comments about SESAME were made by this board: ``a quintessential UNESCO project combining capacity building with vital peace-building through science'' and ``a model project for other regions.'' Now that SESAME is well underway, other regions (e.g., Africa and Central Asia) should be made aware of this progress, and they should be welcomed to join SESAME as a first step in developing similar projects in their regions. Students and scientists from other regions should be encouraged to attend SESAME Users' meetings, schools, workshops, etc., where they can learn about synchrotron radiation sources, beamlines, and science. They should be invited to join SESAME scientists in designing and commissioning accelerators and beamlines, gaining relevant experience for their own projects and helping SESAME in the process.
Irreducible uncertainties due to the limitations of knowledge, the chaotic nature of the climate system and the human decision-making process drive uncertainties in climate change projections. Such uncertainties affect impact studies, mainly when associated with extreme events, and complicate the decision-making process aimed at mitigation and adaptation. However, these uncertainties also make it possible to develop exploratory analyses of a system's vulnerability to different scenarios. The use of projections from different climate models allows uncertainty issues to be addressed, using multiple runs to explore a wide range of potential impacts and their implications for potential vulnerabilities. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analyses and could have great implications in dynamic complex systems, mainly under climate change transformations. For this reason, nonstationarity must be considered in the statistical distribution parameters. We carried out a study of the dispersion in projections of hydrological extremes using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied in Amazonian sub-basins. This model is a large-scale hydrological model that uses a TOPMODEL approach to solve runoff generation processes at the grid-cell scale. The MHD-INPE model was calibrated for 1970-1990 using observed meteorological data and comparing observed and simulated discharges using several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future period (2010-2100).
Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the model runs during the historical period, when fed with climate model data, was tested using descriptors of the flow duration curves. The analyses of projected extreme values were carried out considering the nonstationarity of the GEV distribution parameters and compared with extreme events in the present time. Results show inter-model variability in a broad dispersion of projected extreme values. Such dispersion implies different degrees of socio-economic impacts associated with extreme hydrological events. Although no single optimum result exists, this variability allows the analysis of adaptation strategies and their potential vulnerabilities.
Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier
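The nonstationary GEV analysis described in the abstract above can be sketched roughly as follows. All data, the linear trend in the location parameter, and the starting values are synthetic stand-ins for the MHD-INPE discharge projections; a linear trend is only one possible nonstationary form.

```python
# Sketch: fit a GEV distribution whose location parameter drifts linearly
# in time, mu(t) = mu0 + mu1 * t, by maximum likelihood. Note that SciPy's
# shape parameter c is the negative of the conventional GEV shape xi.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = np.arange(50, dtype=float)                      # years since start
true_loc = 100.0 + 0.8 * t                          # invented upward trend
maxima = genextreme.rvs(c=-0.1, loc=true_loc, scale=15.0, random_state=rng)

def neg_loglik(params, t, x):
    mu0, mu1, log_scale, shape = params
    loc = mu0 + mu1 * t
    # negative log-likelihood; +inf outside the support is fine for Nelder-Mead
    return -genextreme.logpdf(x, c=shape, loc=loc, scale=np.exp(log_scale)).sum()

res = minimize(neg_loglik, x0=[90.0, 0.5, np.log(10.0), -0.05],
               args=(t, maxima), method="Nelder-Mead")
mu0_hat, mu1_hat, log_scale_hat, shape_hat = res.x
print(f"estimated trend in location: {mu1_hat:.2f} per year")
```

Return levels computed from the fitted time-varying parameters can then be compared across the runs of different climate models to quantify the inter-model dispersion the authors describe.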
The purpose of this study was to explore the relationship between teachers' (N = 69) participation in constructivist chemistry professional development (PD) and enhancement of content knowledge (CK), pedagogical content knowledge (PCK) (representational thinking and conceptual change strategies), and self-efficacy (PSTE). Quantitative measures assessed CK, PCK, and PSTE. Document analysis focused on PCK. Elementary teachers gained CK, PCK, and PSTE, and designed lessons to advance thinking from macroscopic to abstract models. Middle/secondary teachers gained PSTE and PCK, and introduced macroscopic models to develop understanding of previously taught abstract models. All implemented representational thinking and conceptual change strategies. Results suggest that: (1) constructivist PD meets the needs of teachers of varying CK, and (2) instruction should connect representational models with alternative conceptions, integrating radical and social constructivism.
This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has successfully progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper continues by giving an overview of the graphical modeling objectives of the work, describing the three tools that now populate the KFP environment, briefly presenting a discussion of related work in the field, and indicating future directions for the KFP environment.
Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt
Process simulation models are usually empirical, and there is therefore an inherent difficulty in their serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning that would facilitate verification of their dependence on production conditions; in such cases, a 'black box' regression model or a neural network might be used simply to connect input-output characteristics. In several cases, scientific/mechanistic models may prove valid, in which case parameter identification is required to determine the independent/explanatory variables on which each parameter depends. This is a difficult task, since the phenomenological level at which each parameter is defined differs. In this paper, we have developed a methodological framework in the form of an algorithmic procedure to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge into discrete layers immediately adjacent to the layer to which the initial model under investigation belongs, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated with two representative case examples on wastewater treatment.
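Steps (iii) and (iv) of the procedure described above can be illustrated with a minimal sketch. The node names and relevance weights below are invented for illustration and are not taken from the paper; a real ontology would carry many more nodes and typed relations.

```python
# Ontology edges as (node_a, node_b) -> relevance weight in [0, 1];
# all names and weights are hypothetical.
edges = {
    ("kinetic_rate", "temperature"): 0.9,
    ("kinetic_rate", "pH"): 0.7,
    ("kinetic_rate", "reactor_colour"): 0.1,
    ("yield_coeff", "substrate_type"): 0.8,
    ("yield_coeff", "operator_shift"): 0.2,
}

def thin(edges, threshold):
    """Step (iii): eliminate the less relevant parts of the ontology."""
    return {pair: w for pair, w in edges.items() if w >= threshold}

def strongest_links(edges, top_n):
    """Step (iv): retrieve the stronger interrelations, strongest first."""
    return sorted(edges, key=edges.get, reverse=True)[:top_n]

thinned = thin(edges, threshold=0.5)
print(strongest_links(thinned, top_n=3))
# → [('kinetic_rate', 'temperature'), ('yield_coeff', 'substrate_type'), ('kinetic_rate', 'pH')]
```

Step (v), parameter identification, would then regress each model parameter only on the explanatory variables that survive the thinning.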
Project W-320 will be retrieving waste from Tank 241-C-106 and transferring the waste to Tank 241-AY-102. Waste in both tanks must be maintained below applicable thermal limits during and following the waste transfer. Thermal hydraulic process control models will be used for process control of the thermal limits. This report documents the process control models and presents a benchmarking of the models with data from Tanks 241-C-106 and 241-AY-102. Revision 1 of this report will provide a baselining of the models in preparation for the initiation of sluicing.
It is well known that fair competition is the basic principle of a market economy, that the coexistence of competition and cooperation is the tendency of future development, and that achieving win-win results is the foundation of cooperation for both players in a game. This paper applies game theory to construction projects to establish a tender model for the proprietor
Ai-Zu Chen; Chun-Dong Guo; Shi-Min Wang; Jun Zhang
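The "win-win" idea in the abstract above can be made concrete with a toy coordination game between proprietor and contractor. The payoff numbers below are invented, not taken from the paper; they form a stag-hunt-style game in which mutual cooperation is one of the equilibria.

```python
# strategies: 0 = cooperate, 1 = compete
# payoffs[(proprietor, contractor)] = (proprietor's payoff, contractor's payoff)
payoffs = {
    (0, 0): (4, 4),   # mutual cooperation: the win-win outcome
    (0, 1): (0, 3),
    (1, 0): (3, 0),
    (1, 1): (2, 2),   # mutual competition
}

def pure_nash(payoffs):
    """Enumerate pure-strategy Nash equilibria of a 2x2 game:
    profiles where neither player gains by deviating unilaterally."""
    eqs = []
    for (r, c), (pr, pc) in payoffs.items():
        best_r = all(pr >= payoffs[(r2, c)][0] for r2 in (0, 1))
        best_c = all(pc >= payoffs[(r, c2)][1] for c2 in (0, 1))
        if best_r and best_c:
            eqs.append((r, c))
    return sorted(eqs)

equilibria = pure_nash(payoffs)
print(equilibria)  # both (cooperate, cooperate) and (compete, compete)
```

Both (0, 0) and (1, 1) are equilibria here, which captures the abstract's point: cooperation must be made the focal outcome, since competition is also self-sustaining.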
Accessing Curriculum Through Technology Tools (ACTTT), a project funded by the U.S. Office of Special Education Programs (OSEP), developed and tested a model designed to allow children in early elementary school, including those "at risk" and with disabilities, to better access, participate in, and benefit from the general curriculum. Children in…
Daytner, Katrina M.; Johanson, Joyce; Clark, Letha; Robinson, Linda
This document contains materials and information used during and developed by a model 2+2 electronics technology program development project conducted by Gainesville High School and Cooke County College, Texas. A procedures manual provides information on grant application, surveys, committees, curricula, articulation agreement, and goals and…
With most of the data available from the Lake Michigan Mass Balance Project field program, the modeling efforts have begun in earnest. The tributary and atmospheric load estimates are or will be completed soon, so realistic simulations for calibration are beginning. A Quality Ass...
This report describes 38 model school-business partnerships that are being conducted in South Carolina. The 38 reports were gathered from 24 school districts and 3 statewide projects. Criteria for selection were that the partnerships must be in some way exemplary of the program and the school district must have reported in some detail their…
South Carolina State Dept. of Education, Columbia. Div. of Public Accountability.
This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…
Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam
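A minimal version of the kind of data-modeling exercise described in the record above might fit a logistic curve to observed data. The functional form, parameter values, and noise level here are all invented for illustration.

```python
# Fit a logistic model P(t) = K / (1 + A * exp(-r t)) to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, A, r):
    """Logistic growth with carrying capacity K and growth rate r."""
    return K / (1.0 + A * np.exp(-r * t))

t = np.linspace(0.0, 10.0, 25)
rng = np.random.default_rng(42)
# synthetic "population" observations around a known logistic curve
data = logistic(t, K=500.0, A=20.0, r=0.9) + rng.normal(0.0, 5.0, t.size)

(K_hat, A_hat, r_hat), _ = curve_fit(logistic, t, data, p0=[400.0, 10.0, 0.5])
print(f"carrying capacity ≈ {K_hat:.1f}, growth rate ≈ {r_hat:.2f}")
```

Students can then compare the fitted carrying capacity and growth rate against the values used to generate the data.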
The application of knowledge-based expert systems to the productivity analysis of construction operations was studied. Prototype knowledge-based systems were developed for the precast-concrete-erection construction operation. A diagnostic knowledge-based system was developed that gave advice concerning expected productivity and the effects of production problems on productivity. A predictive knowledge-based system was also developed that used expert knowledge about productivity to predict the duration of the operation, and it was found to give reasonable predictions of project duration. A SIMAN simulation model of the precast concrete construction operation was also developed to predict the duration of the operation. The input to the simulation model was derived from observations of installation durations at a precast-concrete construction project. The knowledge-based system and simulation models were found to produce similar estimates of project duration.
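A rough stand-in for the simulation side of the comparison above is a Monte Carlo model that sums per-element installation times. The triangular-distribution parameters (in minutes) and element count are invented; the actual SIMAN model was driven by observed installation durations.

```python
# Monte Carlo estimate of total erection duration: each precast element's
# installation time is drawn from a triangular distribution fitted, in
# practice, to field observations. All parameters here are hypothetical.
import random

def simulate_duration(n_elements, low=20.0, mode=30.0, high=55.0,
                      n_runs=5000, seed=1):
    random.seed(seed)
    totals = []
    for _ in range(n_runs):
        # note random.triangular's argument order is (low, high, mode)
        totals.append(sum(random.triangular(low, high, mode)
                          for _ in range(n_elements)))
    totals.sort()
    return {"mean": sum(totals) / n_runs,
            "p90": totals[int(0.9 * n_runs)]}

est = simulate_duration(n_elements=40)
print(f"mean ≈ {est['mean']:.0f} min, 90th percentile ≈ {est['p90']:.0f} min")
```

Comparing such a distributional estimate against the point prediction of a knowledge-based system is one way to cross-validate the two approaches, as the study did.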
This simulation-based module illustrates the universe as envisioned by early thinkers, culminating in a detailed look at the geocentric and heliocentric models. The models demonstrate the paths of the planets and the sun during their orbits, and also trace planetary and solar paths through the zodiac. Instructor resources are available including student manuals, assessment materials, and a list of the assumptions used. This resource is part of a larger collection of packaged curriculum materials by the Nebraska Astronomy Applet Project.
The Triple Helix model of university-industry-government relations can be generalized from a neo-institutional model of networks of relations to a neo-evolutionary model of how three (or more) social coordination mechanisms operate as selection environments upon one another. The mutual information among the three contexts (wealth generation, knowledge production, and political control) provides us with an indicator of the knowledge base
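The mutual-information indicator mentioned above is commonly computed, in the Triple Helix literature, as the three-way configurational information T = H_u + H_i + H_g − H_ui − H_ug − H_ig + H_uig over the university, industry, and government dimensions. The joint distribution below is invented for illustration.

```python
# Three-way mutual information over a hypothetical contingency table of
# publications classified by university (u), industry (i), and
# government (g) involvement (0 = absent, 1 = present).
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a (possibly multi-dimensional) distribution."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

counts = np.array([[[10, 2], [3, 8]],
                   [[4, 9], [7, 1]]], dtype=float)   # invented counts
p = counts / counts.sum()

H_u = entropy(p.sum(axis=(1, 2)))
H_i = entropy(p.sum(axis=(0, 2)))
H_g = entropy(p.sum(axis=(0, 1)))
H_ui = entropy(p.sum(axis=2))
H_ug = entropy(p.sum(axis=1))
H_ig = entropy(p.sum(axis=0))
H_uig = entropy(p)

T = H_u + H_i + H_g - H_ui - H_ug - H_ig + H_uig
print(f"T_uig = {T:.3f} bits")
```

In this formulation a negative T is usually read as synergy among the three coordination mechanisms, which is what makes it useful as an indicator of the knowledge base.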
PHRplus is required to produce a Knowledge Building Agenda to guide research activities under the project. The first part (Part A) of the Knowledge Building Agenda reviews (1) ongoing PHRplus work (primarily technical assistance activities) with a view to...
S. Bennett; D. Brinkerhoff; L. Franco; C. Leighton; M. Paterson; N. Rafeh
PROJECT was developed to assist EPA's enforcement program in its penalty assessment responsibilities. The PROJECT Model evaluates the after-tax net present value of a supplemental environmental project (SEP). Violators propose SEPs to mitigate their penalty liability. SEPs are appearing with increasing frequency in enforcement negotiations. In addition, SEPs allow the Agency to effect some extra environmental improvement. This model allows personnel without any background in accounting or corporate finance to determine the real cost of a SEP to a defendant for sett…
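The core calculation the record describes, an after-tax net present value of a SEP's costs, can be sketched as follows. This is a generic NPV sketch, not the actual PROJECT model's formula; the cost figures, tax rate, discount rate, and the assumption that operating costs are tax-deductible are all illustrative.

```python
# Hedged sketch: after-tax net present value of a SEP's cost to the
# defendant, assuming an upfront capital outlay plus tax-deductible
# annual operating costs discounted at a fixed rate.
def after_tax_npv(capital, annual_cost, years, tax_rate, discount_rate):
    """Upfront capital plus the discounted stream of after-tax
    annual operating costs over the project's lifetime."""
    npv = capital
    for t in range(1, years + 1):
        npv += annual_cost * (1.0 - tax_rate) / (1.0 + discount_rate) ** t
    return npv

cost = after_tax_npv(capital=100_000, annual_cost=10_000, years=5,
                     tax_rate=0.35, discount_rate=0.07)
print(f"real cost of the SEP ≈ ${cost:,.0f}")
```

A tool of this shape lets non-specialists see how tax treatment and discounting shrink the nominal cost of a proposed SEP, which is the point of the model described above.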