Expert Knowledge-Based Automatic Sleep Stage Determination by Multi-Valued Decision Making Method
NASA Astrophysics Data System (ADS)
Wang, Bei; Sugi, Takenao; Kawana, Fusae; Wang, Xingyu; Nakamura, Masatoshi
In this study, an expert knowledge-based automatic sleep stage determination system working on a multi-valued decision making method is developed. Visual inspection by a qualified clinician is adopted to obtain the expert knowledge database. The expert knowledge database consists of probability density functions of parameters for various sleep stages. Sleep stages are determined automatically according to the conditional probability. In total, four subjects participated. The automatic sleep stage determination results showed close agreement with the visual inspection for the sleep stages of awake, REM (rapid eye movement), light sleep and deep sleep. The constructed expert knowledge database reflects the distributions of characteristic parameters and can be adapted to the variable sleep data encountered in hospitals. The developed automatic determination technique based on expert knowledge of visual inspection can serve as an assistive tool enabling further inspection of sleep disorder cases in clinical practice.
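The stage-selection rule described above, choosing the stage whose expert-knowledge probability density best explains the observed parameters, can be sketched as follows. This is a minimal illustration only: Gaussian kernel density estimates stand in for the expert database, and the parameter values, stage priors and stage labels are hypothetical, not those of the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical expert-knowledge database: for each stage, a kernel density
# estimate fitted to 2-D parameter vectors (e.g., an EEG amplitude measure and
# a spectral ratio) taken from epochs scored by a clinician.
expert_db = {
    "awake":       gaussian_kde(rng.normal([30.0, 0.60], [8.0, 0.10], (200, 2)).T),
    "REM":         gaussian_kde(rng.normal([20.0, 0.20], [6.0, 0.08], (200, 2)).T),
    "light sleep": gaussian_kde(rng.normal([50.0, 0.10], [9.0, 0.05], (200, 2)).T),
    "deep sleep":  gaussian_kde(rng.normal([80.0, 0.05], [9.0, 0.03], (200, 2)).T),
}
priors = {stage: 0.25 for stage in expert_db}   # assumed uniform stage priors

def determine_stage(params):
    """Choose the stage with the largest conditional probability p(stage | params)."""
    scores = {s: kde(params)[0] * priors[s] for s, kde in expert_db.items()}
    total = sum(scores.values())
    return max(scores, key=scores.get), {s: v / total for s, v in scores.items()}

stage, posterior = determine_stage(np.array([78.0, 0.04]))
print(stage, {s: round(p, 3) for s, p in posterior.items()})
```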
Non-Linear Effects in Knowledge Production
NASA Astrophysics Data System (ADS)
Purica, Ionut
2007-04-01
The generation of technological knowledge is paramount to our present development; the production of technological knowledge is governed by the same Cobb-Douglas-type model, with the means of research and the level of intelligence replacing capital and labor, respectively. We explore the basic behavior of present-day economies that produce technological knowledge alongside the `usual' industrial production, and determine a basic behavior that turns out to be a `Henon attractor'. Measures are introduced for the gain of technological knowledge and for the information of technological sequences, based respectively on the underlying multi-valued modal logic of technological research and on nonlinear thermodynamic considerations.
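As a hedged illustration of the Cobb-Douglas analogy and of the Henon map referred to above (the symbols below are ours, not necessarily the author's notation):

```latex
% Knowledge production in Cobb-Douglas form: K is the produced technological
% knowledge, R the means of research (in place of capital), I the intelligence
% level (in place of labor); A, \alpha, \beta are constants.
K = A\, R^{\alpha} I^{\beta}
% The standard Henon map, the attractor type identified in the abstract
% (a and b are the usual map parameters):
\begin{aligned}
x_{n+1} &= 1 - a x_n^{2} + y_n, \\
y_{n+1} &= b x_n .
\end{aligned}
```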
Evolutionary Oseen Model for Generalized Newtonian Fluid with Multivalued Nonmonotone Friction Law
NASA Astrophysics Data System (ADS)
Migórski, Stanisław; Dudek, Sylwia
2018-03-01
The paper deals with the non-stationary Oseen system of equations for the generalized Newtonian incompressible fluid with multivalued and nonmonotone frictional slip boundary conditions. First, we provide a result on existence of a unique solution to an abstract evolutionary inclusion involving the Clarke subdifferential term for a nonconvex function. We employ a method based on a surjectivity theorem for multivalued L-pseudomonotone operators. Then, we exploit the abstract result to prove the weak unique solvability of the Oseen system.
NASA Astrophysics Data System (ADS)
Shim, Jaewoo; Oh, Seyong; Kang, Dong-Ho; Jo, Seo-Hyeon; Ali, Muhammad Hasnain; Choi, Woo-Young; Heo, Keun; Jeon, Jaeho; Lee, Sungjoo; Kim, Minwoo; Song, Young Jae; Park, Jin-Hong
2016-11-01
Recently, negative differential resistance devices have attracted considerable attention due to their folded current-voltage characteristic, which presents multiple threshold voltage values. Because of this remarkable property, studies associated with the negative differential resistance devices have been explored for realizing multi-valued logic applications. Here we demonstrate a negative differential resistance device based on a phosphorene/rhenium disulfide (BP/ReS2) heterojunction that is formed by type-III broken-gap band alignment, showing high peak-to-valley current ratio values of 4.2 and 6.9 at room temperature and 180 K, respectively. Also, the carrier transport mechanism of the BP/ReS2 negative differential resistance device is investigated in detail by analysing the tunnelling and diffusion currents at various temperatures with the proposed analytic negative differential resistance device model. Finally, we demonstrate a ternary inverter as a multi-valued logic application. This study of a two-dimensional material heterojunction is a step forward toward future multi-valued logic device research.
NASA Astrophysics Data System (ADS)
Peng, Juan-juan; Wang, Jian-qiang; Yang, Wu-E.
2017-01-01
In this paper, multi-criteria decision-making (MCDM) problems based on the qualitative flexible multiple criteria method (QUALIFLEX), in which the criteria values are expressed by multi-valued neutrosophic information, are investigated. First, multi-valued neutrosophic sets (MVNSs), which allow the truth-membership function, indeterminacy-membership function and falsity-membership function to have a set of crisp values between zero and one, are introduced. Then the likelihood of multi-valued neutrosophic number (MVNN) preference relations is defined and the corresponding properties are also discussed. Finally, an extended QUALIFLEX approach based on likelihood is explored to solve MCDM problems where the assessments of alternatives are in the form of MVNNs; furthermore, an example is provided to illustrate the application of the proposed method, together with a comparative analysis.
NASA Astrophysics Data System (ADS)
Ghosh, Amal K.; Basuray, Amitabha
2008-11-01
Memory devices in multi-valued logic are of great significance in modern research. This paper deals with the implementation of basic memory devices in multi-valued logic using Savart plate and spatial light modulator (SLM) based optoelectronic circuits. Photons are used here as the carrier to speed up the operations. Optical tree architecture (OTA) has also been utilized in the optical interconnection network. We exploit the advantages of Savart plates, SLMs and OTA and propose SLM-based high-speed JK, D-type and T-type flip-flops in a trinary system.
Blur identification by multilayer neural network based on multivalued neurons.
Aizenberg, Igor; Paliy, Dmitriy V; Zurada, Jacek M; Astola, Jaakko T
2008-05-01
A multilayer neural network based on multivalued neurons (MLMVN) is a neural network with a traditional feedforward architecture. At the same time, this network has a number of specific features. Its backpropagation learning algorithm is derivative-free. The functionality of MLMVN is superior to that of traditional feedforward neural networks and of a variety of kernel-based networks. Its higher flexibility and faster adaptation to the target mapping make it possible to model complex problems using simpler networks. In this paper, the MLMVN is used to identify both the type and the parameters of the point spread function, whose precise identification is of crucial importance for image deblurring. The simulation results show the high efficiency of the proposed approach. It is confirmed that the MLMVN is a powerful tool for solving classification problems, especially multiclass ones.
Dynamic clustering detection through multi-valued descriptors of dermoscopic images.
Cozza, Valentina; Guarracino, Maria Rosario; Maddalena, Lucia; Baroni, Adone
2011-09-10
This paper introduces a dynamic clustering methodology based on multi-valued descriptors of dermoscopic images. The main idea is to support medical diagnosis in deciding whether pigmented skin lesions belonging to an uncertain set are nearer to malignant melanoma or to benign nevi. Melanoma is the most deadly skin cancer, and early diagnosis is a current challenge for clinicians. Most data analysis algorithms for skin lesion discrimination focus on segmentation and on the extraction of features of categorical or numerical type. As an alternative approach, this paper introduces two new concepts: first, it considers multi-valued data, in which lesions are described not only by scalar variables but also by interval or histogram variables; second, it introduces a dynamic clustering method based on the Wasserstein distance to compare multi-valued data. The overall strategy of analysis can be summarized in the following steps: first, a segmentation of dermoscopic images allows us to identify a set of multi-valued descriptors; second, we performed a discriminant analysis on a set of images with an a priori classification, so that it is possible to detect which features discriminate between benign and malignant lesions; and third, we performed the proposed dynamic clustering method on the uncertain cases, which need to be associated with one of the two previously mentioned groups. Results based on clinical data show that the grading of specific descriptors associated with dermoscopic characteristics provides a novel way to characterize uncertain lesions that can help the dermatologist's diagnosis. Copyright © 2011 John Wiley & Sons, Ltd.
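As an aside on comparing histogram-valued descriptors by Wasserstein distance, here is a minimal sketch using SciPy's one-dimensional Wasserstein distance; the descriptor values and the two cluster prototypes below are invented for illustration and are not taken from the study.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
bins = np.linspace(0, 255, 16)                  # common bin centers for all histograms

# Hypothetical histogram descriptors (relative frequencies) for one uncertain
# lesion and for the prototypes of the benign and malignant groups.
uncertain = rng.dirichlet(np.ones(16))
benign    = rng.dirichlet(np.ones(16) * 5)
malignant = rng.dirichlet(np.ones(16) * 2)

def hist_distance(p, q):
    """1-D Wasserstein distance between two histograms defined on the same bins."""
    return wasserstein_distance(bins, bins, u_weights=p, v_weights=q)

d_benign = hist_distance(uncertain, benign)
d_malignant = hist_distance(uncertain, malignant)
print("nearer to", "benign" if d_benign < d_malignant else "malignant",
      round(d_benign, 2), round(d_malignant, 2))
```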
Minimally inconsistent reasoning in Semantic Web.
Zhang, Xiaowang
2017-01-01
Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, showing that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture minimally inconsistent reasoning. In fact, the tableaux algorithm is designed as a framework for multi-valued DL, allowing for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning.
Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš
2015-09-04
Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
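To make the notion of a multi-valued logical model concrete, the sketch below represents one in plain Python: discrete activity levels for entity pools and transitions given by logical rules. It is an illustrative in-memory analogue of the kind of model the qual package encodes, not the SBML syntax itself; the species names and rules are invented.

```python
# Hypothetical multi-valued logical model: each species has a maximum activity
# level, and a transition function maps the current state to target levels.
max_level = {"GeneA": 1, "GeneB": 2, "GeneC": 1}

def next_level(state):
    """Target levels, in the spirit of qual 'transitions' with function terms."""
    return {
        "GeneA": 1 if state["GeneC"] == 0 else 0,          # repressed by GeneC
        "GeneB": min(2, state["GeneA"] + state["GeneC"]),   # graded activation
        "GeneC": 1 if state["GeneB"] >= 2 else 0,           # threshold on GeneB
    }

state = {"GeneA": 1, "GeneB": 0, "GeneC": 0}
for step in range(4):                      # synchronous updating, for brevity
    print(step, state)
    state = next_level(state)
```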
NASA Astrophysics Data System (ADS)
Ghosh, Amal K.; Bhattacharya, Animesh; Raul, Moumita; Basuray, Amitabha
2012-07-01
The arithmetic logic unit (ALU) is the most important unit in any computing system. Optical computing is becoming popular day by day because of its ultrahigh processing speed and huge data handling capability. Obviously, for fast processing we need an optical TALU compatible with multivalued logic. In this regard we present a trinary arithmetic and logic unit (TALU) in the modified trinary number (MTN) system, which is suitable for optical computation and other applications in multivalued logic systems. Here the Savart plate and spatial light modulator (SLM) based optoelectronic circuits have been used to exploit the optical tree architecture (OTA) in an optical interconnection network.
Periodic activation function and a modified learning algorithm for the multivalued neuron.
Aizenberg, Igor
2010-12-01
In this paper, we consider a new periodic activation function for the multivalued neuron (MVN). The MVN is a neuron with complex-valued weights and inputs/output, which are located on the unit circle. Although the MVN outperforms many other neurons and MVN-based neural networks have shown their high potential, the MVN still has a limited capability of learning highly nonlinear functions. The periodic activation function introduced in this paper makes it possible to learn nonlinearly separable problems and non-threshold multiple-valued functions using a single multivalued neuron. We call this neuron a multivalued neuron with a periodic activation function (MVN-P). The MVN-P's functionality is much higher than that of the regular MVN. The MVN-P is more efficient in solving various classification problems. A learning algorithm based on the error-correction rule for the MVN-P is also presented. It is shown that a single MVN-P can easily learn and solve those benchmark classification problems that were considered unsolvable using a single neuron. It is also shown that a universal binary neuron, which can learn nonlinearly separable Boolean functions, and a regular MVN are particular cases of the MVN-P.
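For orientation, the sketch below shows the standard discrete MVN activation, which maps the complex weighted sum to one of the k-th roots of unity, together with one common way of writing a periodic variant over l*k sectors. It is a hedged illustration of the general idea, not the exact MVN-P rule of the paper.

```python
import cmath
import math

def mvn_activation(z, k=4):
    """Standard discrete MVN activation: map the complex weighted sum z to the
    k-th root of unity whose sector of the unit circle contains arg(z)."""
    sector = int(k * (cmath.phase(z) % (2 * math.pi)) / (2 * math.pi))
    return cmath.exp(2j * math.pi * sector / k)

def mvn_p_activation(z, k=4, l=2):
    """Periodic variant (sketch): the circle is split into l*k sectors and the
    k output values repeat periodically; one common way to write such a rule."""
    sector = int(l * k * (cmath.phase(z) % (2 * math.pi)) / (2 * math.pi))
    return cmath.exp(2j * math.pi * (sector % k) / k)

z = complex(0.3, -0.8)          # a hypothetical weighted sum of the inputs
print(mvn_activation(z), mvn_p_activation(z))
```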
Rahaman, Mijanur; Pang, Chin-Tzong; Ishtyak, Mohd; Ahmad, Rais
2017-01-01
In this article, we introduce a perturbed system of generalized mixed quasi-equilibrium-like problems involving multi-valued mappings in Hilbert spaces. To calculate the approximate solutions of the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems, firstly we develop a perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems, and then by using the celebrated Fan-KKM technique, we establish the existence and uniqueness of solutions of the perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems. By deploying an auxiliary principle technique and an existence result, we formulate an iterative algorithm for solving the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems. Lastly, we study the strong convergence analysis of the proposed iterative sequences under monotonicity and some mild conditions. These results are new and generalize some known results in this field.
NASA Astrophysics Data System (ADS)
Arestova, M. L.; Bykovskii, A. Yu
1995-10-01
An architecture is proposed for a specialised optoelectronic multivalued logic processor based on the Allen-Givone algebra. The processor is intended for multiparametric processing of data arriving from a large number of sensors or for tackling spectral analysis tasks. The processor architecture makes it possible to obtain an approximate general estimate of the state of an object being diagnosed on a p-level scale. Optoelectronic systems are proposed for MAXIMUM, MINIMUM, and LITERAL logic gates, based on optical-frequency encoding of logic levels. Corresponding logic gates form a complete set of logic functions in the Allen-Givone algebra.
The Teacher in a Multivalue Society.
ERIC Educational Resources Information Center
Shaver, James P.
Given the general recognition that what we do is influenced as much or more by our value commitments as by our factual knowledge, it is ironic that social studies, the area of the curriculum supposedly focused on citizenship education, has paid so little attention to values. There are many reasons for this, but one of them, the author believes, is…
Saddeek, Ali Mohamed
2017-01-01
Most mathematical models arising in stationary filtration processes as well as in the theory of soft shells can be described by single-valued or generalized multivalued pseudomonotone mixed variational inequalities with proper convex nondifferentiable functionals. Therefore, for finding the minimum norm solution of such inequalities, the current paper attempts to introduce a modified two-layer iteration via a boundary point approach and to prove its strong convergence. The results here improve and extend the corresponding recent results announced by Badriev, Zadvornov and Saddeek (Differ. Equ. 37:934-942, 2001).
NASA Astrophysics Data System (ADS)
Wang, Bei; Sugi, Takenao; Wang, Xingyu; Nakamura, Masatoshi
Data for human sleep studies may be affected by internal and external influences. The recorded sleep data contain complex and stochastic factors, which increase the difficulty of applying computerized sleep stage determination techniques in clinical practice. The aim of this study is to develop an automatic sleep stage determination system which is optimized for variable sleep data. The main methodology includes two modules: expert knowledge database construction and automatic sleep stage determination. Visual inspection by a qualified clinician is utilized to obtain the probability density functions of parameters during the learning process of expert knowledge database construction. Parameter selection is introduced in order to make the algorithm flexible. Automatic sleep stage determination is performed based on conditional probability. The results showed close agreement with the visual inspection by the clinician. The developed system can meet customized requirements in hospitals and institutions.
Quasilinear parabolic variational inequalities with multi-valued lower-order terms
NASA Astrophysics Data System (ADS)
Carl, Siegfried; Le, Vy K.
2014-10-01
In this paper, we provide an analytical framework for a multi-valued parabolic variational inequality in a cylindrical domain: the problem is to find a solution within a closed and convex constraint set, where A is a time-dependent quasilinear elliptic operator and the multi-valued lower-order term is assumed to be upper semicontinuous only, so that Clarke's generalized gradient is included as a special case. Thus, parabolic variational-hemivariational inequalities are special cases of the problem considered here. The extension of parabolic variational-hemivariational inequalities to the general class of multi-valued problems considered in this paper is not only of disciplinary interest, but is motivated by the need in applications. The main goals are as follows. First, we provide an existence theory for the above-stated problem under coercivity assumptions. Second, in the noncoercive case, we establish an appropriate sub-supersolution method that allows us to obtain existence, comparison, and enclosure results. Third, the order structure of the solution set enclosed by sub-supersolutions is revealed. In particular, it is shown that the solution set within the sector of sub-supersolutions is a directed set. As an application, a multi-valued parabolic obstacle problem is treated.
Fuzzy logic and causal reasoning with an 'n' of 1 for diagnosis and treatment of the stroke patient.
Helgason, Cathy M; Jobe, Thomas H
2004-03-01
The current scientific model for clinical decision-making is founded on binary or Aristotelian logic, classical set theory and probability-based statistics. Evidence-based medicine has been established as the basis for clinical recommendations. There is a problem with this scientific model when the physician must diagnose and treat the individual patient. The problem is a paradox: the scientific model of evidence-based medicine is based upon hypotheses aimed at the group, and therefore any conclusions can be extrapolated to the individual patient only to a degree. This extrapolation depends on the expertise of the physician. A scientific model based on multivalued fuzzy logic allows this expertise to be represented numerically and resolves the clinical paradox of evidence-based medicine.
Solution of the equations for one-dimensional, two-phase, immiscible flow by geometric methods
NASA Astrophysics Data System (ADS)
Boronin, Ivan; Shevlyakov, Andrey
2018-03-01
The Buckley-Leverett equations describe non-viscous, immiscible, two-phase filtration, which is often of interest in the modelling of oil production. For many parameters and initial conditions, the solutions of these equations exhibit non-smooth behaviour, namely discontinuities in the form of shock waves. In this paper we present a novel method for the solution of the Buckley-Leverett equations, which is based on the geometry of differential equations. This method is fast, accurate, stable, and describes non-smooth phenomena. The main idea of the method is that classical discontinuous solutions correspond to continuous surfaces in the space of jets - the so-called multi-valued solutions (Bocharov et al., Symmetries and conservation laws for differential equations of mathematical physics. American Mathematical Society, Providence, 1998). A mapping of multi-valued solutions from the jet space onto the plane of the independent variables is constructed. This mapping is not one-to-one, and its singular points form a curve on the plane of the independent variables, which is called the caustic. The real shock occurs at points close to the caustic and is determined by the Rankine-Hugoniot conditions.
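For context, a textbook statement of the Buckley-Leverett conservation law and of the Rankine-Hugoniot condition mentioned above (the notation is ours, not necessarily the paper's):

```latex
% Water saturation s(x,t) transported with a fractional-flow function f(s);
% M is the mobility ratio (one common illustrative choice of f).
\frac{\partial s}{\partial t} + \frac{\partial f(s)}{\partial x} = 0,
\qquad
f(s) = \frac{s^{2}}{s^{2} + M\,(1 - s)^{2}} .
% Rankine-Hugoniot condition fixing the shock speed \sigma between the
% states s^- and s^+ on either side of the discontinuity:
\sigma = \frac{f(s^{+}) - f(s^{-})}{s^{+} - s^{-}} .
```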
Monotone viable trajectories for functional differential inclusions
NASA Astrophysics Data System (ADS)
Haddad, Georges
This paper is a study of functional differential inclusions with memory, which represent the multivalued version of retarded functional differential equations. The main result gives a necessary and sufficient condition ensuring the existence of viable trajectories; that is, trajectories remaining in a given nonempty closed convex set defined by the constraints the system must satisfy to be viable. Some motivations for this paper can be found in control theory, where F(t, φ) = {f(t, φ, u) : u ∈ U} is the set of possible velocities of the system at time t, depending on the past history represented by the function φ and on a control u ranging over a set U of controls. Other motivations can be found in planning procedures in microeconomics and in biological evolution, where problems with memory do effectively appear in a multivalued version. All these models require viability constraints represented by a closed convex set.
Multiple positive solutions for a class of integral inclusions
NASA Astrophysics Data System (ADS)
Hong, Shihuang
2008-04-01
This paper deals with sufficient conditions for the existence of at least two positive solutions for a class of integral inclusions arising in traffic theory. To show our main results, we apply a norm-type expansion and compression fixed point theorem for multivalued maps due to Agarwal and O'Regan [A note on the existence of multiple fixed points for multivalued maps with applications, J. Differential Equations 160 (2000) 389-403].
Multi-valued logic gates based on ballistic transport in quantum point contacts.
Seo, M; Hong, C; Lee, S-Y; Choi, H K; Kim, N; Chung, Y; Umansky, V; Mahalu, D
2014-01-22
Multi-valued logic gates, which can handle quaternary numbers as inputs, are developed by exploiting the ballistic transport properties of quantum point contacts in series. The principle of a logic gate that finds the minimum of two quaternary number inputs is demonstrated. The device is scalable to allow multiple inputs, which makes it possible to find the minimum of multiple inputs in a single gate operation. Also, the principle of a half-adder for quaternary number inputs is demonstrated. First, an adder that adds up two quaternary numbers and outputs the sum of inputs is demonstrated. Second, a device to express the sum of the adder into two quaternary digits [Carry (first digit) and Sum (second digit)] is demonstrated. All the logic gates presented in this paper can in principle be extended to allow decimal number inputs with high quality QPCs.
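Functionally, the quaternary MIN gate and the half-adder described above compute the following. This is a plain software sketch of the logic (digits 0-3), not of the quantum-point-contact implementation.

```python
def q_min(a, b):
    """Quaternary MIN gate: output the smaller of two base-4 digits."""
    return min(a, b)

def q_half_adder(a, b):
    """Quaternary half-adder: Carry is the first digit and Sum the second
    digit of a+b written in base 4."""
    total = a + b
    return total // 4, total % 4   # (Carry, Sum)

for a in range(4):
    for b in range(4):
        print(a, b, "min:", q_min(a, b), "carry,sum:", q_half_adder(a, b))
```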
NASA Astrophysics Data System (ADS)
Rapoport, Diego L.
2011-01-01
In this transdisciplinary article which stems from philosophical considerations (that depart from phenomenology—after Merleau-Ponty, Heidegger and Rosen—and Hegelian dialectics), we develop a conception based on topological (the Moebius surface and the Klein bottle) and geometrical considerations (based on torsion and non-orientability of manifolds), and multivalued logics which we develop into a unified world conception that surmounts the Cartesian cut and Aristotelian logic. The role of torsion appears in a self-referential construction of space and time, which will be further related to the commutator of the True and False operators of matrix logic, still with a quantum superposed state related to a Moebius surface, and as the physical field at the basis of Spencer-Brown's primitive distinction in the protologic of the calculus of distinction. In this setting, paradox, self-reference, depth, time and space, higher-order non-dual logic, perception, spin and a time operator, the Klein bottle, hypernumbers due to Musès which include non-trivial square roots of ±1 and in particular non-trivial nilpotents, quantum field operators, the transformation of cognition to spin for two-state quantum systems, are found to be keenly interwoven in a world conception compatible with the philosophical approach taken for basis of this article. The Klein bottle is found not only to be the topological in-formation for self-reference and paradox whose logical counterpart in the calculus of indications are the paradoxical imaginary time waves, but also a classical-quantum transformer (Hadamard's gate in quantum computation) which is indispensable to be able to obtain a complete multivalued logical system, and still to generate the matrix extension of classical connective Boolean logic. We further find that the multivalued logic that stems from considering the paradoxical equation in the calculus of distinctions, and in particular, the imaginary solutions to this equation, generates the matrix logic which supersedes the classical logic of connectives and which has for particular subtheories fuzzy and quantum logics. Thus, from a primitive distinction in the vacuum plane and the axioms of the calculus of distinction, we can derive by incorporating paradox, the world conception succinctly described above.
Polyhedral sweeping processes with unbounded nonconvex-valued perturbation
NASA Astrophysics Data System (ADS)
Tolstonogov, A. A.
2017-12-01
A polyhedral sweeping process with a multivalued perturbation whose values are nonconvex unbounded sets is studied in a separable Hilbert space. Polyhedral sweeping processes do not satisfy the traditional assumptions used to prove existence theorems for convex sweeping processes. We consider the polyhedral sweeping process as an evolution inclusion with subdifferential operators depending on time. The widely used assumption of Lipschitz continuity for the multivalued perturbation term is replaced by a weaker notion of (ρ - H) Lipschitzness. The existence of solutions is proved for this sweeping process.
Methods for the computation of the multivalued Painlevé transcendents on their Riemann surfaces
NASA Astrophysics Data System (ADS)
Fasondini, Marco; Fornberg, Bengt; Weideman, J. A. C.
2017-09-01
We extend the numerical pole field solver (Fornberg and Weideman (2011) [12]) to enable the computation of the multivalued Painlevé transcendents, which are the solutions to the third, fifth and sixth Painlevé equations, on their Riemann surfaces. We display, for the first time, solutions to these equations on multiple Riemann sheets. We also provide numerical evidence for the existence of solutions to the sixth Painlevé equation that have pole-free sectors, known as tronquée solutions.
Picturing Data With Uncertainty
NASA Technical Reports Server (NTRS)
Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex
2004-01-01
NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells, such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice is presented vertically above the slice, forming a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly, since peaks represent the modes (or bumps) in the PDFs. We define roughness as the number of peaks in the distribution. Roughness is another useful summary statistic for multimodal distributions. The uncertainty of the multi-valued data can also be interpreted from the number of peaks and the widths of the peaks, as shown by the PDF walls.
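A minimal sketch of the per-cell density estimates that make up one PDF wall, using a Gaussian kernel density estimate and a simple peak count as the roughness measure. The data are synthetic, and the peak detector used here is our assumption rather than the authors' exact procedure.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
support = np.linspace(-4, 8, 200)

# Hypothetical multi-valued data: 50 possible outcomes at each of 10 grid
# cells along one row (slice) of the map.
row = [np.concatenate([rng.normal(0, 1, 25), rng.normal(c * 0.5, 1, 25)])
       for c in range(10)]

wall = []          # one smooth PDF per grid cell in the slice
roughness = []     # number of peaks (modes) in each PDF
for samples in row:
    pdf = gaussian_kde(samples)(support)
    wall.append(pdf)
    peaks, _ = find_peaks(pdf)
    roughness.append(len(peaks))

print(roughness)   # ~1 where the two components overlap, ~2 where they separate
```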
Testability analysis on a hydraulic system in a certain equipment based on simulation model
NASA Astrophysics Data System (ADS)
Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou
2018-03-01
To address the complicated structure of hydraulic systems and the shortage of fault statistics, a multi-valued testability analysis method based on a simulation model is proposed. Based on an AMESim simulation model, the method injects simulated faults and records the variation of test parameters, such as pressure and flow rate, at each test point compared with those under normal conditions. A multi-valued fault-test dependency matrix is thus established. Then the fault detection rate (FDR) and fault isolation rate (FIR) are calculated based on the dependency matrix. Finally, the testability and fault diagnosis capability of the system are analyzed and evaluated, reaching only 54% (FDR) and 23% (FIR). In order to improve the testability of the system, the number and positions of the test points are optimized. Results show that the proposed test placement scheme can address the difficulty, inefficiency and high cost of system maintenance.
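The FDR and FIR computation from a multi-valued fault-test dependency matrix can be sketched as follows, under one common definition: a fault is detectable if at least one test deviates from normal, and isolable if its symptom signature is unique. The matrix entries below are invented for illustration.

```python
import numpy as np

# Rows = simulated faults, columns = test points; entries are multi-valued
# symptoms (0 = no deviation from normal, 1 = low, 2 = high), purely illustrative.
D = np.array([
    [0, 2, 1, 0],
    [0, 2, 1, 0],   # same signature as fault 0 -> detectable but not isolable
    [1, 0, 0, 2],
    [0, 0, 0, 0],   # no test reacts -> not detectable
])

detectable = np.any(D != 0, axis=1)
fdr = detectable.mean()

signatures = [tuple(row) for row in D]
isolable = [detectable[i] and signatures.count(sig) == 1
            for i, sig in enumerate(signatures)]
fir = np.mean(isolable)

print(f"FDR = {fdr:.0%}, FIR = {fir:.0%}")
```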
NASA Astrophysics Data System (ADS)
Kadhem, Hasan; Amagasa, Toshiyuki; Kitagawa, Hiroyuki
Encryption can provide strong security for sensitive data against inside and outside attacks. This is especially true in the “Database as a Service” model, where confidentiality and privacy are important issues for the client. In fact, existing encryption approaches are vulnerable to statistical attack because each value is encrypted to another fixed value. This paper presents a novel database encryption scheme called MV-OPES (Multivalued Order-Preserving Encryption Scheme), which allows privacy-preserving queries over encrypted databases with an improved security level. Our idea is to encrypt a value to multiple different values to prevent statistical attacks. At the same time, MV-OPES preserves the order of the integer values to allow comparison operations to be applied directly on encrypted data. Using a calculated distance (range), we propose a novel method that allows a join query between relations based on inequality over encrypted values. We also present techniques to offload query execution load to the database server as much as possible, thereby making better use of server resources in a database outsourcing environment. Our scheme can easily be integrated with current database systems as it is designed to work with existing indexing structures. It is robust against statistical attack and the estimation of true values. MV-OPES experiments show that security for sensitive data can be achieved with reasonable overhead, establishing the practicability of the scheme.
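The core idea is that each plaintext integer is encrypted to a value drawn at random from its own disjoint, order-preserving ciphertext interval, so equal plaintexts generally yield different ciphertexts while comparisons still work. The toy sketch below illustrates this; it is not the actual MV-OPES construction, and the fixed interval width and absence of keying are simplifications.

```python
import random

random.seed(3)
WIDTH = 1000   # size of each plaintext's ciphertext interval (toy parameter)

def encrypt(v):
    """Map plaintext v to a random point in the interval [v*WIDTH, (v+1)*WIDTH)."""
    return v * WIDTH + random.randrange(WIDTH)

def decrypt(c):
    return c // WIDTH

values = [5, 5, 7, 3]
ciphers = [encrypt(v) for v in values]
print(ciphers)                                   # equal plaintexts differ
assert [decrypt(c) for c in ciphers] == values
# Order is preserved: comparing ciphertexts compares the plaintexts.
assert (ciphers[2] > ciphers[0]) == (values[2] > values[0])
```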
Van der Waals model for phase transitions in thermoresponsive surface films.
McCoy, John D; Curro, John G
2009-05-21
Phase transitions in polymeric surface films are studied with a simple model based on the van der Waals equation of state. Each chain is modeled by a single bead attached to the surface by an entropic-Hooke's law spring. The surface coverage is controlled by adjusting the chemical potential, and the equilibrium density profile is calculated with density functional theory. The interesting feature of this model is the multivalued nature of the density profile seen at low temperature. This van der Waals loop behavior is resolved with a Maxwell construction between a high-density phase near the wall and a low-density phase in a "vertical" phase transition. Signatures of the phase transition in experimentally measurable quantities are then found. Numerical calculations are presented for isotherms of surface pressure, for the Poisson ratio, and for the swelling ratio.
Ben Abdallah, Emna; Folschette, Maxime; Roux, Olivier; Magnin, Morgan
2017-01-01
This paper addresses the problem of finding attractors in biological regulatory networks. We focus here on non-deterministic synchronous and asynchronous multi-valued networks, modeled using automata networks (AN). AN is a general and well-suited formalism to study complex interactions between different components (genes, proteins, ...). An attractor is a minimal trap domain, that is, a part of the state-transition graph that cannot be escaped. Such structures are terminal components of the dynamics and take the form of steady states (singletons) or complex compositions of cycles (non-singletons). Studying the effect of a disease or a mutation on an organism requires finding the attractors of the model to understand the long-term behaviors. We present a computational logical method based on answer set programming (ASP) to identify all attractors. Performed without any network reduction, the method can be applied to any dynamical semantics. In this paper, we present the two most widespread non-deterministic semantics: the asynchronous and the synchronous updating modes. The logical approach goes through a complete enumeration of the states of the network in order to find the attractors without the need to construct the whole state-transition graph. We perform extensive computational experiments which show good performance and fit the expected theoretical results in the literature. The originality of our approach lies in the exhaustive enumeration of all possible (sets of) states verifying the properties of an attractor thanks to the use of ASP. Our method is applied to non-deterministic semantics in two different schemes (asynchronous and synchronous). The merits of our methods are illustrated by applying them to biological examples of various sizes and comparing the results with some existing approaches. It turns out that our approach succeeds in exhaustively enumerating, on a desktop computer, all attractors up to a given size (20 states) in a large model (100 components). This size is only limited by memory and computation time.
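Leaving the ASP machinery aside, the defining property of an attractor, a minimal trap domain (a terminal strongly connected component of the state-transition graph), can be sketched by brute force on a tiny Boolean network with asynchronous updates. The network and the explicit enumeration below are illustrative assumptions, not the paper's method.

```python
import itertools
import networkx as nx

# Toy Boolean network: next value of each component as a function of the state.
rules = {
    0: lambda s: s[1],              # x0 <- x1
    1: lambda s: s[0],              # x1 <- x0
    2: lambda s: int(not s[2]),     # x2 <- not x2 (oscillates)
}

G = nx.DiGraph()
for state in itertools.product([0, 1], repeat=3):
    succs = []
    for i, f in rules.items():      # asynchronous: update one component at a time
        nxt = list(state)
        nxt[i] = f(state)
        if tuple(nxt) != state:
            succs.append(tuple(nxt))
    for target in succs or [state]:  # fixed points keep a self-loop
        G.add_edge(state, target)

# Attractors = attracting (terminal) strongly connected components.
for attractor in nx.attracting_components(G):
    print(sorted(attractor))
```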
Dynamics and Stability of Acoustic Wavefronts in the Ocean
2011-09-01
developed to solve the eikonal equation and calculate wavefront and ray trajectory displacements, which are required to be small over a correlation length...with a direct modeling of acoustic wavefronts in the ocean through numerical solution of the eikonal equation lies in the eikonal (and acoustic...travel time) being a multi-valued function of position. A number of computational approaches to solve the eikonal equation without ray tracing have been
Fresch, Barbara; Bocquel, Juanita; Hiluf, Dawit; Rogge, Sven; Levine, Raphael D; Remacle, Françoise
2017-07-05
To realize low-power, compact logic circuits, one can explore parallel operation on single nanoscale devices. An added incentive is to use multivalued (as distinct from Boolean) logic. Here, we theoretically demonstrate that the computation of all the possible outputs of a multivariate, multivalued logic function can be implemented in parallel by electrical addressing of a molecule made up of three interacting dopant atoms embedded in Si. The electronic states of the dopant molecule are addressed by pulsing a gate voltage. By simulating the time evolution of the non-stationary electronic density built by the gate voltage, we show that one can implement a molecular decision tree that provides in parallel all the outputs for all the inputs of the multivariate, multivalued logic function. The outputs are encoded in the populations and in the bond orders of the dopant molecule, which can be measured using an STM tip. We show that the implementation of the molecular logic tree is equivalent to a spectral function decomposition. The function that is evaluated can be field-programmed by changing the time profile of the pulsed gate voltage. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Wu, Jun; Li, Chengbing; Huo, Yueying
2014-01-01
Safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. To address the high accident rate and large potential harm in dangerous goods logistics transportation, this paper casts the group decision making problem, based on the idea of integration and coordination, as a multiagent multiobjective group decision making problem; a secondary decision model is established and applied to the safety assessment of dangerous goods transport enterprises. First of all, we use a dynamic multivalued background and entropy theory to build the first-level multiobjective decision model. Secondly, experts are weighted according to the principle of cluster analysis and, combined with relative entropy theory, a secondary aggregation optimization model based on relative entropy in group decision making is established, and its solution is discussed. Then, after investigation and analysis, we establish the safety evaluation index system for dangerous goods transport enterprises. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of this model for the recognition of dangerous goods transport enterprises, providing a vital decision making basis for recognizing dangerous goods transport enterprises. PMID:25477954
Therapeutic target discovery using Boolean network attractors: improvements of kali
Guziolowski, Carito
2018-01-01
In a previous article, an algorithm for identifying therapeutic targets in Boolean networks modelling pathological mechanisms was introduced. In the present article, the improvements made on this algorithm, named kali, are described. These improvements are (i) the possibility to work on asynchronous Boolean networks, (ii) a finer assessment of therapeutic targets and (iii) the possibility to use multivalued logic. kali assumes that the attractors of a dynamical system, such as a Boolean network, are associated with the phenotypes of the modelled biological system. Given a logic-based model of pathological mechanisms, kali searches for therapeutic targets able to reduce the reachability of the attractors associated with pathological phenotypes, thus reducing their likeliness. kali is illustrated on an example network and used on a biological case study. The case study is a published logic-based model of bladder tumorigenesis from which kali returns consistent results. However, like any computational tool, kali can predict but cannot replace human expertise: it is a supporting tool for coping with the complexity of biological systems in the field of drug discovery. PMID:29515890
Medical image processing using neural networks based on multivalued and universal binary neurons
NASA Astrophysics Data System (ADS)
Aizenberg, Igor N.; Aizenberg, Naum N.; Gotko, Eugen S.; Sochka, Vladimir A.
1998-06-01
Cellular neural networks (CNNs) have become a very good means for solving different kinds of image processing problems. CNNs based on multi-valued neurons (CNN-MVN) and CNNs based on universal binary neurons (CNN-UBN) are specific kinds of CNN. MVN and UBN are neurons with complex-valued weights and complex internal arithmetic. Their main feature is the possibility of implementing an arbitrary mapping between inputs and output described by the MVN, and an arbitrary (not only threshold) Boolean function by the UBN. A great advantage of CNNs is the possibility of implementing any linear and many nonlinear filters in the spatial domain. Together with noise removal, CNNs make it possible to implement filters that amplify high and medium frequencies. These filters are well suited to the enhancement problem and to the extraction of details against a complex background. Thus, CNNs make it possible to organize the entire processing chain, from filtering to the extraction of the important details. The organization of this process for medical image processing is considered in the paper. Major attention is concentrated on the processing of X-ray and ultrasound images corresponding to different oncological (or close to oncological) pathologies. Additionally, we consider a new neural network structure for the differential diagnosis of breast cancer.
Equation of state for shock compression of distended solids
NASA Astrophysics Data System (ADS)
Grady, Dennis; Fenton, Gregg; Vogler, Tracy
2014-05-01
Shock Hugoniot data for full-density and porous compounds of boron carbide, silicon dioxide, tantalum pentoxide, uranium dioxide and playa alluvium are investigated for the purpose of equation-of-state representation of intense shock compression. Complications of the multivalued Hugoniot behavior characteristic of highly distended solids are addressed through the application of enthalpy-based equations of state of the form originally proposed by Rice and Walsh in the late 1950's. The additive measures of cold and thermal pressure intrinsic to the Mie-Gruneisen EOS framework are replaced by isobaric additive functions of the cold and thermal specific volume components in the enthalpy-based formulation. Additionally, experimental evidence reveals enhancement of shock-induced phase transformation on the Hugoniot with increasing levels of initial distension for silicon dioxide, uranium dioxide and possibly boron carbide. Methods for addressing this experimentally observed feature of the shock compression are incorporated into the EOS model.
Equation of State for Shock Compression of High Distension Solids
NASA Astrophysics Data System (ADS)
Grady, Dennis
2013-06-01
Shock Hugoniot data for full-density and porous compounds of boron carbide, silicon dioxide, tantalum pentoxide, uranium dioxide and playa alluvium are investigated for the purpose of equation-of-state representation of intense shock compression. Complications of multivalued Hugoniot behavior characteristic of highly distended solids are addressed through the application of enthalpy-based equations of state of the form originally proposed by Rice and Walsh in the late 1950's. Additivity of cold and thermal pressure intrinsic to the Mie-Gruneisen EOS framework is replaced by isobaric additive functions of the cold and thermal specific volume components in the enthalpy-based formulation. Additionally, experimental evidence supports acceleration of shock-induced phase transformation on the Hugoniot with increasing levels of initial distention for silicon dioxide, uranium dioxide and possibly boron carbide. Methods for addressing this experimentally observed facet of the shock compression are introduced into the EOS model.
High-speed tracking control of piezoelectric actuators using an ellipse-based hysteresis model.
Gu, Guoying; Zhu, Limin
2010-08-01
In this paper, an ellipse-based mathematical model is developed to characterize the rate-dependent hysteresis in piezoelectric actuators. Based on the proposed model, an expanded input space is constructed to describe the multivalued hysteresis function H[u](t) by a multiple-input single-output (MISO) mapping Γ: R² → R. Subsequently, the inverse MISO mapping Γ⁻¹(H[u](t), Ḣ[u](t); u(t)) is proposed for real-time hysteresis compensation. In the controller design, a hybrid control strategy combining a model-based feedforward controller and a proportional-integral-differential (PID) feedback loop is used for high-accuracy and high-speed tracking control of piezoelectric actuators. The real-time feedforward controller is developed to cancel the rate-dependent hysteresis based on the inverse hysteresis model, while the PID controller is used to compensate for creep, modeling errors, and parameter uncertainties. Finally, experiments with and without hysteresis compensation are conducted and the experimental results are compared. The experimental results show that the hysteresis compensation in the feedforward path can reduce the hysteresis-caused error by up to 88% and that the tracking performance of the hybrid controller is greatly improved in high-speed tracking control applications, e.g., the root-mean-square tracking error is reduced to only 0.34% of the displacement range at an input frequency of 100 Hz.
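The hybrid control structure described above, an inverse-model feedforward term plus a PID loop on the residual error, can be sketched generically as below. The inverse hysteresis model is replaced by a placeholder static gain, and the toy plant and gains are hypothetical, not the authors' identified parameters.

```python
def make_pid(kp, ki, kd, dt):
    """Discrete PID controller kept as a closure over its internal state."""
    state = {"integral": 0.0, "prev_err": 0.0}
    def pid(err):
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return pid

def inverse_hysteresis(y_desired):
    """Placeholder for the inverse MISO hysteresis model (here a static gain)."""
    return 0.1 * y_desired

def plant(u, y_prev):
    """Toy first-order actuator dynamics standing in for the piezo stage."""
    return 0.9 * y_prev + 1.0 * u

dt = 1e-3
pid = make_pid(kp=2.0, ki=50.0, kd=0.0, dt=dt)
y = 0.0
for step in range(5):
    y_ref = 1.0                                      # desired displacement (a.u.)
    u = inverse_hysteresis(y_ref) + pid(y_ref - y)   # feedforward + feedback
    y = plant(u, y)
    print(step, round(y, 4))
```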
Efficient fault diagnosis of helicopter gearboxes
NASA Technical Reports Server (NTRS)
Chin, H.; Danai, K.; Lewicki, D. G.
1993-01-01
Application of a diagnostic system to a helicopter gearbox is presented. The diagnostic system is a nonparametric pattern classifier that uses a multi-valued influence matrix (MVIM) as its diagnostic model and benefits from a fast learning algorithm that enables it to estimate its diagnostic model from a small number of measurement-fault data. To test this diagnostic system, vibration measurements were collected from a helicopter gearbox test stand during accelerated fatigue tests and at various fault instances. The diagnostic results indicate that the MVIM system can accurately detect and diagnose various gearbox faults so long as they are included in training.
Approximation Of Multi-Valued Inverse Functions Using Clustering And Sugeno Fuzzy Inference
NASA Technical Reports Server (NTRS)
Walden, Maria A.; Bikdash, Marwan; Homaifar, Abdollah
1998-01-01
Finding the inverse of a continuous function can be challenging and computationally expensive when the inverse function is multi-valued. Difficulties may be compounded when the function itself is difficult to evaluate. We show that we can use fuzzy-logic approximators such as Sugeno inference systems to compute the inverse on-line. To do so, a fuzzy clustering algorithm can be used in conjunction with a discriminating function to split the function data into branches for the different values of the forward function. These data sets are then fed into a recursive least-squares learning algorithm that finds the proper coefficients of the Sugeno approximators; each Sugeno approximator finds one value of the inverse function. Discussions about the accuracy of the approximation will be included.
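A minimal sketch of the branch-splitting idea: samples of the forward function are clustered into branches, and each branch gets its own local approximator of the inverse. Here k-means and a polynomial least-squares fit stand in for the discriminating function and the Sugeno systems of the paper; everything below is illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

# Forward function y = x**2 on [-2, 2]; its inverse x(y) is two-valued.
x = np.linspace(-2, 2, 400)
y = x ** 2

# Cluster the (x, y) samples into two branches (roughly x < 0 and x > 0).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    np.column_stack([x, y]))

# Fit one inverse approximator per branch: x as a polynomial in y.
branches = [np.polyfit(y[labels == k], x[labels == k], deg=5) for k in (0, 1)]

y_query = 1.5
estimates = sorted(np.polyval(c, y_query) for c in branches)
print(estimates)   # approximately [-sqrt(1.5), +sqrt(1.5)]
```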
NASA Astrophysics Data System (ADS)
Aizenberg, Evgeni; Bigio, Irving J.; Rodriguez-Diaz, Eladio
2012-03-01
The Fourier descriptors paradigm is a well-established approach for affine-invariant characterization of shape contours. In the work presented here, we extend this method to images, and obtain a 2D Fourier representation that is invariant to image rotation. The proposed technique retains phase uniqueness, and therefore structural image information is not lost. Rotation-invariant phase coefficients were used to train a single multi-valued neuron (MVN) to recognize satellite and human face images rotated by a wide range of angles. Experiments yielded 100% and 96.43% classification rate for each data set, respectively. Recognition performance was additionally evaluated under effects of lossy JPEG compression and additive Gaussian noise. Preliminary results show that the derived rotation-invariant features combined with the MVN provide a promising scheme for efficient recognition of rotated images.
High-performance gas sensors with temperature measurement
Zhang, Yong; Li, Shengtao; Zhang, Jingyuan; Pan, Zhigang; Min, Daomin; Li, Xin; Song, Xiaoping; Liu, Junhua
2013-01-01
There are a number of gas ionization sensors using carbon nanotubes as cathode or anode. Unfortunately, their applications are greatly limited by their multi-valued sensitivity, with one output value corresponding to several measured concentration values. Here we describe a triple-electrode structure featuring two electric fields with opposite directions, which enables us to overcome the multi-valued sensitivity problem at 1 atm over a wide range of gas concentrations. We used a carbon nanotube array as the first electrode, and the two electric fields between the upper and the lower interelectrode gaps were designed to extract positive ions generated in the upper gap, hence significantly reducing positive ion bombardment on the nanotube electrode, which allowed us to maintain a high electric field near the nanotube tips, leading to a single-valued sensitivity and a long nanotube life. We have demonstrated detection of various gases with simultaneous temperature monitoring, and a potential for applications. PMID:23405281
Defect-phase-dynamics approach to statistical domain-growth problem of clock models
NASA Technical Reports Server (NTRS)
Kawasaki, K.
1985-01-01
The growth of statistical domains in quenched Ising-like p-state clock models with p = 3 or more is investigated theoretically, reformulating the analysis of Ohta et al. (1982) in terms of a phase variable and studying the dynamics of defects introduced into the phase field when the phase variable becomes multivalued. The resulting defect/phase domain-growth equation is applied to the interpretation of Monte Carlo simulations in two dimensions (Kaski and Gunton, 1983; Grest and Srolovitz, 1984), and problems encountered in the analysis of related Potts models are discussed. In the two-dimensional case, the problem is essentially that of a purely dissipative Coulomb gas, with a √t growth law complicated by vertex-pinning effects at small t.
77 FR 58828 - Northern Indiana Public Service Company; Notice of Petition for Declaratory Order
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-24
... kV transmission line from the Reynolds substation to the Greentown substation and (2) substation upgrades at Reynolds substation, including a 765 kV/345kV transformer, a Multi-Value Project approved under...
On the Dynamical Regimes of Pattern-Accelerated Electroconvection.
Davidson, Scott M; Wessling, Matthias; Mani, Ali
2016-03-03
Recent research has established that electroconvection can enhance ion transport at polarized surfaces such as membranes and electrodes where it would otherwise be limited by diffusion. The onset of such overlimiting transport can be influenced by the surface topology of the ion selective membranes as well as inhomogeneities in their electrochemical properties. However, there is little knowledge regarding the mechanisms through which these surface variations promote transport. We use high-resolution direct numerical simulations to develop a comprehensive analysis of electroconvective flows generated by geometric patterns of impermeable stripes and investigate their potential to regularize electrokinetic instabilities. Counterintuitively, we find that reducing the permeable area of an ion exchange membrane, with appropriate patterning, increases the overall ion transport rate by up to 80%. In addition, we present analysis of nonpatterned membranes, and find a novel regime of electroconvection where a multivalued current is possible due to the coexistence of multiple convective states.
Two-layer symbolic representation for stochastic models with phase-type distributed events
NASA Astrophysics Data System (ADS)
Longo, Francesco; Scarpa, Marco
2015-07-01
Among the techniques that have been proposed for the analysis of non-Markovian models, the state space expansion approach shows great flexibility in terms of modelling capabilities. Its principal drawback is the explosion of the state space. This paper proposes a two-layer symbolic method for efficiently storing the expanded reachability graph of a non-Markovian model in the case in which continuous phase-type distributions are associated with the firing times of system events, and different memory policies are considered. At the lower layer, the reachability graph is symbolically represented in the form of a set of Kronecker matrices, while, at the higher layer, all the information needed to correctly manage event memory is stored in a multi-terminal multi-valued decision diagram. This information is collected by applying a symbolic algorithm based on two theorems. The efficiency of the proposed approach, in terms of memory occupation and execution time, is shown by applying it to a set of non-Markovian stochastic Petri nets and comparing it with a classical explicit expansion algorithm. Moreover, a comparison with a classical symbolic approach is performed whenever possible.
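As a rough illustration of the Kronecker-structured lower layer described above (an independent sketch, not the authors' implementation), the global transition relation of a composed model can be kept as small per-component matrices and expanded only on demand; the component matrices below are made up.

    # Illustrative sketch: storing a global transition relation as a
    # Kronecker product of small per-component matrices (numpy).
    import numpy as np

    # Local next-state matrices of two components for one synchronizing event;
    # entry [i, j] = 1 means the component can move from local state i to j.
    W1 = np.array([[0, 1],
                   [1, 0]])          # component 1: 2 local states
    W2 = np.array([[0, 1, 0],
                   [0, 0, 1],
                   [1, 0, 0]])       # component 2: 3 local states

    # The global 6x6 transition matrix is never stored explicitly; it can be
    # recovered on demand as the Kronecker product of the local matrices.
    W_global = np.kron(W1, W2)
    print(W_global.shape)            # (6, 6), represented by 4 + 9 stored entries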
Full thermomechanical coupling in modelling of micropolar thermoelasticity
NASA Astrophysics Data System (ADS)
Murashkin, E. V.; Radayev, Y. N.
2018-04-01
The present paper is devoted to plane harmonic waves of displacements and microrotations propagating in fully coupled thermoelastic continua. The analysis is carried out in the framework of the linear conventional thermoelastic micropolar continuum model. The reduced energy balance equation and the special form of the Helmholtz free energy are discussed. The constitutive constants providing full coupling of the equations of motion and heat conduction are considered. The dispersion equation is derived and analysed in the form of a product of bi-cubic and bi-quadratic polynomials. The equations are analyzed with the computer algebra system Mathematica. Algebraic forms expressed by complex multivalued square and cubic radicals are obtained for the wavenumbers of transverse and longitudinal waves. The exact forms of the wavenumbers of plane harmonic coupled thermoelastic waves are computed.
Accomplishment Summary 1968-1969. Biological Computer Laboratory.
ERIC Educational Resources Information Center
Von Foerster, Heinz; And Others
This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…
Fault detection of helicopter gearboxes using the multi-valued influence matrix method
NASA Technical Reports Server (NTRS)
Chin, Hsinyung; Danai, Kourosh; Lewicki, David G.
1993-01-01
In this paper we investigate the effectiveness of a pattern-classifying fault detection system that is designed to cope with the variability of fault signatures inherent in helicopter gearboxes. For detection, the measurements are monitored on-line and flagged upon the detection of abnormalities, so that they can be attributed to a faulty or normal case. As such, the detection system is composed of two components: a quantization matrix to flag the measurements, and a multi-valued influence matrix (MVIM) that represents the behavior of measurements during normal operation and at fault instances. Both the quantization matrix and the influence matrix are tuned during a training session so as to minimize the error in detection. To demonstrate the effectiveness of this detection system, it was applied to vibration measurements collected from a helicopter gearbox during normal operation and at various fault instances. The results indicate that the MVIM method provides excellent results when the full range of fault effects on the measurements is included in the training set.
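For intuition only, the two-stage idea of flagging quantized measurements and matching them against a multi-valued influence matrix can be sketched as below; the thresholds, matrix entries and scoring rule are hypothetical and are not the trained MVIM of the paper.

    # Schematic sketch of quantization followed by matching against a
    # multi-valued influence matrix (toy data, not the paper's method).
    import numpy as np

    def quantize(measurements, thresholds):
        """Flag each measurement as 0 (normal), 1 (mild) or 2 (severe)."""
        return np.digitize(measurements, thresholds)

    # Hypothetical influence matrix: rows = measurements, columns = cases,
    # entries = expected flag level of each measurement under each case.
    influence = np.array([[2, 0, 1],
                          [0, 2, 1],
                          [1, 1, 0]])
    cases = ["gear fault", "bearing fault", "normal"]

    def classify(measurements, thresholds=(1.0, 2.0)):
        flags = quantize(np.asarray(measurements), np.asarray(thresholds))
        # Score each case by how closely its column matches the observed flags.
        scores = -np.abs(influence - flags[:, None]).sum(axis=0)
        return cases[int(np.argmax(scores))]

    print(classify([2.5, 0.3, 1.2]))   # -> "gear fault" for this made-up data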
Micro and MACRO Fractals Generated by Multi-Valued Dynamical Systems
NASA Astrophysics Data System (ADS)
Banakh, T.; Novosad, N.
2014-08-01
Given a multi-valued function Φ : X ⊸ X on a topological space X, we study the properties of its fixed fractal ✠Φ, which is defined as the closure of the orbit Φ^ω(*Φ) = ⋃_{n∈ω} Φ^n(*Φ) of the set *Φ = {x ∈ X : x ∈ Φ(x)} of fixed points of Φ. Special attention is paid to the duality between micro-fractals and macro-fractals, which are the fixed fractals ✠Φ and ✠Φ^{-1} for a contracting compact-valued function Φ : X ⊸ X on a complete metric space X. With the help of algorithms (described in this paper) we generate various images of macro-fractals which are dual to some well-known micro-fractals like the fractal cross, the Sierpiński triangle, the Sierpiński carpet, the Koch curve, or the fractal snowflakes. The obtained images show that macro-fractals have a large-scale fractal structure, which becomes clearly visible after a suitable zooming.
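A familiar concrete instance of such a contracting compact-valued map is the three-map Sierpiński system, Φ(x) = {f1(x), f2(x), f3(x)} with affine contractions toward the triangle vertices; the random-iteration sketch below merely samples the orbit that accumulates on the micro-fractal and is not the macro-fractal duality algorithm of the paper.

    # Chaos-game sampling of the fixed fractal of a multi-valued contraction
    # whose selections contract halfway toward the vertices of a triangle.
    import random

    vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]

    def phi_branch(x, v):
        """One selection of the multi-valued map Phi: contract halfway toward v."""
        return ((x[0] + v[0]) / 2.0, (x[1] + v[1]) / 2.0)

    point = (0.2, 0.2)
    orbit = []
    for _ in range(10000):
        point = phi_branch(point, random.choice(vertices))
        orbit.append(point)
    # 'orbit' now densely samples the Sierpinski-triangle micro-fractal.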
2015-01-01
Implementing parallel and multivalued logic operations at the molecular scale has the potential to improve the miniaturization and efficiency of a new generation of nanoscale computing devices. Two-dimensional photon-echo spectroscopy is capable of resolving dynamical pathways on electronic and vibrational molecular states. We experimentally demonstrate the implementation of molecular decision trees, logic operations where all possible values of inputs are processed in parallel and the outputs are read simultaneously, by probing the laser-induced dynamics of populations and coherences in a rhodamine dye mounted on a short DNA duplex. The inputs are provided by the bilinear interactions between the molecule and the laser pulses, and the output values are read from the two-dimensional molecular response at specific frequencies. Our results highlight how ultrafast dynamics between multiple molecular states induced by light–matter interactions can be exploited for performing complex logic operations in parallel, operations that are faster than electrical switching. PMID:25984269
NASA Astrophysics Data System (ADS)
Zeng, Shengda; Migórski, Stanisław
2018-03-01
In this paper a class of elliptic hemivariational inequalities involving the time-fractional order integral operator is investigated. Exploiting the Rothe method and using the surjectivity of multivalued pseudomonotone operators, a result on the existence of a solution to the problem is established. Then, this abstract result is applied to provide a theorem on the weak solvability of a fractional viscoelastic contact problem. The process is quasistatic and the constitutive relation is modeled with the fractional Kelvin-Voigt law. The friction and contact conditions are described by the Clarke generalized gradient of nonconvex and nonsmooth functionals. The variational formulation of this problem leads to a fractional hemivariational inequality.
A class of fractional differential hemivariational inequalities with application to contact problem
NASA Astrophysics Data System (ADS)
Zeng, Shengda; Liu, Zhenhai; Migorski, Stanislaw
2018-04-01
In this paper, we study a class of generalized differential hemivariational inequalities of parabolic type involving the time-fractional order derivative operator in Banach spaces. We use the Rothe method combined with the surjectivity of multivalued pseudomonotone operators and properties of the Clarke generalized gradient to establish the existence of a solution to the abstract inequality. As an illustrative application, a frictional quasistatic contact problem for viscoelastic materials with adhesion is investigated, in which the friction and contact conditions are described by the Clarke generalized gradient of nonconvex and nonsmooth functionals, and the constitutive relation is modeled by the fractional Kelvin-Voigt law.
Automated Database Schema Design Using Mined Data Dependencies.
ERIC Educational Resources Information Center
Wong, S. K. M.; Butz, C. J.; Xiang, Y.
1998-01-01
Describes a bottom-up procedure for discovering multivalued dependencies in observed data without knowing a priori the relationships among the attributes. The proposed algorithm is an application of a technique designed for learning conditional independencies in probabilistic reasoning; a prototype system for automated database schema design has…
Subglacial hydrology and the formation of ice streams
Kyrke-Smith, T. M; Katz, R. F; Fowler, A. C
2014-01-01
Antarctic ice streams are associated with pressurized subglacial meltwater but the role this water plays in the dynamics of the streams is not known. To address this, we present a model of subglacial water flow below ice sheets, and particularly below ice streams. The base-level flow is fed by subglacial melting and is presumed to take the form of a rough-bedded film, in which the ice is supported by larger clasts, but there is a millimetric water film which submerges the smaller particles. A model for the film is given by two coupled partial differential equations, representing mass conservation of water and ice closure. We assume that there is no sediment transport and solve for water film depth and effective pressure. This is coupled to a vertically integrated, higher order model for ice-sheet dynamics. If there is a sufficiently small amount of meltwater produced (e.g. if ice flux is low), the distributed film and ice sheet are stable, whereas for larger amounts of melt the ice–water system can become unstable, and ice streams form spontaneously as a consequence. We show that this can be explained in terms of a multi-valued sliding law, which arises from a simplified, one-dimensional analysis of the coupled model. PMID:24399921
Phase portraits analysis of a barothropic system: The initial value problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuetche, Victor Kamgang, E-mail: vkuetche@yahoo.fr; Department of Physics, Faculty of Science, University of Yaounde I, P.O. Box 812, Yaounde; The Abdus Salam International Center for Theoretical Physics, Strada Costiera 11, 34014 Trieste
2014-05-15
In this paper, we investigate the phase portrait features of a barothropic relaxing medium under pressure perturbations. As a starting point, we show within third-order accuracy that the system under consideration is modeled by a “dissipative” cubic nonlinear evolution equation. Paying particular attention to high-frequency perturbations of the system, we solve the initial value problem of the system both analytically and numerically while unveiling the existence of localized multivalued waveguide channels. Accordingly, we find that the “dissipative” term with a “dissipative” parameter less than some limit value does not destroy the ambiguous solutions. We address some physical implications of the results obtained previously.
Multi-Valued Logic, Neutrosophy, and Schrödinger Equation
NASA Astrophysics Data System (ADS)
Smarandache, Florentin; Christianto, Victor
2017-04-01
We discuss some paradoxes in Quantum Mechanics from the viewpoint of multi-valued logic pioneered by Lukasiewicz and of the more recent concept of Neutrosophic Logic. Essentially, this new concept offers new insights on the idea of `identity', which too often has been accepted as given. Neutrosophy itself was developed in an attempt to generalize the fuzzy logic introduced by L. Zadeh. The discussion is motivated by the observation that, despite the passage of almost eight decades, there are indications that some of the paradoxes known in Quantum Physics are not yet solved. To our knowledge, this is because the solution of those paradoxes requires a re-examination of the foundations of logic itself, in particular of the notion of identity and the multi-valuedness of entities. The discussion is also intended for young physicists who think that somewhere there should be a `complete' explanation of these paradoxes in Quantum Mechanics. If it does not answer all of their questions, it is our hope that it at least offers an alternative viewpoint on these old questions.
The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.
Jobe, Thomas H.; Helgason, Cathy M.
1998-04-01
Twentieth century medical science has embraced nineteenth century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth century multi-valued logic, and computational structures that are content addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex multi-causal clinical phenomenon: stroke. We show that the fuzzy logic paradigm represented clinical complexity in cerebrovascular disease better than the current methodology based on probability theory. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.
Adiabatic pipelining: a key to ternary computing with quantum dots.
Pečar, P; Ramšak, A; Zimic, N; Mraz, M; Lebar Bajec, I
2008-12-10
The quantum-dot cellular automaton (QCA), a processing platform based on interacting quantum dots, was introduced by Lent in the mid-1990s. What followed was an exhilarating period with the development of the line, the functionally complete set of logic functions, as well as more complex processing structures, all, however, in the realm of binary logic. Regardless of these achievements, it has to be acknowledged that the use of binary logic in computing systems is mainly the end result of the technological limitations that designers had to cope with in the early days of their design. The first advancement of QCAs to multi-valued (ternary) processing was performed by Lebar Bajec et al., with the argument that processing platforms of the future should not disregard the clear advantages of multi-valued logic. Some of the elementary ternary QCAs necessary for the construction of more complex processing entities, however, lead to a remarkable increase in size when compared to their binary counterparts. This somewhat negates the advantages gained by entering the ternary computing domain. As it turned out, even the binary QCA had its initial hiccups, which were solved by the introduction of adiabatic switching and the application of adiabatic pipeline approaches. We present here a study that introduces adiabatic switching into the ternary QCA and employs the adiabatic pipeline approach to successfully solve the issues of elementary ternary QCAs. What is more, the ternary QCAs presented here are sizewise comparable to binary QCAs. This, in our view, might serve towards their faster adoption.
Data Aggregation in Multi-Agent Systems in the Presence of Hybrid Faults
ERIC Educational Resources Information Center
Srinivasan, Satish Mahadevan
2010-01-01
Data Aggregation (DA) is a set of functions that provide components of a distributed system access to global information for purposes of network management and user services. With the diverse new capabilities that networks can provide, applicability of DA is growing. DA is useful in dealing with multi-value domain information and often requires…
On an Integral with Two Branch Points
ERIC Educational Resources Information Center
de Oliveira, E. Capelas; Chiacchio, Ary O.
2006-01-01
The paper considers a class of real integrals performed by using a convenient integral in the complex plane. A complex integral containing a multi-valued function with two branch points is transformed into another integral containing a pole and a unique branch point. As a by-product we obtain a new class of integrals which can be calculated in a…
ERIC Educational Resources Information Center
Chang, Chi
2015-01-01
It is known that interventions are hard to assign randomly to subjects in social psychological studies, because randomized control is difficult to implement strictly and precisely. Thus, in nonexperimental studies and observational studies, controlling the impact of covariates on the dependent variables and addressing the robustness of the…
The Bilinear Product Model of Hysteresis Phenomena
NASA Astrophysics Data System (ADS)
Kádár, György
1989-01-01
In ferromagnetic materials non-reversible magnetization processes are represented by rather complex hysteresis curves. The phenomenological description of such curves needs the use of multi-valued, yet unambiguous, deterministic functions. The history-dependent calculation of consecutive Everett integrals of the two-variable Preisach function can account for the main features of hysteresis curves in uniaxial magnetic materials. The traditional Preisach model has recently been modified on the basis of population dynamics considerations, removing the unrealistic congruency property of the model. The Preisach function was proposed to be a product of two factors of distinct physical significance: a magnetization-dependent function taking into account the overall magnetization state of the body, and a bilinear form of a single-variable, magnetic field dependent, switching probability function. The most important statement of the bilinear product model is that the switching process of individual particles is to be separated from the book-keeping procedure of their states. This empirical model of hysteresis can easily be extended to other irreversible physical processes, such as first order phase transitions.
Saturation: An efficient iteration strategy for symbolic state-space generation
NASA Technical Reports Server (NTRS)
Ciardo, Gianfranco; Luettgen, Gerald; Siminiceanu, Radu; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
This paper presents a novel algorithm for generating state spaces of asynchronous systems using Multi-valued Decision Diagrams. In contrast to related work, the next-state function of a system is not encoded as a single Boolean function, but as cross-products of integer functions. This permits the application of various iteration strategies to build a system's state space. In particular, this paper introduces a new elegant strategy, called saturation, and implements it in the tool SMART. On top of usually performing several orders of magnitude faster than existing BDD-based state-space generators, the algorithm's required peak memory is often close to the final memory needed for storing the overall state spaces.
Scanner imaging systems, aircraft
NASA Technical Reports Server (NTRS)
Ungar, S. G.
1982-01-01
The causes and effects of distortion in aircraft scanner data are reviewed, and an approach to reduce distortions by modelling the effect of aircraft motion on the scanner scene is discussed. With the advent of advanced satellite-borne scanner systems, the geometric and radiometric correction of aircraft scanner data has become increasingly important. Corrections are needed to reliably simulate observations obtained by such systems for purposes of evaluation. It is found that if sufficient navigational information is available, aircraft scanner coordinates may be related very precisely to planimetric ground coordinates. However, the potential for a multivalued remapping transformation (i.e., scan lines crossing each other) adds an inherent uncertainty to any radiometric resampling scheme that is dependent on the precise geometry of the scan and ground pattern.
Pattern classifier for health monitoring of helicopter gearboxes
NASA Technical Reports Server (NTRS)
Chin, Hsinyung; Danai, Kourosh; Lewicki, David G.
1993-01-01
The application of a newly developed diagnostic method to a helicopter gearbox is demonstrated. This method is a pattern classifier which uses a multi-valued influence matrix (MVIM) as its diagnostic model. The method benefits from a fast learning algorithm, based on error feedback, that enables it to estimate gearbox health from a small set of measurement-fault data. The MVIM method can also assess the diagnosability of the system and the variability of the fault signatures as the basis for improving fault signatures. This method was tested on vibration signals reflecting various faults in an OH-58A main rotor transmission gearbox. The vibration signals were digitized and processed by a vibration signal analyzer to enhance and extract various features of the vibration data. The parameters obtained from this analyzer were utilized to train and test the performance of the MVIM method in both detection and diagnosis. The results indicate that the MVIM method provided excellent detection results when the full range of fault effects on the measurements was included in training, and it had a correct diagnostic rate of 95 percent when the faults were included in training.
Low-order modelling of a drop on a highly-hydrophobic substrate: statics and dynamics
NASA Astrophysics Data System (ADS)
Wray, Alexander W.; Matar, Omar K.; Davis, Stephen H.
2017-11-01
We analyse the behaviour of droplets resting on highly-hydrophobic substrates. This problem is of practical interest due to its appearance in many physical contexts involving the spreading, wetting, and dewetting of fluids on solid substrates. In mathematical terms, it exhibits an interesting challenge as the interface is multi-valued as a function of the natural Cartesian co-ordinates, presenting a stumbling block to typical low-order modelling techniques. Nonetheless, we show that in the static case, the interfacial shape is governed by the Young-Laplace equation, which may be solved explicitly in terms of elliptic functions. We present simple low-order expressions that faithfully reproduce the shapes. We then consider the dynamic case, showing that the predictions of our low-order model compare favourably with those obtained from direct numerical simulations. We also examine the characteristic flow regimes of interest. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).
A longitudinal analysis of the relationship between fertility timing and schooling.
Stange, Kevin
2011-08-01
This article quantifies the contribution of pre-treatment dynamic selection to the relationship between fertility timing and postsecondary attainment, after controlling for a rich set of predetermined characteristics. Eventual mothers and nonmothers are matched using their predicted birth hazard rate, which shares the desirable properties of a propensity score but in a multivalued treatment setting. I find that eventual mothers and matched nonmothers enter college at the same rate, but their educational paths diverge well before the former become pregnant. This pre-pregnancy divergence creates substantial differences in ultimate educational attainment that cannot possibly be due to the childbirth itself. Controls for predetermined characteristics and fixed effects do not address this form of dynamic selection bias. A dynamic model of the simultaneous childbirth-education sequencing decision is necessary to address it.
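The matching step can be pictured with a minimal sketch: treated and untreated subjects are paired by nearest predicted score, much as one would with a propensity score; the numbers below are invented, and the hazard model that produces the score is not shown.

    # Toy nearest-neighbour matching on a predicted (hazard-like) score.
    import numpy as np

    def match_on_score(score, treated):
        """score: predicted hazard per subject; treated: 0/1 indicator."""
        score, treated = np.asarray(score, float), np.asarray(treated)
        t_idx = np.where(treated == 1)[0]
        c_idx = np.where(treated == 0)[0]
        pairs = []
        for i in t_idx:
            j = c_idx[np.argmin(np.abs(score[c_idx] - score[i]))]
            pairs.append((i, j))     # treated subject i matched to control j
        return pairs

    # Two "eventual mothers" (indices 0 and 2) matched to the closest controls.
    print(match_on_score([0.9, 0.2, 0.85, 0.3], [1, 0, 1, 0]))   # [(0, 3), (2, 3)]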
Dynamics and Stability of Acoustic Wavefronts in the Ocean
2012-09-30
has been developed to solve the eikonal equation and calculate wavefront and ray trajectory displacements, which are required to be small over a...solution of the eikonal equation lies in the eikonal (and acoustic travel time) being a multi-valued function of position. A number of computational...approaches to solve the eikonal equation without ray tracing have been developed in mathematical and seismological communities (Vidale, 1990; Sava and
Choukourov, A; Kylián, O; Petr, M; Vaidulych, M; Nikitin, D; Hanuš, J; Artemenko, A; Shelemin, A; Gordeev, I; Kolská, Z; Solař, P; Khalakhan, I; Ryabov, A; Májek, J; Slavínská, D; Biederman, H
2017-02-16
A layer of 14 nm-sized Ag nanoparticles undergoes complex transformation when overcoated by thin films of a fluorocarbon plasma polymer. Two regimes of surface evolution are identified, both with invariable RMS roughness. In the early regime, the plasma polymer penetrates between and beneath the nanoparticles, raising them above the substrate and maintaining the multivalued character of the surface roughness. The growth (β) and the dynamic (1/z) exponents are close to zero and the interface bears the features of self-affinity. The presence of inter-particle voids leads to heterogeneous wetting with an apparent water contact angle θa = 135°. The multivalued nanotopography results in two possible positions for the water droplet meniscus, yet strong water adhesion indicates that the meniscus is located at the lower part of the spherical nanofeatures. In the late regime, the inter-particle voids become filled and the interface acquires a single valued character. The plasma polymer proceeds to grow on the thus-roughened surface whereas the nanoparticles keep emerging away from the substrate. The RMS roughness remains invariable and lateral correlations propagate with 1/z = 0.27. The surface features multiaffinity which is given by different evolution of length scales associated with the nanoparticles and with the plasma polymer. The wettability turns to the homogeneous wetting state.
Anonymous voting for multi-dimensional CV quantum system
NASA Astrophysics Data System (ADS)
Rong-Hua, Shi; Yi, Xiao; Jin-Jing, Shi; Ying, Guo; Moon-Ho, Lee
2016-06-01
We investigate the design of anonymous voting protocols, CV-based binary-valued ballot and CV-based multi-valued ballot with continuous variables (CV) in a multi-dimensional quantum cryptosystem to ensure the security of voting procedure and data privacy. The quantum entangled states are employed in the continuous variable quantum system to carry the voting information and assist information transmission, which takes the advantage of the GHZ-like states in terms of improving the utilization of quantum states by decreasing the number of required quantum states. It provides a potential approach to achieve the efficient quantum anonymous voting with high transmission security, especially in large-scale votes. Project supported by the National Natural Science Foundation of China (Grant Nos. 61272495, 61379153, and 61401519), the Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20130162110012), and the MEST-NRF of Korea (Grant No. 2012-002521).
Tanaka, Gouhei; Aihara, Kazuyuki
2009-09-01
A widely used complex-valued activation function for complex-valued multistate Hopfield networks is revealed to be essentially based on a multilevel step function. By replacing the multilevel step function with other multilevel characteristics, we present two alternative complex-valued activation functions. One is based on a multilevel sigmoid function, while the other is based on a characteristic of a multistate bifurcating neuron. Numerical experiments show that both modifications to the complex-valued activation function bring about improvements in network performance for a multistate associative memory. The advantage of the proposed networks over the complex-valued Hopfield networks with the multilevel step function is more pronounced when a complex-valued neuron represents a larger number of multivalued states. Further, the performance of the proposed networks in reconstructing noisy 256 gray-level images is demonstrated in comparison with other recent associative memories to clarify their advantages and disadvantages.
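The contrast between the two activation types can be sketched as follows: the multilevel step quantizes the phase of the net input onto K points of the unit circle, while a smoothed variant replaces each unit step with a logistic ramp. The smoothing below is only schematic and is not necessarily the exact sigmoid characteristic proposed in the paper.

    # K-state complex-valued activations: hard phase quantization vs. a
    # schematic smoothed ("sigmoid-like") version of the same staircase.
    import numpy as np

    def multilevel_step(z, K):
        phase = np.angle(z) % (2 * np.pi)
        k = np.floor(K * phase / (2 * np.pi))        # staircase in the phase
        return np.exp(1j * 2 * np.pi * k / K)

    def multilevel_sigmoid(z, K, gain=20.0):
        phase = np.angle(z) % (2 * np.pi)
        x = K * phase / (2 * np.pi)
        frac = x - np.floor(x)
        k = np.floor(x) + 1.0 / (1.0 + np.exp(-gain * (frac - 0.5)))
        return np.exp(1j * 2 * np.pi * k / K)

    print(multilevel_step(1 + 1j, 4))   # maps the phase sector of the input to one of 4 states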
NASA Astrophysics Data System (ADS)
Patel, Ravi; Kong, Bo; Capecelatro, Jesse; Fox, Rodney; Desjardins, Olivier
2017-11-01
Particle-laden turbulent flows are important features of many environmental and industrial processes. Euler-Euler (EE) simulations of these flows are more computationally efficient than Euler-Lagrange (EL) simulations. However, traditional EE methods, such as the two-fluid model, cannot faithfully capture dilute regions of flow with finite Stokes number particles. For this purpose, the multi-valued nature of the particle velocity field must be treated with a polykinetic description. Various quadrature-based moment methods (QBMM) can be used to approximate the full kinetic description by solving for a set of moments of the particle velocity distribution function (VDF) and providing closures for the higher-order moments. Early QBMM fail to maintain the strict hyperbolicity of the kinetic equations, producing unphysical delta shocks (i.e., mass accumulation at a point). In previous work, a 2-D conditional hyperbolic quadrature method of moments (CHyQMOM) was proposed as a fourth-order QBMM closure that maintains strict hyperbolicity. Here, we present the 3-D extension of CHyQMOM. We compare results from CHyQMOM to other QBMM and EL in the context of particle trajectory crossing, cluster-induced turbulence, and particle-laden channel flow. NSF CBET-1437903.
Short Note on Complexity of Multi-Value Byzantine Agreement
2010-07-27
which lead to nBl/D bits over the whole algorithm. Broadcasts in extended step: In the extended step, every node broadcasts D bits. Thus nDB bits... bits, as:
(n−1)l + n(n−1)(k + D/k)l/D + nBl/D + nDBt(t+1)    (4)
= (n−1)l + O(n²kl/D + n²l/k + nBl/D + n³BD).    (5)
Notice that broadcast algorithm of
Molecular processors: from qubits to fuzzy logic.
Gentili, Pier Luigi
2011-03-14
Single molecules or their assemblies are information processing devices. Herein it is demonstrated how it is possible to process different types of logic through molecules. As long as decoherent effects are maintained far away from a pure quantum mechanical system, quantum logic can be processed. If the collapse of superimposed or entangled wavefunctions is unavoidable, molecules can still be used to process either crisp (binary or multi-valued) or fuzzy logic. The way for implementing fuzzy inference engines is declared and it is supported by the examples of molecular fuzzy logic systems devised so far. Fuzzy logic is drawing attention in the field of artificial intelligence, because it models human reasoning quite well. This ability may be due to some structural analogies between a fuzzy logic system and the human nervous system. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
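As a generic illustration of the kind of fuzzy inference engine referred to above (a toy Mamdani-style rule base, unrelated to any particular molecular system), two rules can be fired, aggregated, and defuzzified as follows.

    # Toy Mamdani-style fuzzy inference: two rules, min firing, max
    # aggregation, centroid defuzzification.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    x_out = np.linspace(0.0, 1.0, 201)

    def infer(signal):
        # Rule 1: IF signal is LOW  THEN output is SMALL
        # Rule 2: IF signal is HIGH THEN output is LARGE
        w1 = tri(signal, 0.0, 0.2, 0.6)
        w2 = tri(signal, 0.4, 0.8, 1.0)
        agg = np.maximum(np.minimum(w1, tri(x_out, 0.0, 0.2, 0.5)),
                         np.minimum(w2, tri(x_out, 0.5, 0.8, 1.0)))
        return np.sum(agg * x_out) / np.sum(agg)     # centroid defuzzification

    print(infer(0.7))    # a crisp output biased toward LARGE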
NASA Astrophysics Data System (ADS)
Huang, Dongmei; Xu, Wei
2017-11-01
In this paper, the combination of cubic nonlinearity and time delay is proposed to improve the performance of a piecewise-smooth (PWS) system with negative stiffness. Dynamical properties, feedback control performance and symmetry-breaking bifurcation are mainly considered for a PWS system with negative stiffness under nonlinear position and velocity feedback control. For the free vibration system, the homoclinic-like orbits are first derived. Then, the amplitude-frequency response of the controlled system is obtained analytically by means of the Lindstedt-Poincaré method and the method of multiple scales, and is verified against numerical results. A softening-type behavior, which directly leads to multi-valued responses, is illustrated for negative position feedback. In particular, five-valued responses, three branches of which are stable, are found. Complex multi-valued characteristics are also observed in the force-amplitude responses. Furthermore, to explain the effectiveness of the feedback control, the equivalent damping and stiffness are introduced. The sensitivity of the system response to the feedback gain and time delay is comprehensively considered and interesting dynamical properties are found. From the perspective of suppressing the maximum amplitude and controlling the resonance stability, the selection of the feedback parameters is discussed. Finally, the symmetry-breaking bifurcation and chaotic motion are considered.
A High-resolution Study of Presupernova Core Structure
NASA Astrophysics Data System (ADS)
Sukhbold, Tuguldur; Woosley, S. E.; Heger, Alexander
2018-06-01
The density structure surrounding the iron core of a massive star when it dies is known to have a major effect on whether or not the star explodes. Here we repeat previous surveys of presupernova evolution with some important corrections to code physics and four to 10 times better mass resolution in each star. The number of presupernova masses considered is also much larger. Over 4000 models are calculated in the range from 12 to 60 M⊙ with varying mass loss rates. The core structure is not greatly affected by the increased spatial resolution. The qualitative patterns of compactness measures and their extrema are the same, but with the increased number of models, the scatter seen in previous studies is replaced by several localized branches. More physics-based analyses by Ertl et al. and Müller et al. show these branches with less scatter than the single-parameter characterization of O’Connor & Ott. These branches are particularly apparent for stars in the mass ranges 14–19 and 22–24 M⊙. The multivalued solutions are a consequence of interference between several carbon- and oxygen-burning shells during the late stages of evolution. For a relevant range of masses, whether a star explodes or not may reflect the small, almost random differences in its late evolution more than its initial mass. The large number of models allows statistically meaningful statements about the radius, luminosity, and effective temperatures of presupernova stars, their core structures, and their remnant mass distributions.
Lefschetz thimbles in fermionic effective models with repulsive vector-field
NASA Astrophysics Data System (ADS)
Mori, Yuto; Kashiwa, Kouji; Ohnishi, Akira
2018-06-01
We discuss two problems in complexified auxiliary fields in fermionic effective models, the auxiliary sign problem associated with the repulsive vector-field and the choice of the cut for the scalar field appearing from the logarithmic function. In the fermionic effective models with attractive scalar and repulsive vector-type interaction, the auxiliary scalar and vector fields appear in the path integral after the bosonization of fermion bilinears. When we make the path integral well-defined by the Wick rotation of the vector field, the oscillating Boltzmann weight appears in the partition function. This "auxiliary" sign problem can be solved by using the Lefschetz-thimble path-integral method, where the integration path is constructed in the complex plane. Another serious obstacle in the numerical construction of Lefschetz thimbles is caused by singular points and cuts induced by multivalued functions of the complexified scalar field in the momentum integration. We propose a new prescription which fixes gradient flow trajectories on the same Riemann sheet in the flow evolution by performing the momentum integration in the complex domain.
Future directions: Integrated resource planning
NASA Astrophysics Data System (ADS)
Bauer, D. C.; Eto, J.
Integrated resource planning, or IRP, is the process for integrating supply- and demand-side resources to provide energy services at a cost that balances the interests of all stakeholders. It is now the resource planning process used by electric utilities in over 30 states. The goals of IRP have evolved from least-cost planning and encouragement of demand-side management to broader, more complex issues including core competitive business activity, risk management and sharing, accounting for externalities, and fuel switching between gas and electricity. IRP processes are being extended to other interior regions of the country, to non-investor-owned utilities, to regional (rather than individual utility) planning bases, and to other fuels (natural gas). The comprehensive, multi-valued, and public reasoning characteristics of IRP could be extended to applications beyond energy, e.g., transportation, surface water management, and health care, in the ways suggested here.
Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Time-delayed autosynchronous swarm control.
Biggs, James D; Bennet, Derek J; Dadzie, S Kokou
2012-01-01
In this paper a general Morse potential model of self-propelling particles is considered in the presence of a time-delayed term and a spring potential. It is shown that the emergent swarm behavior is dependent on the delay term and weights of the time-delayed function, which can be set to induce a stationary swarm, a rotating swarm with uniform translation, and a rotating swarm with a stationary center of mass. An analysis of the mean field equations shows that without a spring potential the motion of the center of mass is determined explicitly by a multivalued function. For a nonzero spring potential the swarm converges to a vortex formation about a stationary center of mass, except at discrete bifurcation points where the center of mass will periodically trace an ellipse. The analytical results defining the behavior of the center of mass are shown to correspond with the numerical swarm simulations.
On the structure of the set of coincidence points
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arutyunov, A V; Gel'man, B D
2015-03-31
We consider the set of coincidence points for two maps between metric spaces. Cardinality, metric and topological properties of the coincidence set are studied. We obtain conditions which guarantee that this set (a) consists of at least two points; (b) consists of at least n points; (c) contains a countable subset; (d) is uncountable. The results are applied to study the structure of the double point set and the fixed point set for multivalued contractions. Bibliography: 12 titles.
NASA Astrophysics Data System (ADS)
Ghosh, Amal K.
2010-09-01
Parity generators and checkers are the most important circuits in communication systems. With the development of multi-valued logic (MVL), parity generators and checkers realized in the modified trinary number (MTN) system with recently developed optoelectronic technology are highly desirable. The proposed system also meets the tremendous need for speed by exploiting savart plates and spatial light modulators (SLMs) in an optical tree architecture (OTA).
Resolving the issue of branched Hamiltonian in modified Lanczos-Lovelock gravity
NASA Astrophysics Data System (ADS)
Ruz, Soumendranath; Mandal, Ranajit; Debnath, Subhra; Sanyal, Abhik Kumar
2016-07-01
The Hamiltonian constraint H_c = Nℋ = 0 defines a diffeomorphic structure on spatial manifolds through the lapse function N in the general theory of relativity. However, it is not manifest in Lanczos-Lovelock gravity, since the expression for the velocity in terms of the momentum is multivalued, and thus the Hamiltonian is a branched function of the momentum. Here we propose an extended theory of Lanczos-Lovelock gravity to construct a unique Hamiltonian in its minisuperspace version, which results in manifest diffeomorphic invariance and canonical quantization.
Plausible inference: A multi-valued logic for problem solving
NASA Technical Reports Server (NTRS)
Friedman, L.
1979-01-01
A new logic is developed which permits continuously variable strength of belief in the truth of assertions. Four inference rules result, with formal logic as a limiting case. Quantification of belief is defined. Propagation of belief to linked assertions results from dependency-based techniques of truth maintenance, so that local consistency is achieved or contradiction discovered in problem solving. Rules for combining, confirming, or disconfirming beliefs are given, and several heuristics are suggested that apply to revising already formed beliefs in the light of new evidence. The strength of belief that results from such revisions based on conflicting evidence is a highly subjective phenomenon. Certain quantification rules appear to reflect an orderliness in the subjectivity. Several examples of reasoning by plausible inference are given, including a legal example and one from robot learning. Propagation of belief takes place in directions forbidden in formal logic, and this results in conclusions becoming possible for a given set of assertions that are not reachable by formal logic.
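The abstract does not spell out its combination rules, but as one concrete, well-known example of the kind of rule it describes (MYCIN-style certainty factors, shown here purely for illustration and not as this paper's scheme), two graded beliefs in the same assertion can be merged as follows.

    # MYCIN-style combination of certainty factors in [-1, 1]; an example of
    # a belief-combination rule, not the rule set of the paper above.
    def combine(cf1, cf2):
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)                    # mutual confirmation
        if cf1 <= 0 and cf2 <= 0:
            return cf1 + cf2 * (1 + cf1)                    # mutual disconfirmation
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))  # conflicting evidence

    print(combine(0.6, 0.5))    # 0.8: two confirming pieces of evidence
    print(combine(0.6, -0.4))   # ~0.33: conflicting evidence weakens belief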
Canonical multi-valued input Reed-Muller trees and forms
NASA Technical Reports Server (NTRS)
Perkowski, M. A.; Johnson, P. D.
1991-01-01
There has recently been increased interest in logic synthesis using EXOR gates. The paper introduces the fundamental concept of Orthogonal Expansion, which generalizes the ring form of the Shannon expansion to logic with multiple-valued (mv) inputs. Based on this concept we are able to define a family of canonical tree circuits. Such circuits can be considered for binary and multiple-valued input cases. They can be multi-level (trees and DAGs) or flattened to two-level AND-EXOR circuits. Input decoders similar to those used in Sum of Products (SOP) PLAs are used in realizations of multiple-valued input functions. In the case of binary logic, the family of flattened AND-EXOR circuits includes several forms discussed by Davio and Green. For the case of logic with multiple-valued inputs, the family of flattened mv AND-EXOR circuits includes three expansions known from the literature and two new expansions.
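For reference, the binary ring-form expansions that such an Orthogonal Expansion generalizes (with cofactors f_0 = f|_{x=0} and f_1 = f|_{x=1}) are the Shannon expansion in ring form and the two Davio expansions; in the multiple-valued-input case the literals \bar{x}, x are replaced by the outputs of the input decoders mentioned above:

    f = \bar{x}\, f_0 \oplus x\, f_1                  (Shannon, ring form)
    f = f_0 \oplus x\, (f_0 \oplus f_1)               (positive Davio)
    f = f_1 \oplus \bar{x}\, (f_0 \oplus f_1)         (negative Davio)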
Nonlinear hyperbolic theory of thermal waves in metals
NASA Technical Reports Server (NTRS)
Wilhelm, H. E.; Choi, S. H.
1975-01-01
A closed-form solution for cylindrical thermal waves in metals is given based on the nonlinear hyperbolic system of energy-conservation and heat-flux relaxation equations. It is shown that heat released from a line source propagates radially outward with finite speed in the form of a thermal wave which exhibits a discontinuous wave front. Unique nonlinear thermal-wave solutions exist up to a critical amount of driving energy, i.e., for larger energy releases, the thermal flow becomes multivalued (occurrence of shock waves). By comparison, it is demonstrated that the parabolic thermal-wave theory gives, in general, a misleading picture of the profile and propagation of thermal waves and leads to physical (infinite speed of heat propagation) and mathematical (divergent energy integrals) difficulties. Attention is drawn to the importance of temporal heat-flux relaxation for the physical understanding of fast transient processes such as thermal waves and more general explosions and implosions.
Garai, Sisir Kumar
2012-04-10
To meet the demands of very fast and agile optical networks, the optical processors in a network system should have a very fast execution rate and large information handling and storage capacities. Multivalued logic operations and multistate optical flip-flops are the basic building blocks for such fast optical computing and data processing systems. In the past two decades, many methods of implementing all-optical flip-flops have been proposed. Most of these suffer from speed limitations because of the low switching response of active devices. The frequency encoding technique has been used because of its many advantages: it preserves its identity throughout data communication irrespective of loss of light energy due to reflection, refraction, attenuation, etc. The action of polarization-rotation-based very fast switching of semiconductor optical amplifiers increases processing speed. At the same time, tristate optical flip-flops increase information handling capacity.
Asymmetric nanowire SQUID: Linear current-phase relation, stochastic switching, and symmetries
NASA Astrophysics Data System (ADS)
Murphy, A.; Bezryadin, A.
2017-09-01
We study nanostructures based on two ultrathin superconducting nanowires connected in parallel to form a superconducting quantum interference device (SQUID). The measured function of the critical current versus magnetic field, I_C(B), is multivalued, asymmetric, and its maxima and minima are shifted from the usual integer and half integer flux quantum points. We also propose a low-temperature-limit model which generates accurate fits to the I_C(B) functions and provides verifiable predictions. The key assumption of our model is that each wire is characterized by a sample-specific critical phase ϕ_C defined as the phase difference at which the supercurrent in the wire is the maximum. For our nanowires ϕ_C is much greater than the usual π/2, which makes a qualitative difference in the behavior of the SQUID. The nanowire current-phase relation is assumed linear, since the wires are much longer than the coherence length. The model explains single-valuedness regions where only one vorticity value n_v is stable. Also, it predicts regions where multiple vorticity values are stable because the Little-Parks (LP) diamonds, which describe the region of stability for each winding number n_v in the current-field diagram, can overlap. We also observe and explain regions in which the standard deviation of the switching current is independent of the magnetic field. We develop a technique that allows a reliable detection of hidden phase slips and use it to determine the boundaries of the LP diamonds even at low currents where I_C(B) is not directly measurable.
NASA Astrophysics Data System (ADS)
Charles, Jérôme
1999-03-01
Penguin contributions, being non-negligible in general, can hide the information on the Cabibbo-Kobayashi-Maskawa angle α coming from the measurement of the time-dependent B^0_d(t) → π^+π^- CP asymmetry. Nevertheless, we show that this information can be summarized in a set of simple equations, expressing α as a multivalued function of a single theoretically unknown parameter, which conveniently can be chosen as a well-defined ratio of penguin to tree amplitudes. Using these exact analytic expressions, free of any assumption other than the standard model, and some reasonable hypotheses to constrain the modulus of the penguin amplitude, we derive several new upper bounds on the penguin-induced shift |2α - 2α_eff|, generalizing the recent result of Grossman and Quinn. These bounds depend on the average branching ratios of some decays (π^0π^0, K^0K̄^0, K^±π^∓) particularly sensitive to the penguin contributions. On the other hand, with further and less conservative approximations, we show that the knowledge of the B^± → Kπ^± branching ratio alone gives sufficient information to extract the free parameter without the need of other measurements, and without knowing |V_td| or |V_ub|. More generally, knowing the modulus of the penguin amplitude with an accuracy of ~30% might result in an extraction of α competitive with the experimentally more difficult isospin analysis. We also show that our framework allows us to recover most of the previous approaches in a transparent and simple way, and in some cases to improve them. In addition we discuss in detail the problem of the various kinds of discrete ambiguities.
Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis
NASA Astrophysics Data System (ADS)
Gluhih, I. N.; Akhmadulin, R. K.
2017-07-01
One of the urgent directions of efficiency enhancement of production processes and enterprises activities management is creation and use of corporate knowledge bases. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and represents a tool for making and implementing decisions in such situations. For knowledge representation in PO CKB a case-based reasoning approach is encouraged to use. Under this approach, the content of a case as a knowledge base component has been defined; based on the situation tree a PO CKB knowledge model has been developed, in which the knowledge about typical situations as well as specific examples of situations and solutions have been represented. A generalized problem-oriented corporate knowledge base structural chart and possible modes of its operation have been suggested. The obtained models allow creating and using corporate knowledge bases for support of decision making and implementing, training, staff skill upgrading and analysis of the decisions taken. The universal interpretation of terms “situation” and “solution” adopted in the work allows using the suggested models to develop problem-oriented corporate knowledge bases in different subject domains. It has been suggested to use the developed models for making corporate knowledge bases of the enterprises that operate engineer systems and networks at large production facilities.
The Meta-Ontology Model of the Fishdisease Diagnostic Knowledge Based on Owl
NASA Astrophysics Data System (ADS)
Shi, Yongchang; Gao, Wen; Hu, Liang; Fu, Zetian
To improve the availability and reusability of knowledge in the fish disease diagnosis (FDD) domain and to facilitate knowledge acquisition, an ontology model of FDD knowledge was developed based on OWL according to the FDD knowledge model. It includes the terminology of FDD knowledge and the hierarchies of its classes.
A Holographic Prism Based on Photo-Thermo-Refractive Glass: Requirements and Possibilities
NASA Astrophysics Data System (ADS)
Angervaks, A. E.; Gorokhovskii, K. S.; Granovskii, V. A.; Van Bac, Doan; Ivanov, S. A.; Okun', R. A.; Nikonorov, N. V.; Ryskin, A. I.
2017-12-01
A technology for multivalued holographic measurement of a plane angle has been developed: a holographic prism, which serves as the basis for a device for calibrating the testing tools of navigational equipment under rolling. The holographic prism is a miniature sample of photosensitive material, which contains a system of superimposed holographic gratings, and a laser whose radiation passes through the gratings to form a fan of diffracted beams. The fan in a prism based on a calcium fluoride (fluorite) crystal with color centers contained six out-of-plane beams radiating from a region with a hard-to-localize center. This fact hindered calibration of the measure and its application in the testing device. The use of photo-thermo-refractive glass as a material for preparing a sample and recording a system of holograms in it makes it possible to eliminate the drawbacks of the fluorite-based prism. The number of holograms rises to 21, the fan becomes in-plane, and its center is localized in a small region, with a size of several tenths of the sample thickness (1-2 mm). The fan beams are energetically homogeneous, and each can be identified when using the fan in a testing device.
Light-Triggered Ternary Device and Inverter Based on Heterojunction of van der Waals Materials.
Shim, Jaewoo; Jo, Seo-Hyeon; Kim, Minwoo; Song, Young Jae; Kim, Jeehwan; Park, Jin-Hong
2017-06-27
Multivalued logic (MVL) devices/circuits have received considerable attention because the binary logic used in current Si complementary metal-oxide-semiconductor (CMOS) technology cannot handle the predicted information throughputs and energy demands of the future. To realize MVL, the conventional transistor platform needs to be redesigned to have two or more distinctive threshold voltages (V_THs). Here, we report a finding: the photoinduced drain current in graphene/WSe2 heterojunction transistors unusually decreases with increasing gate voltage under illumination, which we refer to as the light-induced negative differential transconductance (L-NDT) phenomenon. We also prove that such L-NDT phenomenon in specific bias ranges originates from a variable potential barrier at a graphene/WSe2 junction due to a gate-controllable graphene electrode. This finding allows us to conceive graphene/WSe2-based MVL logic circuits by using the I_D-V_G characteristics with two distinctive V_THs. Based on this finding, we further demonstrate a light-triggered ternary inverter circuit with three stable logical states (ΔV_out of each state <0.05 V). Our study offers the pathway to substantialize MVL systems.
Tunable negative differential resistance in planar graphene superlattice resonant tunneling diode
NASA Astrophysics Data System (ADS)
Sattari-Esfahlan, S. M.; Fouladi-Oskuei, J.; Shojaei, S.
2017-04-01
Here, we study the negative differential resistance (NDR) of Dirac electrons in a biased planar graphene superlattice (PGSL) and investigate the transport characteristics by the transfer matrix method within the Landauer-Buttiker formalism. Our model device is based on a one-dimensional Kronig-Penney-type electrostatic potential in monolayer graphene deposited on a substrate, where the bias voltage is applied by two electrodes on the left and right. At low bias voltages, we find that NDR appears due to the breaking of minibands into Wannier-Stark ladders (WSLs). At the critical bias voltage, delocalization brought about by WS states leads to a tunneling current peak in the current-voltage (I-V) characteristics. With increasing bias voltage, the crossing of rungs from various WSLs results in multi-peak NDR. The results demonstrate that structure parameters such as barrier/well thickness and barrier height have a remarkable effect on the I-V characteristics of the PGSL. In addition, a Dirac gap enhances the peak-to-valley ratio (PVR) by suppressing Klein tunneling. Our results show that a tunable PVR in the PGSL resonant tunneling diode can be achieved by structure parameter engineering. NDR at ultra-low bias voltages, such as 100 mV, with a giant PVR of 20 is obtained. In our device, the multiple identical NDR peaks at ultra-low bias voltage provide a promising prospect for multi-valued memories and low-power nanoelectronic tunneling devices.
Towards Modeling False Memory With Computational Knowledge Bases.
Li, Justin; Kohanyi, Emma
2017-01-01
One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
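A bare-bones version of such a spreading-activation model can be sketched as follows; the toy association network, decay parameter, and update rule are all invented for illustration and are much simpler than a WordNet- or DBpedia-backed model.

    # Toy spreading activation over a hand-made association network,
    # illustrating how an unstudied lure ("sleep") becomes active.
    from collections import defaultdict

    edges = {                       # association strengths (made up)
        "bed":   {"sleep": 0.8, "pillow": 0.5},
        "rest":  {"sleep": 0.7},
        "dream": {"sleep": 0.9, "night": 0.4},
    }

    def spread(studied, decay=0.5, steps=2):
        act = defaultdict(float)
        for w in studied:
            act[w] = 1.0
        for _ in range(steps):
            new = defaultdict(float, act)
            for src, nbrs in edges.items():
                for dst, weight in nbrs.items():
                    new[dst] += decay * weight * act[src]
            act = new
        return act

    print(spread(["bed", "rest", "dream"])["sleep"])   # the lure ends up highly active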
An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process
NASA Astrophysics Data System (ADS)
Nguyen, ThanhDat; Kifor, Claudiu Vasile
2015-09-01
DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, it is difficult to access DMAIC knowledge. Conventional approaches face a problem arising from structuring and reusing DMAIC knowledge. The main reason is that DMAIC knowledge is not represented and organized systematically. In this article, we overcome the problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each of the DMAIC phases. We build five different knowledge bases for storing all knowledge of the DMAIC phases with the support of the necessary tools and appropriate techniques from the Information Technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution in order to share and reuse existing knowledge.
A New Perspective on Modeling Groundwater-Driven Health Risk With Subjective Information
NASA Astrophysics Data System (ADS)
Ozbek, M. M.
2003-12-01
Fuzzy rule-based systems provide an efficient environment for modeling expert information in the context of risk management for groundwater contamination problems. In general, their use in the form of conditional pieces of knowledge has been either as a tool for synthesizing control laws from data (i.e., conjunction-based models) or from a knowledge representation and reasoning perspective in Artificial Intelligence (i.e., implication-based models), where only the latter may lead to coherence problems (e.g., input data that leads to logical inconsistency when added to the knowledge base). We implement a two-fold extension to an implication-based groundwater risk model (Ozbek and Pinder, 2002) including: 1) the implementation of sufficient conditions for a coherent knowledge base, and 2) the interpolation of expert statements to supplement gaps in knowledge. The original model assumes statements of public health professionals for the characterization of the exposed individual and the relation of dose and pattern of exposure to its carcinogenic effects. We demonstrate the utility of the extended model in that it: 1) identifies inconsistent statements and establishes coherence in the knowledge base, and 2) minimizes the burden of knowledge elicitation from the experts by utilizing existing knowledge in an optimal fashion.
Incorporating linguistic knowledge for learning distributed word representations.
Wang, Yan; Liu, Zhiyuan; Sun, Maosong
2015-01-01
Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge for learning distributed word representations. Experimental results demonstrate that our estimated word representations achieve better performance in the task of semantic relatedness ranking. This indicates that our methods can efficiently encode both prior knowledge from knowledge bases and statistical knowledge from large-scale text corpora into a unified word representation model, which will benefit many tasks in text mining.
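One plausible form of such a knowledge regularizer, written here only as an illustration (the paper's exact objective may differ), adds a penalty that pulls the vectors of knowledge-linked words together on top of the usual corpus objective:

\mathcal{L}(\theta) = \sum_{(w,c) \in D} \log p(c \mid w; \theta) \;-\; \lambda \sum_{(w_i, w_j) \in \mathcal{K}} \lVert \mathbf{v}_{w_i} - \mathbf{v}_{w_j} \rVert^2 ,

where D is the set of word-context pairs from the corpus, \mathcal{K} the set of word pairs linked in the knowledge base, \mathbf{v}_w the embedding of word w, and \lambda the regularization weight; preference-based knowledge can analogously enter as a margin-based ranking term.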
NASA Astrophysics Data System (ADS)
de Haro, Jaume; Amorós, Jaume
2018-03-01
We consider the Arnowitt-Deser-Misner formalism as a tool to build bouncing cosmologies. In this approach, the foliation of the spacetime has to be fixed in order to go beyond general relativity modifying the gravitational sector. Once a preferred slicing, which we choose based on the matter content of the Universe following the spirit of Weyl's postulate, has been fixed, f theories depending on the extrinsic and intrinsic curvature of the slicing are covariant for all the reference frames preserving the foliation; i.e., the constraint and dynamical equations have the same form for all these observers. Moreover, choosing multivalued f functions, bouncing backgrounds emerge in a natural way. In fact, the simplest is the one corresponding to holonomy corrected loop quantum cosmology. The final goal of this work is to provide the equations of perturbations which, unlike the full equations, become gauge invariant in this theory, and apply them to the so-called matter bounce scenario.
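For context (a standard loop quantum cosmology result, added here for clarity), the holonomy-corrected effective Friedmann equation referred to above reads

H^2 = \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\, \rho \left(1 - \frac{\rho}{\rho_c}\right),

so the Hubble rate vanishes when the energy density reaches the critical value \rho_c while \dot\rho \neq 0, replacing the big-bang singularity with a bounce; in the multivalued-f construction of the paper this is recovered as the simplest example.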
Terminological reference of a knowledge-based system: the data dictionary.
Stausberg, J; Wormek, A; Kraut, U
1995-01-01
The development of open and integrated knowledge bases makes new demands on the definition of the terminology used. The definitions should be realized in a data dictionary separate from the knowledge base. As part of the work on a reference model of medical knowledge, a data dictionary has been developed and used in different applications: a term definition shell, a documentation tool, and a knowledge base. The data dictionary includes the part of the terminology that is largely independent of a particular knowledge model. For that reason, the data dictionary can be used as a basis for integrating knowledge bases into information systems, for knowledge sharing and reuse, and for modular development of knowledge-based systems.
Applying knowledge compilation techniques to model-based reasoning
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.
Shock formation in the dispersionless Kadomtsev-Petviashvili equation
NASA Astrophysics Data System (ADS)
Grava, T.; Klein, C.; Eggers, J.
2016-04-01
The dispersionless Kadomtsev-Petviashvili (dKP) equation $\left(u_t + u u_x\right)_x = u_{yy}$ is one of the simplest nonlinear wave equations describing two-dimensional shocks. To solve the dKP equation numerically we use a coordinate transformation inspired by the method of characteristics for the one-dimensional Hopf equation $u_t + u u_x = 0$. We show numerically that the solution to the transformed equation stays regular for longer times than the solution of the dKP equation. This permits us to extend the dKP solution as the graph of a multivalued function beyond the critical time when the gradients blow up. This overturned solution is multivalued in a lip-shaped region of the (x, y) plane, where the solution of the dKP equation exists only in a weak sense, and a shock front develops. A local expansion reveals the universal scaling structure of the shock, which after a suitable change of coordinates corresponds to a generic cusp catastrophe. We provide a heuristic derivation of the shock front position near the critical point for the solution of the dKP equation, and study the solution of the dKP equation when a small amount of dissipation is added. Using multiple-scale analysis, we show that in the limit of small dissipation and near the critical point of the dKP solution, the solution of the dissipative dKP equation converges to a Pearcey integral. We test and illustrate our results by detailed comparisons with numerical simulations of the regularized equation and of the dKP equation, and with the asymptotic description given in terms of the Pearcey integral.
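The one-dimensional mechanism behind the multivalued continuation can be made explicit for the Hopf equation (standard method-of-characteristics material, included for clarity):

u_t + u u_x = 0, \qquad u(x,t) = u_0(\xi) \ \text{ along } \ x = \xi + u_0(\xi)\, t ,

so the initial profile is transported along straight characteristics and its graph overturns (becomes multivalued) after the first crossing time t_c = -1/\min_\xi u_0'(\xi); the coordinate transformation used for the dKP equation extends this picture to two dimensions.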
Automated extraction of knowledge for model-based diagnostics
NASA Technical Reports Server (NTRS)
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
1990-01-01
The concept of accessing computer aided design (CAD) design databases and automatically extracting a process model is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
A knowledge base architecture for distributed knowledge agents
NASA Technical Reports Server (NTRS)
Riedesel, Joel; Walls, Bryan
1990-01-01
A tuple space based object-oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms, as in the LINDA model, as well as more conventional message-passing mechanisms. An implementation of the model is presented, describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
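A minimal Linda-style tuple-space sketch of the coordination mechanism described above (names and structure are illustrative assumptions, not the paper's implementation):

class TupleSpace:
    def __init__(self):
        self._tuples = []

    def put(self, *tup):
        self._tuples.append(tup)

    def _match(self, tup, pattern):
        # None acts as a wildcard field in the pattern
        return len(tup) == len(pattern) and all(p is None or p == v for p, v in zip(pattern, tup))

    def read(self, *pattern):   # non-destructive lookup
        return next((t for t in self._tuples if self._match(t, pattern)), None)

    def take(self, *pattern):   # destructive lookup
        t = self.read(*pattern)
        if t is not None:
            self._tuples.remove(t)
        return t

# two knowledge agents coordinating through the shared space
space = TupleSpace()
space.put("fact", "power_bus_A", "overloaded")        # a monitoring agent asserts a fact
print(space.take("fact", "power_bus_A", None))        # a planning agent consumes it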
Online Knowledge-Based Model for Big Data Topic Extraction.
Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan
2016-01-01
Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost to half.
The research on construction and application of machining process knowledge base
NASA Astrophysics Data System (ADS)
Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai
2018-03-01
In order to apply knowledge to machining process design, and from the perspective of knowledge use in computer aided process planning (CAPP), a hierarchical structure of knowledge classification is established according to the characteristics of the mechanical engineering field. The representation of machining process knowledge is structured by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge model, and the application flow of process design based on the knowledge base are given, and the main steps of the design decision for the machine tool are carried out as an application using the knowledge base.
On the formulation of the aerodynamic characteristics in aircraft dynamics
NASA Technical Reports Server (NTRS)
Tobak, M.; Schiff, L. B.
1976-01-01
The theory of functionals is used to reformulate the notions of aerodynamic indicial functions and superposition. Integral forms for the aerodynamic response to arbitrary motions are derived that are free of dependence on a linearity assumption. Simplifications of the integral forms lead to practicable nonlinear generalizations of the linear superpositions and stability derivative formulations. Applied to arbitrary nonplanar motions, the generalization yields a form for the aerodynamic response that can be compounded of the contributions from a limited number of well-defined characteristic motions, in principle reproducible in the wind tunnel. Further generalizations that would enable the consideration of random fluctuations and multivalued aerodynamic responses are indicated.
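For reference, the classical linear superposition that the paper generalizes can be written, e.g. for a lift coefficient responding to an angle-of-attack history \alpha(\tau) (standard indicial/Duhamel form, added for context):

C_L(t) = C_L(0) + \int_0^{t} C_{L_\alpha}(t - \tau)\, \frac{d\alpha}{d\tau}\, d\tau ,

where C_{L_\alpha}(t) is the indicial response to a unit step in \alpha; in the nonlinear generalization the indicial kernel is permitted to depend on the motion history rather than only on the elapsed time.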
Relaxation in control systems of subdifferential type
NASA Astrophysics Data System (ADS)
Tolstonogov, A. A.
2006-02-01
In a separable Hilbert space we consider a control system with evolution operators that are subdifferentials of a proper convex lower semicontinuous function depending on time. The constraint on the control is given by a multivalued function with non-convex values that is lower semicontinuous with respect to the variable states. Along with the original system we consider the system in which the constraint on the control is the upper semicontinuous convex-valued regularization of the original constraint. We study relations between the solution sets of these systems. As an application we consider a control variational inequality. We give an example of a control system of parabolic type with an obstacle.
Experimental testing of the noise-canceling processor.
Collins, Michael D; Baer, Ralph N; Simpson, Harry J
2011-09-01
Signal-processing techniques for localizing an acoustic source buried in noise are tested in a tank experiment. Noise is generated using a discrete source, a bubble generator, and a sprinkler. The experiment has essential elements of a realistic scenario in matched-field processing, including complex source and noise time series in a waveguide with water, sediment, and multipath propagation. The noise-canceling processor is found to outperform the Bartlett processor and provide the correct source range for signal-to-noise ratios below -10 dB. The multivalued Bartlett processor is found to outperform the Bartlett processor but not the noise-canceling processor. © 2011 Acoustical Society of America
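For context, the conventional Bartlett processor used as the baseline above forms an ambiguity surface over candidate source locations m (standard matched-field form; the noise-canceling processor itself is not reproduced here):

B(\mathbf{m}) = \hat{\mathbf{w}}(\mathbf{m})^{H}\, \mathbf{K}\, \hat{\mathbf{w}}(\mathbf{m}), \qquad \hat{\mathbf{w}}(\mathbf{m}) = \frac{\mathbf{w}(\mathbf{m})}{\lVert \mathbf{w}(\mathbf{m}) \rVert} ,

where K is the cross-spectral (covariance) matrix estimated from the array snapshots and \mathbf{w}(\mathbf{m}) is the modeled replica field; the source location estimate is the m that maximizes B.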
Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith
2013-01-01
Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the promoting action on research implementation of health services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The usefulness of the evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts.
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2017-05-01
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues on usage of simple heuristics and psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.
Computational neuroanatomy: ontology-based representation of neural components and connectivity.
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-02-05
A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
Quantum anonymous voting with unweighted continuous-variable graph states
NASA Astrophysics Data System (ADS)
Guo, Ying; Feng, Yanyan; Zeng, Guihua
2016-08-01
Motivated by the revealing topological structures of the continuous-variable graph state (CVGS), we investigate the design of a quantum voting scheme, which has significant advantages over conventional ones in terms of efficiency and graphicness. Three phases are included, i.e., the preparing phase, the voting phase and the counting phase, together with three parties, i.e., the voters, the tallyman and the ballot agency. Two major voting operations are performed on the yielded CVGS in the voting process, namely the local rotation transformation and the displacement operation. The voting information is carried by the CVGS established beforehand, whose persistent entanglement is deployed to keep the privacy of votes and the anonymity of legal voters. For practical applications, two CVGS-based quantum ballots, i.e., comparative ballot and anonymous survey, are specially designed, followed by the extended ballot schemes for binary-valued and multi-valued ballots under some constraints on the voting design. Security is ensured by the entanglement of the CVGS, the voting operations and the laws of quantum mechanics. The proposed schemes can be implemented using standard off-the-shelf components when compared to discrete-variable quantum voting schemes, owing to the characteristics of CV-based quantum cryptography.
NASA Technical Reports Server (NTRS)
Nieten, Joseph L.; Seraphine, Kathleen M.
1991-01-01
Procedural modeling systems, rule based modeling systems, and a method for converting a procedural model to a rule based model are described. Simulation models are used to represent real time engineering systems. A real time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language. Therefore, they must be enhanced with a reaction capability. Rule based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule based system can be generated by a knowledge acquisition tool or a source level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule based system. Neural models can provide the high capacity data manipulation required by the most complex real time models.
Hardware implementation of fuzzy Petri net as a controller.
Gniewek, Lesław; Kluska, Jacek
2004-06-01
The paper presents a new approach to the fuzzy Petri net (FPN) and its hardware implementation. The authors' motivation is as follows. Complex industrial processes can often be decomposed into many parallelly working subprocesses, which can, in turn, be modeled using Petri nets. If all the process variables (or events) are assumed to be two-valued signals, then it is possible to obtain a hardware or software control device which works according to the algorithm described by a conventional Petri net. However, the values of real signals are contained in some bounded interval and can be interpreted as events which are not only true or false, but rather true to some degree from the interval [0, 1]. Such a natural interpretation from the multivalued logic (fuzzy logic) point of view concerns sensor outputs, control signals, time expiration, etc. It leads to the idea of an FPN as a controller, which one can obtain rather simply, and which would be able to process both analog and binary signals. In the paper both graphical and algebraic representations of the proposed FPN are given. The conditions under which transitions can be fired are described. The algebraic description of the net and a theorem which enables computation of the new marking in the net, based on the current marking, are formulated. A hardware implementation of the FPN, which uses fuzzy JK flip-flops and fuzzy gates, is proposed. An example illustrating the usefulness of the proposed FPN for control algorithm description, and its synthesis as a controller device for a concrete production process, are presented.
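A toy sketch of a fuzzy transition step in the spirit described above, using one common min/max convention (the paper defines its own firing conditions and marking theorem, so this is only an assumed illustration):

def fire(marking, inputs, outputs):
    # marking maps each place to a truth degree in [0, 1]
    degree = min(marking.get(p, 0.0) for p in inputs)           # enabling degree (min as t-norm)
    new_marking = dict(marking)
    for p in outputs:
        new_marking[p] = max(new_marking.get(p, 0.0), degree)   # accumulate with max as s-norm
    return degree, new_marking

# analog sensor readings interpreted as degrees of truth
degree, m = fire({"valve_open": 0.9, "temp_high": 0.6},
                 inputs=["valve_open", "temp_high"], outputs=["start_cooling"])
print(degree, m["start_cooling"])   # 0.6 0.6

Consumption of input tokens (and the exact t-norm/s-norm pair) varies between FPN formulations and is deliberately left out of this sketch.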
Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan
2013-10-01
To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
A new intrusion prevention model using planning knowledge graph
NASA Astrophysics Data System (ADS)
Cai, Zengyu; Feng, Yuan; Liu, Shuru; Gan, Yong
2013-03-01
Intelligent plan is a very important research in artificial intelligence, which has applied in network security. This paper proposes a new intrusion prevention model base on planning knowledge graph and discuses the system architecture and characteristics of this model. The Intrusion Prevention based on plan knowledge graph is completed by plan recognition based on planning knowledge graph, and the Intrusion response strategies and actions are completed by the hierarchical task network (HTN) planner in this paper. Intrusion prevention system has the advantages of intelligent planning, which has the advantage of the knowledge-sharing, the response focused, learning autonomy and protective ability.
Memory Applications Using Resonant Tunneling Diodes
NASA Astrophysics Data System (ADS)
Shieh, Ming-Huei
Resonant tunneling diodes (RTDs) producing unique folding current-voltage (I-V) characteristics have attracted considerable research attention due to their promising application in signal processing and multi-valued logic. The negative differential resistance of RTDs renders the operating points self-latching and stable. We have proposed a multiple-dimensional multiple-state RTD-based static random-access memory (SRAM) cell in which the number of stable states can be increased significantly, to (N+1)^m or more for m N-peak RTDs connected in series. The proposed cells take advantage of the hysteresis and folding I-V characteristics of RTDs. Several cell designs are presented and evaluated. A two-dimensional nine-state memory cell has been implemented and demonstrated by a breadboard circuit using two 2-peak RTDs. The hysteresis phenomenon in a series of RTDs is also further analyzed. The switch model provided in SPICE 3 can be utilized to simulate the hysteretic I-V characteristics of RTDs. A simple macro-circuit is described to model the hysteretic I-V characteristic of an RTD for circuit simulation. A new scheme for storing word-wide multiple-bit information very efficiently in a single memory cell using RTDs is proposed. An efficient and inexpensive periphery circuit to read from and write into the cell is also described. Simulation results on the design of a 3-bit memory cell scheme using one-peak RTDs are also presented. Finally, a binary transistor-less memory cell, composed only of a pair of RTDs and an ordinary rectifier diode, is presented and investigated. A simple means for reading and writing information from or into the memory cell is also discussed.
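A short worked instance of the state count quoted above:

N_{\mathrm{states}} = (N+1)^{m}, \qquad m = 2 \ \text{series-connected 2-peak RTDs} \;\Rightarrow\; (2+1)^{2} = 9 ,

which matches the nine-state, two-dimensional cell demonstrated with the breadboard circuit.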
A Model of Knowledge Based Information Retrieval with Hierarchical Concept Graph.
ERIC Educational Resources Information Center
Kim, Young Whan; Kim, Jin H.
1990-01-01
Proposes a model of knowledge-based information retrieval (KBIR) that is based on a hierarchical concept graph (HCG) which shows relationships between index terms and constitutes a hierarchical thesaurus as a knowledge base. Conceptual distance between a query and an object is discussed and the use of Boolean operators is described. (25…
Knowledge representation to support reasoning based on multiple models
NASA Technical Reports Server (NTRS)
Gillam, April; Seidel, Jorge P.; Parker, Alice C.
1990-01-01
Model Based Reasoning is a powerful tool used to design and analyze systems, which are often composed of numerous interactive, interrelated subsystems. Models of the subsystems are written independently and may be used together while they are still under development. Thus the models are not static. They evolve as information becomes obsolete, as improved artifact descriptions are developed, and as system capabilities change. Researchers are using three methods to support knowledge/data base growth, to track the model evolution, and to handle knowledge from diverse domains. First, the representation methodology is based on having pools, or types, of knowledge from which each model is constructed. In addition information is explicit. This includes the interactions between components, the description of the artifact structure, and the constraints and limitations of the models. The third principle we have followed is the separation of the data and knowledge from the inferencing and equation solving mechanisms. This methodology is used in two distinct knowledge-based systems: one for the design of space systems and another for the synthesis of VLSI circuits. It has facilitated the growth and evolution of our models, made accountability of results explicit, and provided credibility for the user community. These capabilities have been implemented and are being used in actual design projects.
Knowledge Representation and Ontologies
NASA Astrophysics Data System (ADS)
Grimm, Stephan
Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for computational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.
Arranging ISO 13606 archetypes into a knowledge base.
Kopanitsa, Georgy
2014-01-01
To enable the efficient reuse of standard based medical data we propose to develop a higher level information model that will complement the archetype model of ISO 13606. This model will make use of the relationships that are specified in UML to connect medical archetypes into a knowledge base within a repository. UML connectors were analyzed for their ability to be applied in the implementation of a higher level model that will establish relationships between archetypes. An information model was developed using XML Schema notation. The model allows linking different archetypes of one repository into a knowledge base. Presently it supports several relationships and will be advanced in future.
A Model to Assess the Behavioral Impacts of Consultative Knowledge Based Systems.
ERIC Educational Resources Information Center
Mak, Brenda; Lyytinen, Kalle
1997-01-01
This research model studies the behavioral impacts of consultative knowledge based systems (KBS). A study of graduate students explored to what extent their decisions were affected by user participation in updating the knowledge base; ambiguity of decision setting; routinization of usage; and source credibility of the expertise embedded in the…
Distributed, cooperating knowledge-based systems
NASA Technical Reports Server (NTRS)
Truszkowski, Walt
1991-01-01
Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.
Sugihara, Masahiro
2010-01-01
In survival analysis, treatment effects are commonly evaluated based on survival curves and hazard ratios as causal treatment effects. In observational studies, these estimates may be biased due to confounding factors. The inverse probability of treatment weighted (IPTW) method based on the propensity score is one of the approaches utilized to adjust for confounding factors between binary treatment groups. As a generalization of this methodology, we developed an exact formula for an IPTW log-rank test based on the generalized propensity score for survival data. This makes it possible to compare the group differences of IPTW Kaplan-Meier estimators of survival curves using an IPTW log-rank test for multi-valued treatments. As causal treatment effects, the hazard ratio can be estimated using the IPTW approach. If the treatments correspond to ordered levels of a treatment, the proposed method can be easily extended to the analysis of treatment effect patterns with contrast statistics. In this paper, the proposed method is illustrated with data from the Kyushu Lipid Intervention Study (KLIS), which investigated the primary preventive effects of pravastatin on coronary heart disease (CHD). The results of the proposed method suggested that pravastatin treatment reduces the risk of CHD and that compliance to pravastatin treatment is important for the prevention of CHD. (c) 2009 John Wiley & Sons, Ltd.
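The basic construction underlying this approach is the standard IPTW weighting for a multi-valued treatment (sketched here for context; the paper derives the exact weighted log-rank formula from it):

w_i = \frac{1}{\Pr(Z_i = z_i \mid X_i)}, \qquad
\tilde{Y}_z(t) = \sum_{i:\, Z_i = z} w_i\, \mathbf{1}\{T_i \ge t\}, \qquad
\tilde{N}_z(t) = \sum_{i:\, Z_i = z} w_i\, \mathbf{1}\{T_i \le t,\ \delta_i = 1\} ,

where \Pr(Z_i = z_i \mid X_i) is the generalized propensity score (the probability of receiving the treatment actually received, given covariates), T_i the observed time and \delta_i the event indicator; the IPTW Kaplan-Meier curves and the IPTW log-rank statistic are then built from these weighted at-risk and event counts in place of the unweighted ones.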
Trust from the past: Bayesian Personalized Ranking based Link Prediction in Knowledge Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Baichuan; Choudhury, Sutanay; Al-Hasan, Mohammad
2016-02-01
Estimating the confidence for a link is a critical task for Knowledge Graph construction. Link prediction, or predicting the likelihood of a link in a knowledge graph based on prior state, is a key research direction within this area. We propose a Latent Feature Embedding based link recommendation model for the prediction task and utilize a Bayesian Personalized Ranking based optimization technique for learning models for each predicate. Experimental results on large-scale knowledge bases such as YAGO2 show that our approach achieves substantially higher performance than several state-of-the-art approaches. Furthermore, we also study the performance of the link prediction algorithm in terms of topological properties of the Knowledge Graph and present a linear regression model to reason about its expected level of accuracy.
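A minimal numerical sketch of BPR-style training for link scoring (an assumed dot-product latent-feature model with toy data; the paper's embedding, features and per-predicate setup may differ):

import numpy as np

rng = np.random.default_rng(0)
n_entities, dim, lr, reg = 100, 16, 0.05, 1e-4
E = 0.1 * rng.standard_normal((n_entities, dim))    # latent embeddings for one predicate

def score(s, o):
    return E[s] @ E[o]                               # higher score ~ higher link confidence

def bpr_step(s, o_pos, o_neg):
    # one SGD step on the BPR loss -ln sigma(score(s, o_pos) - score(s, o_neg))
    x = score(s, o_pos) - score(s, o_neg)
    g = 1.0 / (1.0 + np.exp(x))                      # = sigma(-x), the gradient scale
    es, ep, en = E[s].copy(), E[o_pos].copy(), E[o_neg].copy()
    E[s]     += lr * (g * (ep - en) - reg * es)
    E[o_pos] += lr * (g * es - reg * ep)
    E[o_neg] += lr * (-g * es - reg * en)

# observed (subject, object) links for this predicate; negatives sampled uniformly
observed = [(rng.integers(n_entities), rng.integers(n_entities)) for _ in range(1000)]
for epoch in range(20):
    for s, o in observed:
        bpr_step(s, o, rng.integers(n_entities))
print(score(*observed[0]))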
More than Anecdotes: Fishers' Ecological Knowledge Can Fill Gaps for Ecosystem Modeling.
Bevilacqua, Ana Helena V; Carvalho, Adriana R; Angelini, Ronaldo; Christensen, Villy
2016-01-01
Ecosystem modeling applied to fisheries remains hampered by a lack of local information. Fishers' knowledge could fill this gap, improving participation in and the management of fisheries. The same fishing area was modeled using two approaches: based on fishers' knowledge and based on scientific information. For the former, the data was collected by interviews through the Delphi methodology, and for the latter, the data was gathered from the literature. Agreement between the attributes generated by the fishers' knowledge model and scientific model is discussed and explored, aiming to improve data availability, the ecosystem model, and fisheries management. The ecosystem attributes produced from the fishers' knowledge model were consistent with the ecosystem attributes produced by the scientific model, and elaborated using only the scientific data from literature. This study provides evidence that fishers' knowledge may suitably complement scientific data, and may improve the modeling tools for the research and management of fisheries.
Translating three states of knowledge--discovery, invention, and innovation
2010-01-01
Background Knowledge Translation (KT) has historically focused on the proper use of knowledge in healthcare delivery. A knowledge base has been created through empirical research and resides in scholarly literature. Some knowledge is amenable to direct application by stakeholders who are engaged during or after the research process, as shown by the Knowledge to Action (KTA) model. Other knowledge requires multiple transformations before achieving utility for end users. For example, conceptual knowledge generated through science or engineering may become embodied as a technology-based invention through development methods. The invention may then be integrated within an innovative device or service through production methods. To what extent is KT relevant to these transformations? How might the KTA model accommodate these additional development and production activities while preserving the KT concepts? Discussion Stakeholders adopt and use knowledge that has perceived utility, such as a solution to a problem. Achieving a technology-based solution involves three methods that generate knowledge in three states, analogous to the three classic states of matter. Research activity generates discoveries that are intangible and highly malleable like a gas; development activity transforms discoveries into inventions that are moderately tangible yet still malleable like a liquid; and production activity transforms inventions into innovations that are tangible and immutable like a solid. The paper demonstrates how the KTA model can accommodate all three types of activity and address all three states of knowledge. Linking the three activities in one model also illustrates the importance of engaging the relevant stakeholders prior to initiating any knowledge-related activities. Summary Science and engineering focused on technology-based devices or services change the state of knowledge through three successive activities. Achieving knowledge implementation requires methods that accommodate these three activities and knowledge states. Accomplishing beneficial societal impacts from technology-based knowledge involves the successful progression through all three activities, and the effective communication of each successive knowledge state to the relevant stakeholders. The KTA model appears suitable for structuring and linking these processes. PMID:20205873
An, Gary; Christley, Scott
2012-01-01
Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggest that this modeling framework is well suited for addressing the translational dilemma. This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge by the biomedical research community at large.
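A deliberately tiny agent-based sketch of the modeling style described above (toy rules only, not a validated inflammation model): immune-cell agents execute local rules on a grid, and recruitment is triggered by contact with damaged tissue.

import random

random.seed(1)
SIZE, STEPS = 20, 60
damage = {(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(40)}
agents = [(SIZE // 2, SIZE // 2)] * 5

def step(agents, damage):
    moved = []
    for (x, y) in agents:
        x = (x + random.choice((-1, 0, 1))) % SIZE          # rule 1: random walk
        y = (y + random.choice((-1, 0, 1))) % SIZE
        if (x, y) in damage:                                # rule 2: clear damage on contact
            damage.discard((x, y))
            moved.append((x, y))                            # rule 3: recruit a new agent
        moved.append((x, y))
    return moved

for _ in range(STEPS):
    agents = step(agents, damage)
print(f"remaining damage sites: {len(damage)}, agents: {len(agents)}")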
An object-relational model for structured representation of medical knowledge.
Koch, S; Risch, T; Schneider, W; Wagner, I V
2006-07-01
Domain specific knowledge is often not static but continuously evolving. This is especially true for the medical domain. Furthermore, the lack of standardized structures for presenting knowledge makes it difficult or often impossible to assess new knowledge in the context of existing knowledge. Possibilities to compare knowledge easily and directly are often not given. It is therefore of utmost importance to create a model that allows for comparability, consistency and quality assurance of medical knowledge in specific work situations. For this purpose, we have designed an object-relational model based on structured knowledge elements that are dynamically reusable by different multi-media-based tools for case-based documentation, disease course simulation, and decision support. With this model, high-level components, such as patient case reports or simulations of the course of a disease, and low-level components (e.g., diagnoses, symptoms or treatments), as well as the relationships between these components, are modeled. The resulting schema has been implemented in AMOS II, an object-relational multi-database system supporting different views with regard to search and analysis depending on different work situations.
An alternative extragradient projection method for quasi-equilibrium problems.
Chen, Haibin; Wang, Yiju; Xu, Yi
2018-01-01
For the quasi-equilibrium problem, in which the players' costs and their strategies both depend on the rivals' decisions, an alternative extragradient projection method is designed. Different from the classical extragradient projection method, whose generated sequence has the contraction property with respect to the solution set, the newly designed method possesses an expansion property with respect to a given initial point. The global convergence of the method is established under the assumptions of pseudomonotonicity of the equilibrium function and continuity of the underlying multi-valued mapping. Furthermore, we show that the generated sequence converges to the point in the solution set nearest to the initial point. Numerical experiments show the efficiency of the method.
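For comparison, the classical extragradient projection iteration that the paper modifies looks as follows (a sketch with an assumed toy affine operator and box constraint, not the alternative method of the paper):

import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    return np.clip(x, lo, hi)                  # projection onto the feasible set

def F(x):
    # toy monotone affine operator F(x) = Ax + b standing in for the equilibrium mapping
    A = np.array([[2.0, 1.0], [1.0, 2.0]])
    b = np.array([-1.0, 0.5])
    return A @ x + b

x, gamma = np.array([0.9, -0.9]), 0.2          # step size kept below 1/L for the Lipschitz constant L
for _ in range(200):
    y = project_box(x - gamma * F(x))          # predictor (extra) step
    x = project_box(x - gamma * F(y))          # corrector step
print(x)

The classical iteration contracts toward the solution set; the paper's alternative method instead expands away from the starting point while still converging to the solution nearest to it.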
Improved knowledge diffusion model based on the collaboration hypernetwork
NASA Astrophysics Data System (ADS)
Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo
2015-06-01
The process of absorbing knowledge has become an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge will spread from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ^2(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. By analyzing different knowledge diffusion schemes, ways of selecting the highly knowledgeable nodes, hypernetwork sizes and hypernetwork structures, we show that the diffusion speed of the IKDH model is 3.64 times faster than that of the traditional knowledge diffusion (TKDH) model. Besides, it is three times faster to diffuse knowledge by randomly selecting "expert" nodes than by selecting large-hyperdegree nodes as "expert" nodes. Furthermore, either a more closely connected network structure or a smaller network size results in faster knowledge diffusion.
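A toy numerical sketch of hyperedge-mediated diffusion in this spirit (the update rule, parameters and random hypergraph are assumptions, not the IKDH model itself), tracking the evaluation quantities V(t), σ^2(t) and c(t):

import random
import statistics

random.seed(0)
n, alpha, steps = 50, 0.1, 500
hyperedges = [random.sample(range(n), 4) for _ in range(30)]      # random 4-node hyperedges
stock = [random.uniform(0.0, 10.0) for _ in range(n)]             # initial knowledge stocks

for _ in range(steps):
    target = random.randrange(n)
    neighbours = {v for e in hyperedges if target in e for v in e} - {target}
    for v in neighbours:                                          # knowledge flows downhill only
        if stock[target] > stock[v]:
            stock[v] += alpha * (stock[target] - stock[v])

V = statistics.mean(stock)
var = statistics.pvariance(stock)
print(f"V(t) = {V:.2f}, sigma^2(t) = {var:.3f}, c(t) = {var ** 0.5 / V:.3f}")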
Jump Resonance in Fractional Order Circuits
NASA Astrophysics Data System (ADS)
Buscarino, Arturo; Caponetto, Riccardo; Famoso, Carlo; Fortuna, Luigi
The occurrence of a hysteretic loop in the frequency response of a driven nonlinear system is a phenomenon deeply investigated in nonlinear control theory. Such a phenomenon, which is linked to the multistable behavior of the system, is called jump resonance, since the magnitude of the frequency response undergoes an abrupt jump up/down as the frequency of the driving signal increases/decreases. In this paper, we investigate fractional order nonlinear systems showing jump resonance, that is, systems in which the order of the derivative is noninteger and whose frequency response has a magnitude that is a multivalued function in a given range of frequencies. Furthermore, a strategy for designing fractional order systems showing jump resonance is presented, along with the procedure to design and implement an analog circuit based on the approximation of the fractional order derivative. An extensive numerical analysis shows that the phenomenon is robust to variations in the derivative order, providing the first example of a system with order lower than two that is able to exhibit jump resonance behavior.
Knowledge repositories for multiple uses
NASA Technical Reports Server (NTRS)
Williamson, Keith; Riddle, Patricia
1991-01-01
In the life cycle of a complex physical device or part, for example, the docking bay door of the Space Station, there are many uses for knowledge about the device or part. The same piece of knowledge might serve several uses. Given the quantity and complexity of the knowledge that must be stored, it is critical to maintain the knowledge in one repository, in one form. At the same time, because of the quantity and complexity of knowledge that must be used in life cycle applications such as cost estimation, re-design, and diagnosis, it is critical to automate such knowledge uses. For each specific use, a knowledge base must be available and must be in a form that promotes the efficient performance of that knowledge base. However, without a single-source knowledge repository, the cost of maintaining consistent knowledge between multiple knowledge bases increases dramatically; as facts and descriptions change, they must be updated in each individual knowledge base. A use-neutral representation of a hydraulic system for the F-111 aircraft was developed. The ability to derive portions of four different knowledge bases from this use-neutral representation is demonstrated: one knowledge base is for re-design of the device using a model-based reasoning problem solver; two knowledge bases, at different levels of abstraction, are for diagnosis using a model-based reasoning solver; and one knowledge base is for diagnosis using an associational reasoning problem solver. It is shown how updates issued against the single-source use-neutral knowledge repository can be propagated to the underlying knowledge bases.
Epistemological Beliefs and Knowledge Sharing in Work Teams: A New Model and Research Questions
ERIC Educational Resources Information Center
Weinberg, Frankie J.
2015-01-01
Purpose: The purpose of this paper is to present a knowledge-sharing model that explains individual members' motivation to share knowledge (knowledge donation and knowledge collection). Design/methodology/approach: The model is based on social-constructivist theories of epistemological beliefs, learning and distributed cognition, and is organized…
Arranging ISO 13606 archetypes into a knowledge base using UML connectors.
Kopanitsa, Georgy
2014-01-01
To enable the efficient reuse of standard based medical data we propose to develop a higher-level information model that will complement the archetype model of ISO 13606. This model will make use of the relationships that are specified in UML to connect medical archetypes into a knowledge base within a repository. UML connectors were analysed for their ability to be applied in the implementation of a higher-level model that will establish relationships between archetypes. An information model was developed using XML Schema notation. The model allows linking different archetypes of one repository into a knowledge base. Presently it supports several relationships and will be advanced in future.
Knowledge management in Portuguese healthcare institutions.
Cruz, Sofia Gaspar; Ferreira, Maria Manuela Frederico
2016-06-01
Knowledge management imposes itself as a pressing need for organizations in several sectors of the economy, including healthcare. To evaluate the perception of healthcare institution collaborators regarding knowledge management in the institutions where they work, and to analyze differences in this perception based on the institution's management model, a study was conducted on a sample of 671 collaborators from 10 Portuguese healthcare institutions with different management models. To assess the perception of knowledge management, we used a score designed from and based on items from scores available in the literature. The results evidenced a perception of moderate knowledge management in the healthcare institutions and statistically significant differences in this perception across management models. Knowledge management does take place in healthcare institutions, and the current management model determines the way staff at these institutions manage their knowledge.
XML-Based SHINE Knowledge Base Interchange Language
NASA Technical Reports Server (NTRS)
James, Mark; Mackey, Ryan; Tikidjian, Raffi
2008-01-01
The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.
NASA Astrophysics Data System (ADS)
Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward
2018-04-01
A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely-used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
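A minimal echo-state-network (reservoir computing) sketch of the machine-learning component (illustrative sizes, parameters and a logistic-map series standing in for the real system); in the hybrid scheme of the paper, the knowledge-based model's prediction would additionally be fed into the reservoir and the linear readout.

import numpy as np

rng = np.random.default_rng(42)
N, rho, beta = 200, 0.9, 1e-6                       # reservoir size, spectral radius, ridge parameter

A = rng.standard_normal((N, N))
A *= rho / max(abs(np.linalg.eigvals(A)))           # rescale to the chosen spectral radius
W_in = rng.uniform(-0.5, 0.5, size=N)

u = np.empty(2000)                                  # training series: chaotic logistic map
u[0] = 0.4
for t in range(1999):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

r = np.zeros(N)
states = []
for t in range(1999):                               # drive the reservoir with the input series
    r = np.tanh(A @ r + W_in * u[t])
    states.append(r.copy())
R = np.array(states[200:])                          # discard the initial transient
Y = u[201:2000]                                     # one-step-ahead targets

W_out = np.linalg.solve(R.T @ R + beta * np.eye(N), R.T @ Y)   # ridge-regression readout

pred = u[1999]
for _ in range(5):                                  # closed-loop forecast: feed predictions back
    r = np.tanh(A @ r + W_in * pred)
    pred = W_out @ r
    print(pred)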
Teacher Education: Considerations for a Knowledge Base Framework.
ERIC Educational Resources Information Center
Tumposky, Nancy
Traditionally, the knowledge base has been defined more as product than process and has encompassed definitions, principles, values, and facts. Recent reforms in teaching and teacher education have brought about efforts to redefine the knowledge base. The reconceptualized knowledge base builds upon the earlier model but gives higher priority to…
Sticky knowledge: A possible model for investigating implementation in healthcare contexts
Elwyn, Glyn; Taubert, Mark; Kowalczuk, Jenny
2007-01-01
Background In health care, a well recognized gap exists between what we know should be done based on accumulated evidence and what we actually do in practice. A body of empirical literature shows organizations, like individuals, are difficult to change. In the business literature, knowledge management and transfer has become an established area of theory and practice, whilst in healthcare it is only starting to establish a firm footing. Knowledge has become a business resource, and knowledge management theorists and practitioners have examined how knowledge moves in organisations, how it is shared, and how the return on knowledge capital can be maximised to create competitive advantage. New models are being considered, and we wanted to explore the applicability of one of these conceptual models to the implementation of evidence-based practice in healthcare systems. Methods The application of a conceptual model called sticky knowledge, based on an integration of communication theory and knowledge transfer milestones, into a scenario of attempting knowledge transfer in primary care. Results We describe Szulanski's model, the empirical work he conducted, and illustrate its potential applicability with a hypothetical healthcare example based on improving palliative care services. We follow a doctor through two different posts and analyse aspects of knowledge transfer in different primary care settings. The factors included in the sticky knowledge model include: causal ambiguity, unproven knowledge, motivation of source, credibility of source, recipient motivation, recipient absorptive capacity, recipient retentive capacity, barren organisational context, and arduous relationship between source and recipient. We found that we could apply all these factors to the difficulty of implementing new knowledge into practice in primary care settings. Discussion Szulanski argues that knowledge factors play a greater role in the success or failure of a knowledge transfer than has been suspected, and we consider that this conjecture requires further empirical work in healthcare settings. PMID:18096040
Knowledge modeling of coal mining equipments based on ontology
NASA Astrophysics Data System (ADS)
Zhang, Baolong; Wang, Xiangqian; Li, Huizong; Jiang, Miaomiao
2017-06-01
The problems of information redundancy and sharing are universal in coal mining equipment management. In order to improve the efficiency with which knowledge about coal mining equipment is used, this paper proposes a new method of knowledge modeling based on ontology. On the basis of analyzing the structures and internal relations of coal mining equipment knowledge, and taking OWL as the ontology construction language, the ontology model of coal mining equipment knowledge is built with the help of the Protégé 4.3 software tools. This knowledge description method lays the foundation for highly effective knowledge management and sharing, which is very significant for improving the production management level of coal mining enterprises.
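The abstract names OWL and Protégé as the modelling tools. As a rough programmatic analogue (not the authors' actual ontology), a small fragment of such an equipment ontology could be expressed with the rdflib library; the namespace, class, and property names below are invented for illustration.

from rdflib import Graph, Namespace, RDF, RDFS, Literal
from rdflib.namespace import OWL

# Hypothetical namespace for the equipment ontology
EQ = Namespace("http://example.org/coal-mining-equipment#")

g = Graph()
g.bind("eq", EQ)

# Class hierarchy: a shearer is a kind of mining equipment
g.add((EQ.MiningEquipment, RDF.type, OWL.Class))
g.add((EQ.Shearer, RDF.type, OWL.Class))
g.add((EQ.Shearer, RDFS.subClassOf, EQ.MiningEquipment))

# An object property relating equipment to its working face
g.add((EQ.installedAt, RDF.type, OWL.ObjectProperty))
g.add((EQ.installedAt, RDFS.domain, EQ.MiningEquipment))

# An individual with a human-readable label
g.add((EQ.shearer_01, RDF.type, EQ.Shearer))
g.add((EQ.shearer_01, RDFS.label, Literal("MG300 shearer")))

print(g.serialize(format="turtle"))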
Abidi, Samina
2017-10-26
Clinical management of comorbidities is a challenge, especially in a clinical decision support setting, as it requires the safe and efficient reconciliation of multiple disease-specific clinical procedures to formulate a comorbid therapeutic plan that is both effective and safe for the patient. In this paper we pursue the integration of multiple disease-specific Clinical Practice Guidelines (CPG) in order to manage co-morbidities within a computerized Clinical Decision Support System (CDSS). We present a CPG integration framework-termed as COMET (Comorbidity Ontological Modeling & ExecuTion) that manifests a knowledge management approach to model, computerize and integrate multiple CPG to yield a comorbid CPG knowledge model that upon execution can provide evidence-based recommendations for handling comorbid patients. COMET exploits semantic web technologies to achieve (a) CPG knowledge synthesis to translate a paper-based CPG to disease-specific clinical pathways (CP) that include specialized co-morbidity management procedures based on input from domain experts; (b) CPG knowledge modeling to computerize the disease-specific CP using a Comorbidity CPG ontology; (c) CPG knowledge integration by aligning multiple ontologically-modeled CP to develop a unified comorbid CPG knowledge model; and (e) CPG knowledge execution using reasoning engines to derive CPG-mediated recommendations for managing patients with comorbidities. We present a web-accessible COMET CDSS that provides family physicians with CPG-mediated comorbidity decision support to manage Atrial Fibrillation and Chronic Heart Failure. We present our qualitative and quantitative analysis of the knowledge content and usability of COMET CDSS.
Hayes, Brett K; Heit, Evan; Swendsen, Haruka
2010-03-01
Inductive reasoning entails using existing knowledge or observations to make predictions about novel cases. We review recent findings in research on category-based induction as well as theoretical models of these results, including similarity-based models, connectionist networks, an account based on relevance theory, Bayesian models, and other mathematical models. A number of touchstone empirical phenomena that involve taxonomic similarity are described. We also examine phenomena involving more complex background knowledge about premises and conclusions of inductive arguments and the properties referenced. Earlier models are shown to give a good account of similarity-based phenomena but not knowledge-based phenomena. Recent models that aim to account for both similarity-based and knowledge-based phenomena are reviewed and evaluated. Among the most important new directions in induction research are a focus on induction with uncertain premise categories, the modeling of the relationship between inductive and deductive reasoning, and examination of the neural substrates of induction. A common theme in both the well-established and emerging lines of induction research is the need to develop well-articulated and empirically testable formal models of induction. Copyright © 2010 John Wiley & Sons, Ltd.
Using fuzzy rule-based knowledge model for optimum plating conditions search
NASA Astrophysics Data System (ADS)
Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.
2018-03-01
The paper discusses existing approaches to plating process modeling aimed at decreasing the nonuniformity of the coating thickness distribution over the plated surface. However, these approaches do not take into account the experience, knowledge, and intuition of the decision-makers when searching for the optimal conditions of the electroplating technological process. An original approach to searching for the optimal conditions for applying electroplated coatings is proposed; it uses a rule-based knowledge model and allows one to reduce the unevenness of the product thickness distribution. The block diagrams of a conventional control system for a galvanic process, as well as of the system based on the production model of knowledge, are considered. It is shown that the fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.
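The paper's actual rule base is not given in the abstract; the sketch below only illustrates the general mechanism of a fuzzy production rule (triangular membership functions, min for rule firing, centroid defuzzification) on hypothetical plating variables, with all numeric ranges invented.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def evaluate(current_density, temperature):
    """Fire two hypothetical rules and defuzzify the recommended anode voltage."""
    # Membership degrees of the crisp inputs
    high_current = tri(current_density, 2.0, 4.0, 6.0)    # A/dm^2
    low_temp     = tri(temperature, 15.0, 20.0, 25.0)     # deg C

    # Rule 1: IF current is high AND temperature is low THEN voltage is high
    fire1 = min(high_current, low_temp)
    # Rule 2: IF current is high THEN voltage is medium
    fire2 = high_current

    # Output fuzzy sets over a voltage universe, clipped by firing strengths
    v = np.linspace(0.0, 12.0, 241)
    high_v   = np.minimum(tri(v, 6.0, 9.0, 12.0), fire1)
    medium_v = np.minimum(tri(v, 3.0, 6.0, 9.0), fire2)
    agg = np.maximum(high_v, medium_v)

    # Centroid defuzzification
    return float(np.sum(v * agg) / np.sum(agg)) if agg.sum() > 0 else 0.0

print(evaluate(current_density=4.5, temperature=18.0))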
Use of metaknowledge in the verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Morell, Larry J.
1989-01-01
Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.
Qian, S.; Dunham, M.E.
1996-11-12
A system and method are disclosed for constructing a bank of filters which detect the presence of signals whose frequency content varies with time. The present invention includes a novel system and method for developing one or more time templates designed to match the received signals of interest, and the bank of matched filters uses the one or more time templates to detect the received signals. Each matched filter compares the received signal x(t) with a respective, unique time template that has been designed to approximate a form of the signals of interest. The robust time domain template is assumed to be of the form w(t) = A(t)cos(2πφ(t)), and the present invention uses the trajectory of a joint time-frequency representation of x(t) as an approximation of the instantaneous frequency function φ′(t). First, numerous data samples of the received signal x(t) are collected. A joint time-frequency representation is then applied to represent the signal, preferably using the time-frequency distribution series. The joint time-frequency transformation represents the analyzed signal energy at time t and frequency f, P(t,f), which is a three-dimensional plot of time vs. frequency vs. signal energy. Then P(t,f) is reduced to a multivalued function f(t), a two-dimensional plot of time vs. frequency, using a thresholding process. Curve fitting steps are then performed on the time/frequency plot, preferably using Levenberg-Marquardt curve fitting techniques, to derive a general instantaneous frequency function φ′(t) which best fits the multivalued function f(t). Integrating φ′(t) along t yields φ(t), which is then inserted into the form of the time template equation. A suitable amplitude A(t) is also preferably determined. Once the time template has been determined, one or more filters are developed which each use a version or form of the time template. 7 figs.
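A condensed numerical sketch of the pipeline described above (time-frequency representation, reduction to a frequency trajectory, curve fit of the instantaneous frequency, integration to the phase, template construction), using generic library routines rather than the patented implementation; the linear-chirp model fitted here and the simple ridge extraction (argmax instead of thresholding) are assumptions for illustration.

import numpy as np
from scipy.signal import spectrogram, chirp
from scipy.optimize import curve_fit

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
x = chirp(t, f0=50, t1=2.0, f1=150) + 0.3 * np.random.randn(t.size)   # noisy signal of interest

# Joint time-frequency representation P(t, f)
f, tt, P = spectrogram(x, fs=fs, nperseg=128, noverlap=96)

# Reduce P(t, f) to a frequency trajectory f(t) (ridge of maximum energy)
ridge = f[np.argmax(P, axis=0)]

# Fit a parametric instantaneous-frequency model (linear chirp assumed here)
phi_dot_model = lambda s, f0, k: f0 + k * s
(f0, k), _ = curve_fit(phi_dot_model, tt, ridge)        # Levenberg-Marquardt by default

# Integrate the instantaneous frequency to obtain the phase phi(t)
phi = np.cumsum(phi_dot_model(t, f0, k)) / fs

# Time template w(t) = A(t) cos(2*pi*phi(t)); constant amplitude assumed
template = np.cos(2 * np.pi * phi)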
More than Anecdotes: Fishers’ Ecological Knowledge Can Fill Gaps for Ecosystem Modeling
Bevilacqua, Ana Helena V.; Carvalho, Adriana R.; Angelini, Ronaldo; Christensen, Villy
2016-01-01
Background Ecosystem modeling applied to fisheries remains hampered by a lack of local information. Fishers’ knowledge could fill this gap, improving participation in and the management of fisheries. Methodology The same fishing area was modeled using two approaches: based on fishers’ knowledge and based on scientific information. For the former, the data was collected by interviews through the Delphi methodology, and for the latter, the data was gathered from the literature. Agreement between the attributes generated by the fishers’ knowledge model and scientific model is discussed and explored, aiming to improve data availability, the ecosystem model, and fisheries management. Principal Findings The ecosystem attributes produced from the fishers’ knowledge model were consistent with the ecosystem attributes produced by the scientific model, and elaborated using only the scientific data from literature. Conclusions/Significance This study provides evidence that fishers’ knowledge may suitably complement scientific data, and may improve the modeling tools for the research and management of fisheries. PMID:27196131
Knowledge Acquisition of Generic Queries for Information Retrieval
Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.
2002-01-01
Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.
Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training
NASA Astrophysics Data System (ADS)
Macris, A.; Malamateniou, F.; Vassilacopoulos, G.
Successful business process design requires active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.
Knowledge-Based Planning Model for Courses of Action Generation,
1986-04-07
Knowledge-Based Planning Model for Courses of Action Generation, by Colonel D. R. Collins and Lieutenant Colonel (P) T. A. Baucum, Army War College, Carlisle Barracks, PA, 7 April 1986. This document may not be released for open publication until it has been cleared by the appropriate military service or government agency.
Chasin, Rachel; Rumshisky, Anna; Uzuner, Ozlem; Szolovits, Peter
2014-01-01
Objective To evaluate state-of-the-art unsupervised methods on the word sense disambiguation (WSD) task in the clinical domain. In particular, to compare graph-based approaches relying on a clinical knowledge base with bottom-up topic-modeling-based approaches. We investigate several enhancements to the topic-modeling techniques that use domain-specific knowledge sources. Materials and methods The graph-based methods use variations of PageRank and distance-based similarity metrics, operating over the Unified Medical Language System (UMLS). Topic-modeling methods use unlabeled data from the Multiparameter Intelligent Monitoring in Intensive Care (MIMIC II) database to derive models for each ambiguous word. We investigate the impact of using different linguistic features for topic models, including UMLS-based and syntactic features. We use a sense-tagged clinical dataset from the Mayo Clinic for evaluation. Results The topic-modeling methods achieve 66.9% accuracy on a subset of the Mayo Clinic's data, while the graph-based methods only reach the 40–50% range, with a most-frequent-sense baseline of 56.5%. Features derived from the UMLS semantic type and concept hierarchies do not produce a gain over bag-of-words features in the topic models, but identifying phrases from UMLS and using syntax does help. Discussion Although topic models outperform graph-based methods, semantic features derived from the UMLS prove too noisy to improve performance beyond bag-of-words. Conclusions Topic modeling for WSD provides superior results in the clinical domain; however, integration of knowledge remains to be effectively exploited. PMID:24441986
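As a rough, heavily simplified illustration of the topic-modelling side only (not the authors' MIMIC II / UMLS pipeline), the sketch below fits an LDA model to the contexts of one ambiguous word with scikit-learn and assigns a sense by the nearest sense centroid in topic space; the tiny corpus, the number of topics, and the use of a few labelled examples for the centroids are all assumptions made purely for illustration.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy contexts of the ambiguous clinical word "cold" (bag-of-words features)
contexts = [
    "patient reports cold symptoms cough and congestion",
    "viral cold with rhinorrhea and sore throat",
    "extremity cold to touch with weak distal pulse",
    "hands cold and pale suggesting poor perfusion",
]
senses = ["infection", "infection", "temperature", "temperature"]

vec = CountVectorizer()
X = vec.fit_transform(contexts)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)            # per-context topic distributions

# Sense centroids in topic space, from the few labelled contexts above
centroids = {s: theta[[i for i, y in enumerate(senses) if y == s]].mean(axis=0)
             for s in set(senses)}

def disambiguate(new_context):
    """Assign the sense whose topic-space centroid is closest to the context."""
    z = lda.transform(vec.transform([new_context]))[0]
    return min(centroids, key=lambda s: np.linalg.norm(z - centroids[s]))

print(disambiguate("cold foot with absent pulse"))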
System and method for knowledge based matching of users in a network
Verspoor, Cornelia Maria [Santa Fe, NM; Sims, Benjamin Hayden [Los Alamos, NM; Ambrosiano, John Joseph [Los Alamos, NM; Cleland, Timothy James [Los Alamos, NM
2011-04-26
A knowledge-based system and methods for matchmaking and social network extension are disclosed. The system is configured to allow users to specify knowledge profiles, which are collections of concepts that indicate a certain topic or area of interest selected from an underlying knowledge model. The system utilizes the knowledge model as the semantic space within which to compare similarities in user interests. The knowledge model is hierarchical, so that indications of interest in specific concepts automatically imply interest in more general concepts. Similarity measures between profiles may then be calculated based on suitable distance formulas within this space.
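The patent abstract does not spell out the distance formula; one common way to realize "interest in a concept implies interest in its ancestors" is to expand each profile with all ancestor concepts in the hierarchy and compare the expanded sets, as in this illustrative sketch (the concept names and the Jaccard measure are assumptions, not the patented method).

# Hypothetical concept hierarchy: child -> parent
PARENT = {
    "bayesian_networks": "machine_learning",
    "machine_learning": "computer_science",
    "ontologies": "knowledge_representation",
    "knowledge_representation": "computer_science",
    "computer_science": None,
}

def with_ancestors(concepts):
    """Expand a profile so that interest in a concept implies its ancestors."""
    expanded = set()
    for c in concepts:
        while c is not None:
            expanded.add(c)
            c = PARENT.get(c)
    return expanded

def similarity(profile_a, profile_b):
    """Jaccard similarity between ancestor-expanded knowledge profiles."""
    a, b = with_ancestors(profile_a), with_ancestors(profile_b)
    return len(a & b) / len(a | b)

print(similarity({"bayesian_networks"}, {"ontologies"}))   # overlap via 'computer_science'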
A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment
Kibria, Muhammad Golam; Fattah, Sheik Mohammad Mostakim; Jeong, Kwanghyeon; Chong, Ilyoung; Jeong, Youn-Kwae
2015-01-01
User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service. PMID:26393609
Halatchliyski, Iassen; Cress, Ulrike
2014-01-01
Using a longitudinal network analysis approach, we investigate the structural development of the knowledge base of Wikipedia in order to explain the appearance of new knowledge. The data consists of the articles in two adjacent knowledge domains: psychology and education. We analyze the development of networks of knowledge consisting of interlinked articles at seven snapshots from 2006 to 2012 with an interval of one year between them. Longitudinal data on the topological position of each article in the networks is used to model the appearance of new knowledge over time. Thus, the structural dimension of knowledge is related to its dynamics. Using multilevel modeling as well as eigenvector and betweenness measures, we explain the significance of pivotal articles that are either central within one of the knowledge domains or boundary-crossing between the two domains at a given point in time for the future development of new knowledge in the knowledge base. PMID:25365319
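The abstract names eigenvector and betweenness centrality as the structural measures; a minimal sketch of computing them for a single, invented snapshot of an article link network with the networkx library could look as follows.

import networkx as nx

# Toy snapshot of the interlinked-article network (undirected for simplicity)
G = nx.Graph([
    ("Memory", "Learning"), ("Learning", "Education"),
    ("Education", "Curriculum"), ("Learning", "Motivation"),
    ("Motivation", "Memory"),
])

eig = nx.eigenvector_centrality(G)      # centrality within the domain
btw = nx.betweenness_centrality(G)      # brokerage / boundary-crossing position

for article in G:
    print(f"{article:12s} eigenvector={eig[article]:.3f} betweenness={btw[article]:.3f}")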
Reusing Design Knowledge Based on Design Cases and Knowledge Map
ERIC Educational Resources Information Center
Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi
2013-01-01
Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…
Well-posedness for a class of doubly nonlinear stochastic PDEs of divergence type
NASA Astrophysics Data System (ADS)
Scarpa, Luca
2017-08-01
We prove well-posedness for doubly nonlinear parabolic stochastic partial differential equations of the form dX_t - div γ(∇X_t) dt + β(X_t) dt ∋ B(t, X_t) dW_t, where γ and β are the two nonlinearities, assumed to be multivalued maximal monotone operators everywhere defined on R^d and R respectively, and W is a cylindrical Wiener process. Using variational techniques, suitable uniform estimates (both pathwise and in expectation) and some compactness results, well-posedness is proved under the classical Leray-Lions conditions on γ and with no restrictive smoothness or growth assumptions on β. The operator B is assumed to be Hilbert-Schmidt and to satisfy some classical Lipschitz conditions in the second variable.
NASA Astrophysics Data System (ADS)
Shin, Sunhae; Rok Kim, Kyung
2015-06-01
In this paper, we propose a novel multiple negative differential resistance (NDR) device with an ultra-high peak-to-valley current ratio (PVCR) over 10^6, obtained by combining a tunnel diode with a conventional MOSFET, which suppresses the valley current to the transistor off-leakage level. Band-to-band tunneling (BTBT) in the tunnel junction provides the first peak, and the second peak and valley are generated from the suppression of the diffusion current in the tunnel diode by the off-state MOSFET. The multiple NDR curves can be controlled by the doping concentration of the tunnel junction and the threshold voltage of the MOSFET. By using complementary multiple NDR devices, a five-state memory is demonstrated with only six transistors.
NASA Astrophysics Data System (ADS)
Rotaru, Ionela Magdalena
2015-09-01
Knowledge management is a powerful instrument. Areas where knowledge-based modelling can be applied range from business, industry and government to education. Companies engage in efforts to restructure the databases they hold according to knowledge management principles because they recognize in this a guarantee that the resulting models consist only of relevant and sustainable knowledge that can bring value to the company. The paper presents a theoretical model of what optimizing polyethylene pipes means, thus bringing to attention two important engineering fields, metal cutting and the gas industry, which meet in order to optimize the butt fusion welding process - specifically the polyethylene cutting part - of polyethylene pipes. The entire approach is shaped by the principles of knowledge management. The study was made in collaboration with companies operating in the field.
NASA Astrophysics Data System (ADS)
Hong, Haibo; Yin, Yuehong; Chen, Xing
2016-11-01
Despite the rapid development of computer science and information technology, an efficient human-machine integrated enterprise information system for designing complex mechatronic products is still not fully accomplished, partly because of the inharmonious communication among collaborators. Therefore, one challenge in human-machine integration is how to establish an appropriate knowledge management (KM) model to support integration and sharing of heterogeneous product knowledge. Aiming at the diversity of design knowledge, this article proposes an ontology-based model to reach an unambiguous and normative representation of knowledge. First, an ontology-based human-machine integrated design framework is described, then corresponding ontologies and sub-ontologies are established according to different purposes and scopes. Second, a similarity calculation-based ontology integration method composed of ontology mapping and ontology merging is introduced. The ontology searching-based knowledge sharing method is then developed. Finally, a case of human-machine integrated design of a large ultra-precision grinding machine is used to demonstrate the effectiveness of the method.
NASA Technical Reports Server (NTRS)
Rahimian, Eric N.; Graves, Sara J.
1988-01-01
A new approach used in constructing a relational data knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data knowledge base which may be generalized for use in developing a relational distributed data knowledge base system. The efficiency and ease of application of such a data knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data knowledge base as well as the possible areas of difficulty in implementing the relational data knowledge base management system.
ERIC Educational Resources Information Center
Meerbaum-Salant, Orni; Hazzan, Orit
2009-01-01
This paper focuses on challenges in mentoring software development projects in the high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…
Surface-confined assemblies and polymers for molecular logic.
de Ruiter, Graham; van der Boom, Milko E
2011-08-16
Stimuli responsive materials are capable of mimicking the operation characteristics of logic gates such as AND, OR, NOR, and even flip-flops. Since the development of molecular sensors and the introduction of the first AND gate in solution by de Silva in 1993, Molecular (Boolean) Logic and Computing (MBLC) has become increasingly popular. In this Account, we present recent research activities that focus on MBLC with electrochromic polymers and metal polypyridyl complexes on a solid support. Metal polypyridyl complexes act as useful sensors to a variety of analytes in solution (i.e., H(2)O, Fe(2+/3+), Cr(6+), NO(+)) and in the gas phase (NO(x) in air). This information transfer, whether the analyte is present, is based on the reversible redox chemistry of the metal complexes, which are stable up to 200 °C in air. The concurrent changes in the optical properties are nondestructive and fast. In such a setup, the input is directly related to the output and, therefore, can be represented by one-input logic gates. These input-output relationships are extendable for mimicking the diverse functions of essential molecular logic gates and circuits within a set of Boolean algebraic operations. Such a molecular approach towards Boolean logic has yielded a series of proof-of-concept devices: logic gates, multiplexers, half-adders, and flip-flop logic circuits. MBLC is a versatile and, potentially, a parallel approach to silicon circuits: assemblies of these molecular gates can perform a wide variety of logic tasks through reconfiguration of their inputs. Although these developments do not require a semiconductor blueprint, similar guidelines such as signal propagation, gate-to-gate communication, propagation delay, and combinatorial and sequential logic will play a critical role in allowing this field to mature. For instance, gate-to-gate communication by chemical wiring of the gates with metal ions as electron carriers results in the integration of stand-alone systems: the output of one gate is used as the input for another gate. Using the same setup, we were able to display both combinatorial and sequential logic. We have demonstrated MBLC by coupling electrochemical inputs with optical readout, which resulted in various logic architectures built on a redox-active, functionalized surface. Electrochemically operated sequential logic systems such as flip-flops, multivalued logic, and multistate memory could enhance computational power without increasing spatial requirements. Applying multivalued digits in data storage could exponentially increase memory capacity. Furthermore, we evaluate the pros and cons of MBLC and identify targets for future research in this Account. © 2011 American Chemical Society
Viewing Knowledge Bases as Qualitative Models.
ERIC Educational Resources Information Center
Clancey, William J.
The concept of a qualitative model provides a unifying perspective for understanding how expert systems differ from conventional programs. Knowledge bases contain qualitative models of systems in the world, that is, primarily non-numeric descriptions that provide a basis for explaining and predicting behavior and formulating action plans. The…
Diagnosis by integrating model-based reasoning with knowledge-based reasoning
NASA Technical Reports Server (NTRS)
Bylander, Tom
1988-01-01
Our research investigates how observations can be categorized by integrating a qualitative physical model with experiential knowledge. Our domain is diagnosis of pathologic gait in humans, in which the observations are the gait motions, muscle activity during gait, and physical exam data, and the diagnostic hypotheses are the potential muscle weaknesses, muscle mistimings, and joint restrictions. Patients with underlying neurological disorders typically have several malfunctions. Among the problems that need to be faced are: the ambiguity of the observations, the ambiguity of the qualitative physical model, correspondence of the observations and hypotheses to the qualitative physical model, the inherent uncertainty of experiential knowledge, and the combinatorics involved in forming composite hypotheses. Our system divides the work so that the knowledge-based reasoning suggests which hypotheses appear more likely than others, the qualitative physical model is used to determine which hypotheses explain which observations, and another process combines these functionalities to construct a composite hypothesis based on explanatory power and plausibility. We speculate that the reasoning architecture of our system is generally applicable to complex domains in which a less-than-perfect physical model and less-than-perfect experiential knowledge need to be combined to perform diagnosis.
Knowledge-driven genomic interactions: an application in ovarian cancer.
Kim, Dokyoon; Li, Ruowang; Dudek, Scott M; Frase, Alex T; Pendergrass, Sarah A; Ritchie, Marylyn D
2014-01-01
Effective cancer clinical outcome prediction for understanding of the mechanism of various types of cancer has been pursued using molecular-based data such as gene expression profiles, an approach that has promise for providing better diagnostics and supporting further therapies. However, clinical outcome prediction based on gene expression profiles varies between independent data sets. Further, single-gene expression outcome prediction is limited for cancer evaluation since genes do not act in isolation, but rather interact with other genes in complex signaling or regulatory networks. In addition, since pathways are more likely to co-operate together, it would be desirable to incorporate expert knowledge to combine pathways in a useful and informative manner. Thus, we propose a novel approach for identifying knowledge-driven genomic interactions and applying it to discover models associated with cancer clinical phenotypes using grammatical evolution neural networks (GENN). In order to demonstrate the utility of the proposed approach, an ovarian cancer data from the Cancer Genome Atlas (TCGA) was used for predicting clinical stage as a pilot project. We identified knowledge-driven genomic interactions associated with cancer stage from single knowledge bases such as sources of pathway-pathway interaction, but also knowledge-driven genomic interactions across different sets of knowledge bases such as pathway-protein family interactions by integrating different types of information. Notably, an integration model from different sources of biological knowledge achieved 78.82% balanced accuracy and outperformed the top models with gene expression or single knowledge-based data types alone. Furthermore, the results from the models are more interpretable because they are framed in the context of specific biological pathways or other expert knowledge. The success of the pilot study we have presented herein will allow us to pursue further identification of models predictive of clinical cancer survival and recurrence. Understanding the underlying tumorigenesis and progression in ovarian cancer through the global view of interactions within/between different biological knowledge sources has the potential for providing more effective screening strategies and therapeutic targets for many types of cancer.
An empirically based model for knowledge management in health care organizations.
Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita
2016-01-01
Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of organizational processes.
NASA Technical Reports Server (NTRS)
Krishnan, G. S.
1997-01-01
A cost effective model which uses artificial intelligence techniques in the selection and approval of parts is presented. The knowledge which is acquired from the specialists for different part types is represented in a knowledge base in the form of rules and objects. The parts information is stored separately in a data base and is isolated from the knowledge base. Validation, verification and performance issues are highlighted.
Artificial intelligence in process control: Knowledge base for the shuttle ECS model
NASA Technical Reports Server (NTRS)
Stiffler, A. Kent
1989-01-01
The general operation of KATE, an artificial intelligence controller, is outlined. A shuttle environmental control system (ECS) demonstration system for KATE is explained. The knowledge base model for this system is derived. An experimental test procedure is given to verify parameters in the model.
Dynamic Strategic Planning in a Professional Knowledge-Based Organization
ERIC Educational Resources Information Center
Olivarius, Niels de Fine; Kousgaard, Marius Brostrom; Reventlow, Susanne; Quelle, Dan Grevelund; Tulinius, Charlotte
2010-01-01
Professional, knowledge-based institutions have a particular form of organization and culture that makes special demands on the strategic planning supervised by research administrators and managers. A model for dynamic strategic planning based on a pragmatic utilization of the multitude of strategy models was used in a small university-affiliated…
Zhang, Yi-Fan; Tian, Yu; Zhou, Tian-Shu; Araki, Kenji; Li, Jing-Song
2016-01-01
The broad adoption of clinical decision support systems within clinical practice has been hampered mainly by the difficulty in expressing domain knowledge and patient data in a unified formalism. This paper presents a semantic-based approach to the unified representation of healthcare domain knowledge and patient data for practical clinical decision making applications. A four-phase knowledge engineering cycle is implemented to develop a semantic healthcare knowledge base based on an HL7 reference information model, including an ontology to model domain knowledge and patient data and an expression repository to encode clinical decision making rules and queries. A semantic clinical decision support system is designed to provide patient-specific healthcare recommendations based on the knowledge base and patient data. The proposed solution is evaluated in the case study of type 2 diabetes mellitus inpatient management. The knowledge base is successfully instantiated with relevant domain knowledge and testing patient data. Ontology-level evaluation confirms model validity. Application-level evaluation of diagnostic accuracy reaches a sensitivity of 97.5%, a specificity of 100%, and a precision of 98%; an acceptance rate of 97.3% is given by domain experts for the recommended care plan orders. The proposed solution has been successfully validated in the case study as providing clinical decision support at a high accuracy and acceptance rate. The evaluation results demonstrate the technical feasibility and application prospect of our approach. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nagothu, U. S.
2016-12-01
Agricultural extension services, among others, contribute to improving rural livelihoods and enhancing economic development. Knowledge development and transfer, from the cognitive science point of view, is about how farmers use and apply their experiential knowledge as well as newly acquired knowledge to solve new problems. This depends on the models adopted and on the way knowledge is generated and delivered. New extension models based on ICT platforms and smart phones are promising. Results from a 5-year project (www.climaadapt.org) in India show that farmer-led on-farm validations of technologies and knowledge exchange through ICT-based platforms outperformed state-operated linear extension programs. Innovation here depends on the connectivity and networking between the stakeholders that are involved in generating, transferring and using the knowledge. Key words: Smallholders, Knowledge, Extension, Innovation, India
A Theory of the Measurement of Knowledge Content, Access, and Learning.
ERIC Educational Resources Information Center
Pirolli, Peter; Wilson, Mark
1998-01-01
An approach to the measurement of knowledge content, knowledge access, and knowledge learning is developed. First a theoretical view of cognition is described, and then a class of measurement models, based on Rasch modeling, is presented. Knowledge access and content are viewed as determining the observable actions selected by an agent to achieve…
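The measurement models themselves are not reproduced in the abstract; for reference, the dichotomous Rasch model on which such approaches build gives the probability that person p answers item i correctly in terms of an ability parameter θ_p and an item difficulty b_i:

P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)}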
Ramzan, Asia; Wang, Hai; Buckingham, Christopher
2014-01-01
Clinical decision support systems (CDSSs) often base their knowledge and advice on human expertise. Knowledge representation needs to be in a format that can be easily understood by human users as well as supporting ongoing knowledge engineering, including evolution and consistency of knowledge. This paper reports on the development of an ontology specification for managing knowledge engineering in a CDSS for assessing and managing risks associated with mental-health problems. The Galatean Risk and Safety Tool, GRiST, represents mental-health expertise in the form of a psychological model of classification. The hierarchical structure was directly represented in the machine using an XML document. Functionality of the model and knowledge management were controlled using attributes in the XML nodes, with an accompanying paper manual for specifying how end-user tools should behave when interfacing with the XML. This paper explains the advantages of using the web-ontology language, OWL, as the specification, details some of the issues and problems encountered in translating the psychological model to OWL, and shows how OWL benefits knowledge engineering. The conclusions are that OWL can have an important role in managing complex knowledge domains for systems based on human expertise without impeding the end-users' understanding of the knowledge base. The generic classification model underpinning GRiST makes it applicable to many decision domains and the accompanying OWL specification facilitates its implementation.
Reducing the Knowledge Tracing Space
ERIC Educational Resources Information Center
Ritter, Steven; Harris, Thomas K.; Nixon, Tristan; Dickison, Daniel; Murray, R. Charles; Towle, Brendon
2009-01-01
In Cognitive Tutors, student skill is represented by estimates of student knowledge on various knowledge components. The estimate for each knowledge component is based on a four-parameter model developed by Corbett and Anderson. In this paper, we investigate the nature of the parameter space defined by these four parameters by modeling data…
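In standard Bayesian knowledge tracing (the Corbett and Anderson model referred to here), the four parameters are the probabilities of initial knowledge, learning, guessing, and slipping. A minimal sketch of the per-opportunity update, with illustrative parameter values, is shown below.

def bkt_update(p_know, correct, p_learn=0.1, p_guess=0.2, p_slip=0.1):
    """One Bayesian-knowledge-tracing step for a single knowledge component.

    p_know  : prior probability the student knows the skill (starts at p_init)
    correct : whether the observed response was correct
    """
    if correct:
        evidence = p_know * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_know) * p_guess)
    else:
        evidence = p_know * p_slip
        posterior = evidence / (evidence + (1 - p_know) * (1 - p_guess))
    # Account for learning between practice opportunities
    return posterior + (1 - posterior) * p_learn

p = 0.3                              # p_init for this knowledge component
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
    print(round(p, 3))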
Intrusion Detection Systems with Live Knowledge System
2016-05-31
We propose a novel approach that uses Ripple-Down Rules (RDR) to maintain the knowledge obtained from human experts together with a knowledge base generated by Induct RDR, which is a machine-learning-based RDR method... A detection model is built by applying the Induct RDR approach. The proposed Induct RDR (Ripple Down Rules) approach allows acquiring the phishing detection...
Dameron, O; Gibaud, B; Morandi, X
2004-06-01
The human cerebral cortex anatomy describes the brain organization at the scale of gyri and sulci. It is used as landmarks for neurosurgery as well as localization support for functional data analysis or inter-subject data comparison. Existing models of the cortex anatomy either rely on image labeling but fail to represent variability and structural properties or rely on a conceptual model but miss the inner 3D nature and relations of anatomical structures. This study was therefore conducted to propose a model of sulco-gyral anatomy for the healthy human brain. We hypothesized that both numeric knowledge (i.e., image-based) and symbolic knowledge (i.e., concept-based) have to be represented and coordinated. In addition, the representation of this knowledge should be application-independent in order to be usable in various contexts. Therefore, we devised a symbolic model describing specialization, composition and spatial organization of cortical anatomical structures. We also collected numeric knowledge such as 3D models of shape and shape variation about cortical anatomical structures. For each numeric piece of knowledge, a companion file describes the concept it refers to and the nature of the relationship. Demonstration software performs a mapping between the numeric and the symbolic aspects for browsing the knowledge base.
Literature-Based Scientific Learning: A Collaboration Model
ERIC Educational Resources Information Center
Elrod, Susan L.; Somerville, Mary M.
2007-01-01
Amidst exponential growth of knowledge, student insights into the knowledge creation practices of the scientific community can be furthered by science faculty collaborations with university librarians. The Literature-Based Scientific Learning model advances undergraduates' disciplinary mastery and information literacy through experience with…
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael
1992-01-01
Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
NASA Astrophysics Data System (ADS)
Reinfried, Sibylle; Tempelmann, Sebastian
2014-01-01
This paper provides a video-based learning process study that investigates the kinds of mental models of the atmospheric greenhouse effect 13-year-old learners have and how these mental models change with a learning environment, which is optimised in regard to instructional psychology. The objective of this explorative study was to observe and analyse the learners' learning pathways according to their previous knowledge in detail and to understand the mental model formation processes associated with them more precisely. For the analysis of the learning pathways, drawings, texts, video and interview transcripts from 12 students were studied using qualitative methods. The learning pathways pursued by the learners significantly depend on their domain-specific previous knowledge. The learners' preconceptions could be typified based on specific characteristics, whereby three preconception types could be formed. The 'isolated pieces of knowledge' type of learners, who have very little or no previous knowledge about the greenhouse effect, build new mental models that are close to the target model. 'Reduced heat output' type of learners, who have previous knowledge that indicates compliances with central ideas of the normative model, reconstruct their knowledge by reorganising and interpreting their existing knowledge structures. 'Increasing heat input' type of learners, whose previous knowledge consists of subjective worldly knowledge, which has a greater personal explanatory value than the information from the learning environment, have more difficulties changing their mental models. They have to fundamentally reconstruct their mental models.
Robust low-bias negative differential resistance in graphene superlattices
NASA Astrophysics Data System (ADS)
Sattari-Esfahlan, S. M.; Fouladi-Oskuei, J.; Shojaei, S.
2017-06-01
In this work, we present a detailed theoretical study of the low-bias current-voltage (I-V) characteristic of a biased planar graphene superlattice (PGSL), provided by a heterostructured substrate and a series of grounded metallic planes placed over a graphene sheet, which induce a periodically modulated Dirac gap and a Fermi velocity barrier, respectively. We investigate the effect of the PGSL parameters on the I-V characteristic and the appearance of multipeak negative differential resistance (NDR) in the proposed device within the Landauer-Buttiker formalism and the adopted transfer matrix method. Moreover, we propose a novel avenue to control the NDR in the PGSL with the Fermi velocity barrier. Different regimes of NDR have been recognized, based on the PGSL parameters and external bias. From this viewpoint, we obtain multipeak NDR through miniband alignment in the PGSL. A maximum peak-to-valley ratio (PVR) of up to 167 is obtained when υ_c, the Fermi velocity correlation (the ratio of the Fermi velocity in the barrier and well regions), is 1.9, at bias voltages between 70 and 130 mV. Our findings are in good agreement with experiments and can be considered in designing multi-valued memory, functional circuits, and low-power, high-speed nanoelectronic device applications.
NASA Astrophysics Data System (ADS)
Wang, Hongling
2011-10-01
This article places the comprehensive quality improvement of undergraduates in the context of elite culture and mass culture, analyzes the influences and challenges that elite culture and mass culture bring to undergraduate education from the perspectives of philosophy, ethics, economics, education, sociology and other disciplines, and, drawing on the experience of some developed countries, proposes the principles that institutions of higher education should uphold in the context of elite culture and mass culture. With the development of the times, undergraduate education should also continually move to new historical starting points and thoroughly reform itself, from content to essence and from perception to format, with a globalized horizon, so as to reflect the characteristics of the times and better promote the overall development of undergraduates. Based exactly on such a view, and on the premise of fully recognizing that the flourishing and development of elite culture and mass culture has brought China into a multicultural situation, this article proposes principles for university moral education, such as promoting the integration of undergraduates' multiple values, insisting on the integration of unitary guidance with diverse development, and seeking common ground while reserving differences and harmony without uniformity.
The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge
ERIC Educational Resources Information Center
Rice, Amber H.; Kitchel, Tracy
2015-01-01
The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…
ERIC Educational Resources Information Center
Bernacki, Matthew
2010-01-01
This study examined how learners construct textbase and situation model knowledge in hypertext computer-based learning environments (CBLEs) and documented the influence of specific self-regulated learning (SRL) tactics, prior knowledge, and characteristics of the learner on posttest knowledge scores from exposure to a hypertext. A sample of 160…
Automated knowledge generation
NASA Technical Reports Server (NTRS)
Myler, Harley R.; Gonzalez, Avelino J.
1988-01-01
The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is in the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on object-oriented language (Flavors).
Conceptual model of knowledge base system
NASA Astrophysics Data System (ADS)
Naykhanova, L. V.; Naykhanova, I. V.
2018-05-01
The article presents a conceptual model of a knowledge-based system of the production-system type. The production system is intended for automating problems whose solution is rigidly conditioned by legislation. A core component of the system is a knowledge base. The knowledge base consists of a set of facts, a set of rules, a cognitive map and an ontology. The cognitive map is developed to implement the control strategy, and the ontology to implement the explanation mechanism. Representing knowledge about the recognition of a situation in the form of rules makes it possible to describe knowledge of the pension legislation. This approach provides flexibility, originality and scalability of the system. If the legislation changes, only the rules set needs to be changed, which means that a change of the legislation is not a big problem. The main advantage of the system is that it can easily be adapted to changes in the legislation.
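As a schematic illustration of the production-system core described above (facts, rules, and a forward-chaining control strategy; the cognitive map and ontology components are omitted), consider this minimal sketch with entirely hypothetical pension-eligibility rules.

# Facts known about a case
facts = {"age": 61, "insurance_years": 36, "application_filed": True}

# Production rules: condition over the facts -> fact that the rule adds
rules = [
    (lambda f: f.get("age", 0) >= 60 and f.get("insurance_years", 0) >= 35,
     ("eligible_early_pension", True)),
    (lambda f: f.get("eligible_early_pension") and f.get("application_filed", False),
     ("grant_pension", True)),
]

def forward_chain(facts, rules):
    """Apply rules until no new facts can be derived (naive forward chaining)."""
    changed = True
    while changed:
        changed = False
        for condition, (key, value) in rules:
            if key not in facts and condition(facts):
                facts[key] = value
                changed = True
    return facts

print(forward_chain(facts, rules))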
Modelling Teaching Strategies.
ERIC Educational Resources Information Center
Major, Nigel
1995-01-01
Describes a modelling language for representing teaching strategies, based in the context of the COCA intelligent tutoring system. Examines work on meta-reasoning in knowledge-based systems and describes COCA's architecture, giving details of the language used for representing teaching knowledge. Discusses implications for future work. (AEF)
Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif
2008-03-01
High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
Knowledge service decision making in business incubators based on the supernetwork model
NASA Astrophysics Data System (ADS)
Zhao, Liming; Zhang, Haihong; Wu, Wenqing
2017-08-01
As valuable resources for incubating firms, knowledge resources have received gradually increasing attention from all types of business incubators, and business incubators use a variety of knowledge services to stimulate rapid growth in incubating firms. Based on previous research, we identify knowledge transfer and knowledge networking services as the two main forms of knowledge services, and further divide knowledge transfer services into knowledge depth services and knowledge breadth services. Then, we construct the business incubators' knowledge supernetwork model, describe the evolution mechanism among heterogeneous agents and utilize a simulation to explore the performance variance of different business incubators' knowledge services. The simulation results show that knowledge stock increases faster when business incubators are able to provide knowledge services to more incubating firms and that the degree of discrepancy in the knowledge stock increases during the process of knowledge growth. Further, knowledge transfer services lead to greater differences in the knowledge structure, while knowledge networking services lead to smaller differences. Regarding the two types of knowledge transfer services, knowledge depth services are more conducive to knowledge growth than knowledge breadth services, but knowledge depth services lead to greater gaps in knowledge stocks and greater differences in knowledge structures. Overall, it is optimal for business incubators to select a single knowledge service or portfolio strategy based on the amount of time and energy expended on the two types of knowledge services.
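The exact evolution rules of the supernetwork model are not given in the abstract; the sketch below is only a toy agent-based illustration of the depth-versus-breadth distinction (depth services reinforce knowledge a firm already holds, breadth services add new knowledge elements), with all numerical choices invented and no claim of reproducing the paper's simulation results.

import numpy as np

rng = np.random.default_rng(0)
n_firms, n_fields, steps = 20, 10, 50

def simulate(service):
    """Evolve firms' knowledge vectors under one type of knowledge-transfer service."""
    K = rng.uniform(0, 1, size=(n_firms, n_fields)) * (rng.random((n_firms, n_fields)) < 0.3)
    for _ in range(steps):
        for i in range(n_firms):
            held = np.flatnonzero(K[i])
            missing = np.flatnonzero(K[i] == 0)
            if service == "depth" and held.size:
                K[i, rng.choice(held)] += 0.1          # deepen an existing field
            elif service == "breadth" and missing.size:
                K[i, rng.choice(missing)] = 0.1        # open up a new field
    stock = K.sum(axis=1)
    return stock.mean(), stock.std()

for service in ("depth", "breadth"):
    mean, spread = simulate(service)
    print(f"{service:7s} mean knowledge stock = {mean:.2f}, dispersion = {spread:.2f}")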
Non-convex dissipation potentials in multiscale non-equilibrium thermodynamics
NASA Astrophysics Data System (ADS)
Janečka, Adam; Pavelka, Michal
2018-04-01
Reformulating a constitutive relation in terms of gradient dynamics (that is, as the derivative of a dissipation potential) brings additional information on stability, metastability and instability of the dynamics with respect to perturbations of the constitutive relation, called CR-stability. CR-instability is connected to the loss of convexity of the dissipation potential, which makes the Legendre-conjugate dissipation potential multivalued and causes dissipative phase transitions that are not induced by non-convexity of the free energy, but by non-convexity of the dissipation potential. CR-stability of the constitutive relation with respect to perturbations is then manifested by constructing evolution equations for the perturbations in a thermodynamically sound way (CR-extension). As a result, interesting experimental observations of the behavior of complex fluids under shear flow and of the supercritical boiling curve can be explained.
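Schematically, and with notation simplified relative to the paper, the gradient-dynamics setting referred to here writes the constitutive relation as the flux derivative of a dissipation potential Ξ and obtains the conjugate potential by a Legendre transformation:

X^{*} = \frac{\partial \Xi(J)}{\partial J}, \qquad
\Xi^{*}(X^{*}) = \sup_{J}\left[\, J \cdot X^{*} - \Xi(J) \,\right], \qquad
J = \frac{\partial \Xi^{*}(X^{*})}{\partial X^{*}}.

In the convex case these relations are equivalent; when Ξ loses convexity, the force-flux map X* = ∂Ξ/∂J ceases to be monotone and its inverse (the flux as a function of the force) becomes multivalued, which is the mechanism behind the CR-instability discussed above.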
NASA Astrophysics Data System (ADS)
Sazonov, S. V.; Ustinov, N. V.
2017-02-01
The nonlinear propagation of extremely short electromagnetic pulses in a medium of symmetric and asymmetric molecules placed in static magnetic and electric fields is theoretically studied. Asymmetric molecules differ in that they have nonzero permanent dipole moments in stationary quantum states. A system of wave equations is derived for the ordinary and extraordinary components of the pulses. It is shown that this system can be reduced in some cases to a system of coupled Ostrovsky equations and to equations integrable by the inverse scattering transform method, including the vector version of the Ostrovsky-Vakhnenko equation. Different types of solutions of this system are considered. Only solutions representing the superposition of periodic solutions are single-valued, whereas soliton and breather solutions are multivalued.
Enhanced sensitivity of a passive optical cavity by an intracavity dispersive medium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, David D.; Department of Physics, University of Alabama in Huntsville, Huntsville, Alabama 35899; Myneni, Krishna
2009-07-15
The pushing of the modes of a Fabry-Perot cavity by an intracavity rubidium cell is measured. The scale factor of the modes is increased by the anomalous dispersion and is inversely proportional to the sum of the effective group index and an additional cavity delay factor that arises from the variation of the Rb absorption over a free spectral range. This additional positive feedback further increases the effect of the anomalous dispersion and goes to zero at the lasing threshold. The mode width does not grow as fast as the scale factor as the intracavity absorption is increased, resulting in enhanced measurement sensitivities. For absorptions larger than the scale factor pole, the atom-cavity response is multivalued and mode splitting occurs.
Automated knowledge-base refinement
NASA Technical Reports Server (NTRS)
Mooney, Raymond J.
1994-01-01
Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.
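FORTE's actual operators work over first-order Horn clauses, but the core idea, minimally revising a rule base until it is consistent with labeled training examples, can be illustrated with a small propositional sketch. The following Python fragment is only that: rules are attribute-value conditions, generalization drops failing conditions so a missed positive becomes covered, and a rule that wrongly covers a negative is simply retracted. The rules, attributes, and data are hypothetical.

def covers(rule, example):
    # A rule covers an example if every condition in the rule holds.
    return all(example.get(attr) == val for attr, val in rule.items())

def classify(rules, example):
    return any(covers(r, example) for r in rules)

def refine(rules, examples):
    """Greedy, minimal-change revision against (example, label) pairs."""
    rules = [dict(r) for r in rules]
    for ex, label in examples:
        if label and not classify(rules, ex):
            # Generalize: drop the failing conditions of the rule closest to covering ex.
            best = min(rules, key=lambda r: sum(ex.get(a) != v for a, v in r.items()))
            for a, v in list(best.items()):
                if ex.get(a) != v:
                    del best[a]
        elif not label and classify(rules, ex):
            # A real system would specialize (add literals); this toy simply retracts
            # every rule that wrongly covers the negative example.
            rules = [r for r in rules if not covers(r, ex)]
    return rules

rules = [{"weather": "clear"},
         {"weather": "cloudy", "wind": "low", "visibility": "high"}]
examples = [({"weather": "clear", "wind": "high"}, False),   # negative example
            ({"weather": "cloudy", "wind": "low"}, True)]    # positive example
print(refine(rules, examples))   # -> [{'weather': 'cloudy', 'wind': 'low'}]

A real refinement system such as FORTE would instead specialize the offending clause by adding literals and would search for the smallest set of edits across the whole rule base.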
ERIC Educational Resources Information Center
Cheriani, Cheriani; Mahmud, Alimuddin; Tahmir, Suradi; Manda, Darman; Dirawan, Gufran Darma
2015-01-01
This study aims to determine the differences in learning output achieved by using a Problem Based Model combined with "Buginese" Local Cultural Knowledge (PBL-Culture). It also explores the students' activities in learning the mathematics subject by using the PBL-Culture model. This research uses a Mixed Methods approach that combines quantitative…
A Comparison of Functional Models for Use in the Function-Failure Design Method
NASA Technical Reports Server (NTRS)
Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.
2006-01-01
When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur with products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base that it draws from, and therefore it is of utmost importance to develop a knowledge base that will be suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: at what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the necessary detail for an applicable knowledge base that can be used by designers in both new designs and redesigns. High-level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record this data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high-level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.
NASA Astrophysics Data System (ADS)
Wu, Jiangning; Wang, Xiaohuan
Rapidly increasing amount of mobile phone users and types of services leads to a great accumulation of complaining information. How to use this information to enhance the quality of customers' services is a big issue at present. To handle this kind of problem, the paper presents an approach to construct a domain knowledge map for navigating the explicit and tacit knowledge in two ways: building the Topic Map-based explicit knowledge navigation model, which includes domain TM construction, a semantic topic expansion algorithm and VSM-based similarity calculation; building Social Network Analysis-based tacit knowledge navigation model, which includes a multi-relational expert navigation algorithm and the criterions to evaluate the performance of expert networks. In doing so, both the customer managers and operators in call centers can find the appropriate knowledge and experts quickly and exactly. The experimental results show that the above method is very powerful for knowledge navigation.
NASA Astrophysics Data System (ADS)
Tucker, Deborah L.
Purpose. The purpose of this grounded theory study was to refine, using a Delphi study process, the four categories of the theoretical model of the comprehensive knowledge base required by providers of professional development for K-12 teachers of science generated from a review of the literature. Methodology. This grounded theory study used data collected through a modified Delphi technique and interviews to refine and validate the literature-based knowledge base required by providers of professional development for K-12 teachers of science. Twenty-three participants, experts in the fields of science education, how people learn, instructional and assessment strategies, and learning contexts, responded to the study's questions. Findings. By "densifying" the four categories of the knowledge base, this study determined the causal conditions (the science subject matter knowledge), the intervening conditions (how people learn), the strategies (the effective instructional and assessment strategies), and the context (the context and culture of formal learning environments) surrounding the science professional development process. Eight sections were added to the literature-based knowledge base; the final model comprised forty-nine sections. The average length of the operational definitions increased nearly threefold and the number of citations per operational definition increased more than twofold. Conclusions. A four-category comprehensive model that can serve as the foundation for the knowledge base required by science professional developers now exists. Subject matter knowledge includes science concepts, inquiry, the nature of science, and scientific habits of mind; how people learn includes the principles of learning, active learning, andragogy, variations in learners, neuroscience and cognitive science, and change theory; effective instructional and assessment strategies include constructivist learning and inquiry-based teaching, differentiation of instruction, making knowledge and thinking accessible to learners, automatic and fluent retrieval of nonscience-specific skills, science assessment and assessment strategies, science-specific instructional strategies, and safety within a learning environment; and contextual knowledge includes curriculum selection and implementation strategies and knowledge of building program coherence. Recommendations. Further research is recommended to determine which specific instructional strategies identified in the refined knowledge base have positive, significant effect sizes for adult learners.
A knowledge generation model via the hypernetwork.
Liu, Jian-Guo; Yang, Guang-Yong; Hu, Zhao-Long
2014-01-01
The influence of the statistical properties of the network on the knowledge diffusion has been extensively studied. However, the structure evolution and the knowledge generation processes are always integrated simultaneously. By introducing the Cobb-Douglas production function and treating the knowledge growth as a cooperative production of knowledge, in this paper, we present two knowledge-generation dynamic evolving models based on different evolving mechanisms. The first model, named "HDPH model," adopts the hyperedge growth and the hyperdegree preferential attachment mechanisms. The second model, named "KSPH model," adopts the hyperedge growth and the knowledge stock preferential attachment mechanisms. We investigate the effect of the parameters (α,β) on the total knowledge stock of the two models. The hyperdegree distribution of the HDPH model can be theoretically analyzed by the mean-field theory. The analytic result indicates that the hyperdegree distribution of the HDPH model obeys the power-law distribution and the exponent is γ = 2 + 1/m. Furthermore, we present the distributions of the knowledge stock for different parameters (α,β). The findings indicate that our proposed models could be helpful for deeply understanding the scientific research cooperation.
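The abstract does not give the exact production function, so the following Python sketch only illustrates HDPH-style mechanics under stated assumptions: at each step a new node joins a hyperedge with m existing nodes chosen with probability proportional to their hyperdegree, and the hyperedge's members receive a Cobb-Douglas-like knowledge increment. The constants and the specific functional form are assumptions, not the paper's.

import random

def hdph_simulation(steps=500, m=3, alpha=0.5, beta=0.5, seed=1):
    """Toy hyperedge-growth model with hyperdegree-preferential attachment and
    cooperative (Cobb-Douglas-like) knowledge production. Illustrative only."""
    random.seed(seed)
    hyperdegree = {i: 1 for i in range(m + 1)}      # initial nodes share one hyperedge
    knowledge = {i: 1.0 for i in range(m + 1)}
    for t in range(m + 1, m + 1 + steps):
        # hyperdegree-preferential choice of m distinct existing nodes
        nodes, weights = zip(*hyperdegree.items())
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choices(nodes, weights=weights)[0])
        edge = [t] + sorted(chosen)
        hyperdegree[t] = 0
        knowledge[t] = 1.0
        for n in edge:
            hyperdegree[n] += 1
        # cooperative knowledge production (assumed Cobb-Douglas-like form)
        k_existing = sum(knowledge[n] for n in chosen)
        delta = 0.1 * (k_existing ** alpha) * (len(edge) ** beta)
        for n in edge:
            knowledge[n] += delta / len(edge)
    return hyperdegree, knowledge

hd, kn = hdph_simulation()
print("total knowledge stock:", round(sum(kn.values()), 2))

The hyperdegree counts produced this way can then be binned to inspect the heavy-tailed distribution that the mean-field analysis predicts.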
Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos
2012-06-01
The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). In this regard, the employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model has been defined. The entire framework architecture has been then specified and implemented by adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure, according to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support considering the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we present thoroughly the establishment of the proposed knowledge framework by presenting the employed methodology and the results obtained as regards implementation, performance and validation aspects that highlight its applicability and virtue in medication safety. Copyright © 2012 Elsevier Inc. All rights reserved.
Arar, Nedal; Knight, Sara J; Modell, Stephen M; Issa, Amalia M
2011-03-01
The main mission of the Genomic Applications in Practice and Prevention Network™ is to advance collaborative efforts involving partners from across the public health sector to realize the promise of genomics in healthcare and disease prevention. We introduce a new framework that supports the Genomic Applications in Practice and Prevention Network mission and leverages the characteristics of the complex adaptive systems approach. We call this framework the Genome-based Knowledge Management in Cycles model (G-KNOMIC). G-KNOMIC proposes that the collaborative work of multidisciplinary teams utilizing genome-based applications will enhance translating evidence-based genomic findings by creating ongoing knowledge management cycles. Each cycle consists of knowledge synthesis, knowledge evaluation, knowledge implementation and knowledge utilization. Our framework acknowledges that all the elements in the knowledge translation process are interconnected and continuously changing. It also recognizes the importance of feedback loops, and the ability of teams to self-organize within a dynamic system. We demonstrate how this framework can be used to improve the adoption of genomic technologies into practice using two case studies of genomic uptake.
A Knowledge-Base for a Personalized Infectious Disease Risk Prediction System.
Vinarti, Retno; Hederman, Lucy
2018-01-01
We present a knowledge-base to represent collated infectious disease risk (IDR) knowledge. The knowledge is about the personal and contextual risk of contracting an infectious disease, obtained from declarative sources (e.g. the Atlas of Human Infectious Diseases). Automated prediction requires encoding this knowledge in a form that can produce risk probabilities (e.g. a Bayesian Network - BN). The knowledge-base presented in this paper feeds an algorithm that can auto-generate the BN. The knowledge from 234 infectious diseases was compiled. From this compilation, we designed an ontology and five rule types for modelling IDR knowledge in general. The evaluation aims to assess whether the knowledge-base structure, and its application to three disease-country contexts, meets the needs of a personalized IDR prediction system. From the evaluation results, the knowledge-base conforms to the system's purpose: the personalization of infectious disease risk.
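The paper's five rule types are not reproduced here, but one common way a declarative risk knowledge-base can be compiled into Bayesian-network parameters is a noisy-OR combination of risk factors. The sketch below is a hypothetical illustration of that step only; the factors and probabilities are invented.

def noisy_or(active_factors, factor_probs, leak=0.01):
    """P(disease | active factors) under a noisy-OR combination of factor strengths."""
    p_not = 1.0 - leak
    for f in active_factors:
        p_not *= (1.0 - factor_probs.get(f, 0.0))
    return 1.0 - p_not

factor_probs = {                      # per-factor "causal strength" (hypothetical values)
    "travel_to_endemic_area": 0.30,
    "rainy_season": 0.10,
    "no_vaccination": 0.20,
}
person = ["travel_to_endemic_area", "no_vaccination"]
print(f"risk = {noisy_or(person, factor_probs):.3f}")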
Using the Knowledge to Action Process Model to Incite Clinical Change
ERIC Educational Resources Information Center
Petzold, Anita; Korner-Bitensky, Nicol; Menon, Anita
2010-01-01
Introduction: Knowledge translation (KT) has only recently emerged in the field of rehabilitation with attention on creating effective KT interventions to increase clinicians' knowledge and use of evidence-based practice (EBP). The uptake of EBP is a complex process that can be facilitated by the use of the Knowledge to Action Process model. This…
ERIC Educational Resources Information Center
Tynjälä, Päivi; Virtanen, Anne; Klemola, Ulla; Kostiainen, Emma; Rasku-Puttonen, Helena
2016-01-01
The purpose of the study was to examine how social competence and other generic skills can be developed in teacher education using a pedagogical model called Integrative Pedagogy. This model is based on the idea of integrating the four basic components of expertise: Theoretical knowledge, practical knowledge, self-regulative knowledge, and…
The Importance and Weaknesses of the Productivist Industrial Model of Knowledge Production
ERIC Educational Resources Information Center
Persson, Roland S.
2010-01-01
To view contemporary Science as an industry is a very apt and timely stance. Ghassib's (2010) historical analysis of knowledge production, which he terms "A Productivist Industrial Model of Knowledge Production," is an interesting one. It is important, however, to observe that the outline of this model is based entirely on the production of…
ERIC Educational Resources Information Center
Lee, Chia-Jung; Kim, ChanMin
2014-01-01
This study presents a refined technological pedagogical content knowledge (also known as TPACK) based instructional design model, which was revised using findings from the implementation study of a prior model. The refined model was applied in a technology integration course with 38 preservice teachers. A case study approach was used in this…
Teachers' Perceptions of the Teaching of Acids and Bases in Swedish Upper Secondary Schools
ERIC Educational Resources Information Center
Drechsler, Michal; Van Driel, Jan
2009-01-01
We report in this paper on a study of chemistry teachers' perceptions of their teaching in upper secondary schools in Sweden, regarding models of acids and bases, especially the Bronsted and the Arrhenius model. A questionnaire consisting of a Likert-type scale was developed, which focused on teachers' knowledge of different models, knowledge of…
Conceptualization of an R&D Based Learning-to-Innovate Model for Science Education
NASA Astrophysics Data System (ADS)
Lai, Oiki Sylvia
The purpose of this research was to conceptualize an R&D based learning-to-innovate (LTI) model. The problem to be addressed was the lack of a theoretical LTI model, which would inform science pedagogy. The absorptive capacity (ACAP) lens was adopted to untangle the R&D LTI phenomenon into four learning processes: problem-solving via knowledge acquisition, incremental improvement via knowledge participation, scientific discovery via knowledge creation, and product design via knowledge productivity. The four knowledge factors were the latent factors and each factor had seven manifest elements as measured variables. The key objectives of the non-experimental quantitative survey were to measure the relative importance of the identified elements and to explore the underlying structure of the variables. A questionnaire was prepared and administered to more than 155 R&D professionals from four sectors - business, academic, government, and nonprofit. The results showed that every identified element was important to the R&D professionals in terms of improving the related type of innovation. The most important elements were highlighted to serve as building blocks for elaboration. In search of patterns in the data matrix, exploratory factor analysis (EFA) was performed. Principal component analysis was the first phase of EFA, used to extract factors, while maximum likelihood estimation (MLE) was used to estimate the model. EFA yielded the finding of two aspects within each kind of knowledge. Logical names were assigned to represent the nature of the subsets: problem and knowledge under knowledge acquisition, planning and participation under knowledge participation, exploration and discovery under knowledge creation, and construction and invention under knowledge productivity. These two constructs, within each kind of knowledge, added structure to the vague R&D based LTI model. The research questions and hypotheses testing were addressed using correlation analysis. The alternative hypotheses that there were positive relationships between knowledge factors and their corresponding types of innovation were accepted. In-depth study of each process is recommended in both research and application. Experimental tests are needed in order to ultimately present the LTI model to enhance the scientific knowledge absorptive capacity of learners and facilitate their innovation performance.
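As a rough illustration of the analysis pipeline described above (principal-component extraction followed by factor retention), the following sketch simulates Likert-style responses for seven items and applies the Kaiser criterion to the correlation-matrix eigenvalues. The data, loadings, and retention rule are illustrative assumptions; the study itself estimated the final model with maximum likelihood.

import numpy as np

# Simulate 155 respondents answering 7 items driven by two underlying aspects.
rng = np.random.default_rng(0)
n_respondents, n_items = 155, 7
latent = rng.normal(size=(n_respondents, 2))
loadings = np.array([[0.8, 0.1], [0.7, 0.2], [0.9, 0.0], [0.1, 0.8],
                     [0.0, 0.9], [0.2, 0.7], [0.5, 0.5]])
responses = latent @ loadings.T + rng.normal(scale=0.5, size=(n_respondents, n_items))

corr = np.corrcoef(responses, rowvar=False)        # item correlation matrix
eigvals = np.linalg.eigvalsh(corr)[::-1]           # principal-component extraction
n_factors = int(np.sum(eigvals > 1.0))             # Kaiser criterion: keep eigenvalues > 1
print("eigenvalues:", np.round(eigvals, 2))
print("factors retained:", n_factors)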
PSYCHE: An Object-Oriented Approach to Simulating Medical Education
Mullen, Jamie A.
1990-01-01
Traditional approaches to computer-assisted instruction (CAI) do not provide realistic simulations of medical education, in part because they do not utilize heterogeneous knowledge bases for their source of domain knowledge. PSYCHE, a CAI program designed to teach hypothetico-deductive psychiatric decision-making to medical students, uses an object-oriented implementation of an intelligent tutoring system (ITS) to model the student, domain expert, and tutor. It models the transactions between the participants in complex transaction chains, and uses heterogeneous knowledge bases to represent both domain and procedural knowledge in clinical medicine. This object-oriented approach is a flexible and dynamic approach to modeling, and represents a potentially valuable tool for the investigation of medical education and decision-making.
How much expert knowledge is it worth to put in conceptual hydrological models?
NASA Astrophysics Data System (ADS)
Antonetti, Manuel; Zappa, Massimiliano
2017-04-01
Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations on ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and invest most of their knowledge in constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain as realistic a spatial distribution of areas with different dominant runoff generation processes (DRPs) as possible, and to define plausible narrow value ranges for each model parameter. Since the modelling goal is most often exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results because of the equifinality of hydrological models, overfitting problems and the numerous uncertainty sources affecting runoff simulations. Therefore, to test to what extent expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' modelling framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.
Qian, Shie; Dunham, Mark E.
1996-01-01
A system and method for constructing a bank of filters which detect the presence of signals whose frequency content varies with time. The present invention includes a novel system and method for developing one or more time templates designed to match the received signals of interest, and the bank of matched filters uses the one or more time templates to detect the received signals. Each matched filter compares the received signal x(t) with a respective, unique time template that has been designed to approximate a form of the signals of interest. The robust time domain template is assumed to be of the form w(t) = A(t)cos{2πφ(t)}, and the present invention uses the trajectory of a joint time-frequency representation of x(t) as an approximation of the instantaneous frequency function φ'(t). First, numerous data samples of the received signal x(t) are collected. A joint time-frequency representation is then applied to represent the signal, preferably using the time-frequency distribution series (also known as the Gabor spectrogram). The joint time-frequency transformation represents the analyzed signal energy at time t and frequency f, P(t,f), which is a three-dimensional plot of time vs. frequency vs. signal energy. Then P(t,f) is reduced to a multivalued function f(t), a two-dimensional plot of time vs. frequency, using a thresholding process. Curve fitting steps are then performed on the time/frequency plot, preferably using Levenberg-Marquardt curve fitting techniques, to derive a general instantaneous frequency function φ'(t) which best fits the multivalued function f(t), the trajectory of the joint time-frequency domain representation of x(t). Integrating φ'(t) along t yields φ(t), which is then inserted into the form of the time template equation. A suitable amplitude A(t) is also preferably determined. Once the time template has been determined, one or more filters are developed, each of which uses a version or form of the time template.
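A stripped-down version of this template-construction pipeline can be sketched as follows, with an ordinary STFT spectrogram standing in for the Gabor spectrogram and a polynomial least-squares fit standing in for the Levenberg-Marquardt fit; the chirp signal and all parameters are illustrative.

import numpy as np
from scipy.signal import spectrogram

# Synthetic linear chirp: instantaneous frequency 50 + 200 t Hz.
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.cos(2 * np.pi * (50 * t + 100 * t ** 2))

# Joint time-frequency representation P(t, f), then ridge extraction with thresholding.
f, tt, P = spectrogram(x, fs=fs, nperseg=128, noverlap=96)
ridge = f[np.argmax(P, axis=0)]                 # dominant frequency per time frame
mask = P.max(axis=0) > 0.1 * P.max()            # keep frames with significant energy

# Fit phi'(t) to the ridge (degree-1 polynomial here), then integrate to get phi(t).
coeffs = np.polyfit(tt[mask], ridge[mask], deg=1)
phi_prime = np.polyval(coeffs, t)
phi = np.cumsum(phi_prime) / fs
A = np.ones_like(t)                             # constant amplitude assumed
template = A * np.cos(2 * np.pi * phi)          # time template w(t) = A(t)cos{2*pi*phi(t)}
print("estimated chirp rate:", round(coeffs[0], 1), "Hz/s (true value: 200)")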
Design Specifications for the Advanced Instructional Design Advisor (AIDA). Volume 1
1992-01-01
research; (3) Describe the knowledge base sufficient to support the varieties of knowledge to be represented in the AIDA model; (4) Document the...feasibility of continuing the development of the AIDA model. 2.3 Background. In Phase I of the AIDA project (Task 0006), (1) the AIDA concept was defined...the AIDA Model. A paper-based demonstration of the AIDA instructional design model was performed by using the model to develop a minimal application.
The Visual Representation and Acquisition of Driving Knowledge for Autonomous Vehicle
NASA Astrophysics Data System (ADS)
Zhang, Zhaoxia; Jiang, Qing; Li, Ping; Song, LiangTu; Wang, Rujing; Yu, Biao; Mei, Tao
2017-09-01
In this paper, the driving knowledge base of an autonomous vehicle is designed. Based on the driving knowledge modeling system, the driving knowledge of the autonomous vehicle is visually acquired, managed, stored, and maintained, which is of vital significance for creating a development platform for the intelligent decision-making systems of automatic-driving expert systems for autonomous vehicles.
Bridging the gap: simulations meet knowledge bases
NASA Astrophysics Data System (ADS)
King, Gary W.; Morrison, Clayton T.; Westbrook, David L.; Cohen, Paul R.
2003-09-01
Tapir and Krill are declarative languages for specifying actions and agents, respectively, that can be executed in simulation. As such, they bridge the gap between strictly declarative knowledge bases and strictly executable code. Tapir and Krill components can be combined to produce models of activity which can answer questions about mechanisms and processes using conventional inference methods and simulation. Tapir was used in DARPA's Rapid Knowledge Formation (RKF) project to construct models of military tactics from the Army Field Manual FM3-90. These were then used to build Courses of Action (COAs) which could be critiqued by declarative reasoning or via Monte Carlo simulation. Tapir and Krill can be read and written by non-knowledge engineers, making them an excellent vehicle for Subject Matter Experts to build and critique knowledge bases.
Model compilation: An approach to automated model derivation
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo
1990-01-01
An approach to automated model derivation for knowledge-based systems is introduced. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge-based system. An implemented example illustrates how this approach can be used to derive models of different precision and abstraction, tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task-specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.
A study of EMR-based medical knowledge network and its applications.
Zhao, Chao; Jiang, Jingchi; Xu, Zhiming; Guan, Yi
2017-05-01
Electronic medical records (EMRs) contain a wealth of medical knowledge that can be used for clinical decision support. We attempt to integrate this medical knowledge into a complex network and then implement a diagnosis model based on this network. The dataset of our study contains 992 records uniformly sampled from different departments of the hospital. In order to integrate the knowledge of these records, an EMR-based medical knowledge network (EMKN) is constructed. This network takes medical entities as nodes and co-occurrence relationships between two entities as edges. Selected properties of this network are analyzed. To make use of this network, a basic diagnosis model is implemented. Seven hundred records are randomly selected to re-construct the network, and the remaining 292 records are used as test records. The vector space model is applied to represent the relationships between diseases and symptoms. Because there may be more than one actual disease in a record, the recall rate over the first ten results and the average precision are adopted as evaluation measures. Compared with a random network of the same size, this network has a similar average path length but a much higher clustering coefficient. Additionally, direct correlations can be observed between the community structure and the real department classes in the hospital. For the diagnosis model, the vector space model using diseases as the basis obtains the best result: in 73.27% of the test records, at least one correct disease appears among the first ten results. We constructed an EMR-based medical knowledge network by extracting the medical entities. This network has the small-world and scale-free properties. Moreover, the community structure showed that entities in the same department have a tendency to self-aggregate. Based on this network, a diagnosis model was proposed. This model uses only the symptoms as inputs and is not restricted to a specific disease. The experiments conducted demonstrate that the EMKN is a simple and universal technique for integrating different medical knowledge from EMRs and can be used for clinical decision support. Copyright © 2017 Elsevier B.V. All rights reserved.
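The following toy sketch illustrates the two core ingredients described above: a disease-symptom co-occurrence network built from records, and a vector-space ranking of candidate diseases by cosine similarity to a new record's symptoms. The records and entities are invented, and the entity-extraction step is skipped.

import math
from collections import defaultdict

# Invented records; in the real system the entities are extracted from EMR text.
records = [
    {"diseases": ["pneumonia"], "symptoms": ["fever", "cough", "dyspnea"]},
    {"diseases": ["influenza"], "symptoms": ["fever", "cough", "myalgia"]},
    {"diseases": ["pneumonia"], "symptoms": ["cough", "dyspnea"]},
]

# Co-occurrence counts: disease node -> symptom node -> edge weight.
cooc = defaultdict(lambda: defaultdict(int))
for r in records:
    for d in r["diseases"]:
        for s in r["symptoms"]:
            cooc[d][s] += 1

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

query = {"fever": 1, "cough": 1, "dyspnea": 1}     # symptoms of a new test record
ranking = sorted(((cosine(cooc[d], query), d) for d in cooc), reverse=True)
print(ranking[:10])                                 # top-10 candidate diseases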
Image/video understanding systems based on network-symbolic models
NASA Astrophysics Data System (ADS)
Kuvich, Gary
2004-03-01
Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding that is an interpretation of visual information in terms of such knowledge models. Computer simulation models are built on the basis of graphs/networks. The human brain is found to be able to emulate similar graph/network models. Symbols, predicates and grammars naturally emerge in such networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type relational structure created via multilevel hierarchical compression of visual information. Primary areas provide active fusion of image features on a spatial grid-like structure, where nodes are cortical columns. Spatial logic and topology are naturally present in such structures. Mid-level vision processes, like perceptual grouping and separation of figure from ground, are special kinds of network transformations. They convert the primary image structure into a set of more abstract ones, which represent objects and the visual scene, making them easy to analyze by higher-level knowledge structures. Higher-level vision phenomena are results of such analysis. Composition of network-symbolic models combines learning, classification, and analogy together with higher-level model-based reasoning into a single framework, and it works similarly to frames and agents. Computational intelligence methods transform images into model-based knowledge representation. Based on such principles, an Image/Video Understanding system can convert images into knowledge models and resolve uncertainty and ambiguity. This allows creating intelligent computer vision systems for design and manufacturing.
Rasmussen's model of human behavior in laparoscopy training.
Wentink, M; Stassen, L P S; Alwayn, I; Hosman, R J A W; Stassen, H G
2003-08-01
Compared to aviation, where virtual reality (VR) training has been standardized and simulators have proven their benefits, the objectives, needs, and means of VR training in minimally invasive surgery (MIS) still have to be established. The aim of the study presented is to introduce Rasmussen's model of human behavior as a practical framework for the definition of the training objectives, needs, and means in MIS. Rasmussen distinguishes three levels of human behavior: skill-, rule-, and knowledge-based behavior. The training needs of a laparoscopic novice can be determined by identifying the specific skill-, rule-, and knowledge-based behavior that is required for performing safe laparoscopy. Future objectives of VR laparoscopy trainers should address all three levels of behavior. Although most commercially available simulators for laparoscopy aim at training skill-based behavior, it is especially the training of knowledge-based behavior during complications in surgery that will improve safety levels. However, the cost and complexity of a training means increase when the training objectives proceed from the training of skill-based behavior to the training of complex knowledge-based behavior. In aviation, human behavior models have been used successfully to integrate the training of skill-, rule-, and knowledge-based behavior in a full flight simulator. Understanding surgeon behavior is one of the first steps towards a future full-scale laparoscopy simulator.
Ueda, Yoshihiro; Fukunaga, Jun-Ichi; Kamima, Tatsuya; Adachi, Yumiko; Nakamatsu, Kiyoshi; Monzen, Hajime
2018-03-20
The aim of this study was to evaluate the performance of a commercial knowledge-based planning system in volumetric modulated arc therapy for prostate cancer at multiple radiation therapy departments. In each institute, more than 20 cases were assessed. For the knowledge-based planning (KBP), the estimated dose (ED) based on geometric and dosimetric information of plans was generated in the model. Lower and upper limits of the estimated dose were saved as dose-volume histograms for each organ at risk (OAR). To verify whether the models performed correctly, KBP was compared with manual optimization planning in two cases. The relationships between the EDs in the models and the ratio of the OAR volume overlapping the PTV to the whole organ volume (V_overlap/V_whole) were investigated. There were no significant dosimetric differences in OARs and PTV between manual optimization planning and knowledge-based planning. In knowledge-based planning, the differences between institutes in the volume ratios receiving 90% and 50% of the prescribed dose (V90 and V50) were more than 5.0% and 10.0%, respectively. The calculated doses with knowledge-based planning were between the upper and lower limits of the ED or slightly under the lower limit of the ED. The relationships between the lower limit of the ED and V_overlap/V_whole differed among the models. For the V90 and V50 of the rectum, the maximum differences in the lower limit of the ED among institutes were 8.2% and 53.5% when V_overlap/V_whole for the rectum was 10%. For the V90 and V50 of the bladder, the maximum differences in the lower limit of the ED among institutes were 15.1% and 33.1% when V_overlap/V_whole for the bladder was 10%. The organs' upper and lower limits of the ED in the models correlated closely with V_overlap/V_whole. It is important to determine whether the models in KBP match a different institute's plan design before the models can be shared.
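The key geometric covariate in this study, V_overlap/V_whole, and the check of a calculated dose-volume point against the model's estimated-dose band can be sketched as follows; the voxel masks, dose values, and limits are invented for illustration.

import numpy as np

# Hypothetical voxel masks for the PTV and one OAR (e.g. rectum).
rng = np.random.default_rng(0)
ptv_mask = rng.random((40, 40, 40)) < 0.10
oar_mask = rng.random((40, 40, 40)) < 0.15

# Fraction of the OAR overlapping the PTV.
v_overlap_over_v_whole = (ptv_mask & oar_mask).sum() / oar_mask.sum()
print(f"V_overlap/V_whole = {v_overlap_over_v_whole:.2%}")

# Compare a calculated plan value against the model's estimated-dose band (assumed numbers).
v90_lower, v90_upper = 10.0, 25.0     # lower/upper ED limits for V90 (% of organ volume)
v90_calculated = 18.0                 # calculated plan result (%)
print("V90 within estimated-dose band:", v90_lower <= v90_calculated <= v90_upper)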
Weinstein, Nathan; Mendoza, Luis
2013-01-01
The vulva of Caenorhabditis elegans has long been used as an experimental model of cell differentiation and organogenesis. While it is known that the signaling cascades of Wnt, Ras/MAPK, and NOTCH interact to form a molecular network, there is no consensus regarding its precise topology and dynamical properties. We inferred the molecular network and developed a multivalued synchronous discrete dynamic model to study its behavior. The model reproduces the patterns of activation reported for the following types of cell: vulval precursor, first fate, second fate, second fate with reversed polarity, third fate, and fusion fate. We simulated the fusion of cells, the determination of the first, second, and third fates, as well as the transition from the second to the first fate. We also used the model to simulate all possible single loss- and gain-of-function mutants, as well as some relevant double and triple mutants. Importantly, we associated most of these simulated mutants with multivulva, vulvaless, egg-laying defective, or defective polarity phenotypes. The model shows that it is necessary for RAL-1 to activate NOTCH signaling, since the repression of LIN-45 by RAL-1 would not suffice for a proper second fate determination in an environment lacking DSL ligands. We also found that the model requires the complex formed by LAG-1, LIN-12, and SEL-8 to inhibit the transcription of eff-1 in second fate cells. Our model is the largest reconstruction to date of the molecular network controlling the specification of vulval precursor cells and cell fusion control in C. elegans. According to our model, the process of fate determination in the vulval precursor cells is reversible, at least until the cells either fuse with the ventral hypoderm or divide, and therefore the cell fates must be maintained by the presence of extracellular signals. PMID:23785384
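The network itself is not reproduced here, but the modeling formalism, a multivalued synchronous discrete dynamic model, can be sketched with a toy three-node example: every node takes one of three activity levels, all nodes update simultaneously, and attractors are found by exhaustive iteration over initial states. The update rules below are invented and are not the C. elegans vulval network.

from itertools import product

LEVELS = (0, 1, 2)   # each node can be off, intermediate, or fully active

def step(state):
    """Synchronous update: all nodes read the same previous state."""
    a, b, c = state
    return (
        min(2, b + (1 if c == 0 else 0)),        # A is promoted by B, relieved when C is off
        2 if a == 2 else (1 if a == 1 else 0),   # B follows A's level
        0 if b == 2 else 1,                      # C is shut off by high B
    )

def attractor(state, max_steps=50):
    """Iterate until a state repeats; return the cycle (length 1 = fixed point)."""
    seen = []
    for _ in range(max_steps):
        if state in seen:
            return tuple(seen[seen.index(state):])
        seen.append(state)
        state = step(state)
    return None

attractors = {attractor(s) for s in product(LEVELS, repeat=3) if attractor(s)}
print(attractors)

In the full model each node corresponds to a signaling component (e.g. LIN-12, LIN-45, RAL-1) and the attractors correspond to cell fates.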
Toward a Computational Model of Tutoring.
ERIC Educational Resources Information Center
Woolf, Beverly Park
1992-01-01
Discusses the integration of instructional science and computer science. Topics addressed include motivation for building knowledge-based systems; instructional design issues, including cognitive models, representing student intentions, and student models and error diagnosis; representing tutoring knowledge; building a tutoring system, including…
Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane
2003-01-01
This poster describes the development of user-centered interfaces to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from a library to a web-based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: components used to store, manage, and retrieve data. Data management includes knowledge bases, database management...Application Development Tools and Methods: X/Open and POSIX APIs, Integrated Design Support System (IDS), Knowledge-Based Systems (KBS), Application...IDEF1x, Yourdon, Jackson System Design (JSD), Knowledge-Based Systems (KBSs), Structured Systems Development (SSD), Semantic Unification Meta-Model
A Thematic Analysis of Theoretical Models for Translational Science in Nursing: Mapping the Field
Mitchell, Sandra A.; Fisher, Cheryl A.; Hastings, Clare E.; Silverman, Leanne B.; Wallen, Gwenyth R.
2010-01-01
Background The quantity and diversity of conceptual models in translational science may complicate rather than advance the use of theory. Purpose This paper offers a comparative thematic analysis of the models available to inform knowledge development, transfer, and utilization. Method Literature searches identified 47 models for knowledge translation. Four thematic areas emerged: (1) evidence-based practice and knowledge transformation processes; (2) strategic change to promote adoption of new knowledge; (3) knowledge exchange and synthesis for application and inquiry; (4) designing and interpreting dissemination research. Discussion This analysis distinguishes the contributions made by leaders and researchers at each phase in the process of discovery, development, and service delivery. It also informs the selection of models to guide activities in knowledge translation. Conclusions A flexible theoretical stance is essential to simultaneously develop new knowledge and accelerate the translation of that knowledge into practice behaviors and programs of care that support optimal patient outcomes. PMID:21074646
Knowledge Engineering as a Component of the Curriculum for Medical Cybernetists.
Karas, Sergey; Konev, Arthur
2017-01-01
According to a new state educational standard, students who have chosen medical cybernetics as their major must develop a knowledge engineering competency. Previously, in the course "Clinical cybernetics", while practicing project-based learning, students designed automated workstations for medical personnel using client-server technology. The purpose of the article is to give insight into the project of a new educational module, "Knowledge engineering". Students will acquire expert knowledge by holding interviews and conducting surveys, and then they will formalize it. After that, students will encode the declarative expert knowledge in a network model and analyze the knowledge graph. Expert decision-making methods will be applied in software on the basis of a production model of knowledge. Project implementation will result not only in the development of analytical competencies among students, but also in the creation of a practically useful expert system based on student models to support medical decisions. Nowadays, this module is being tested in the educational process.
VIP: A knowledge-based design aid for the engineering of space systems
NASA Technical Reports Server (NTRS)
Lewis, Steven M.; Bellman, Kirstie L.
1990-01-01
The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems, is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users to rapidly develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
The Contact Dynamics method: A nonsmooth story
NASA Astrophysics Data System (ADS)
Dubois, Frédéric; Acary, Vincent; Jean, Michel
2018-03-01
When velocity jumps occur, the dynamics is said to be nonsmooth. For instance, in collections of contacting rigid bodies, jumps are caused by shocks and dry friction. Without compliance at the interface, contact laws are not only non-differentiable in the usual sense but also multi-valued. Modeling contacting bodies is of interest in order to understand the behavior of numerous mechanical systems such as flexible multi-body systems, granular materials or masonry. These granular materials behave puzzlingly either like a solid or a fluid, and a description in the frame of classical continuum mechanics would be welcome, though it is far from satisfactory nowadays. Jean-Jacques Moreau greatly contributed to convex analysis, functions of bounded variation, differential measure theory, and sweeping process theory: definitive mathematical tools to deal with nonsmooth dynamics. He converted all these underlying theoretical ideas into an original nonsmooth implicit numerical method called Contact Dynamics (CD), a robust and efficient method to simulate large collections of bodies with frictional contacts and impacts. The CD method offers a very interesting complementary alternative to the family of smoothed explicit numerical methods, often called the Distinct Elements Method (DEM). In this paper, developments and improvements of the CD method are presented together with a critical comparative review of the advantages and drawbacks of both approaches.
The Academic Knowledge Management Model of Small Schools in Thailand
ERIC Educational Resources Information Center
Tumtuma, Chamnan; Chantarasombat, Chalard; Yeamsang, Theerawat
2015-01-01
The Academic Knowledge Management Model of Small Schools in Thailand was created by research and development. The quantitative and qualitative data were collected via the following steps: a participatory workshop meeting, the formation of a team according to knowledge base, field study, brainstorming, group discussion, activities carried out…
Knowledge Management System Model for Learning Organisations
ERIC Educational Resources Information Center
Amin, Yousif; Monamad, Roshayu
2017-01-01
Based on the literature of knowledge management (KM), this paper reports on the progress of developing a new knowledge management system (KMS) model with components architecture that are distributed over the widely-recognised socio-technical system (STS) aspects to guide developers for selecting the most applicable components to support their KM…
A future Outlook: Web based Simulation of Hydrodynamic models
NASA Astrophysics Data System (ADS)
Islam, A. S.; Piasecki, M.
2003-12-01
Despite recent advances in presenting simulation results as 3D graphs or animated contours, the modeling user community still faces some shortcomings when trying to move around and analyze data. Typical problems include the lack of common platforms with a standard vocabulary for exchanging simulation results from different numerical models, insufficient descriptions of data (metadata), a lack of robust search and retrieval tools for data, and difficulties in reusing simulation domain knowledge. This research demonstrates how to create a shared simulation domain on the WWW and run a number of models through multi-user interfaces. First, metadata sets have been developed to describe hydrodynamic model data based on the geographic metadata standard (ISO 19115), extended to satisfy the needs of the hydrodynamic modeling community. The Extensible Markup Language (XML) is used to publish this metadata via the Resource Description Framework (RDF). A specific domain ontology for Web Based Simulation (WBS) has been developed to explicitly define the vocabulary for the knowledge-based simulation system. Subsequently, this knowledge-based system is converted into an object model using the Meta Object Facility (MOF). The knowledge-based system acts as a metamodel for the object-oriented system, which aids in reusing the domain knowledge. Specific simulation software has been developed based on the object-oriented model. Finally, all model data are stored in an object-relational database. Database back-ends help store, retrieve and query information efficiently. This research uses open source software and technology such as Java Servlets and JSP, the Apache web server, the Tomcat servlet engine, PostgreSQL databases, the Protégé ontology editor, RDQL and RQL for querying RDF at the semantic level, and the Jena Java API for RDF. We also use international standards such as the ISO 19115 metadata standard, and specifications such as XML, RDF, OWL, XMI, and UML. The final web-based simulation product is deployed as Web Archive (WAR) files, which are platform and OS independent and can be used on Windows, UNIX, or Linux. Keywords: Apache, ISO 19115, Java Servlet, Jena, JSP, Metadata, MOF, Linux, Ontology, OWL, PostgreSQL, Protégé, RDF, RDQL, RQL, Tomcat, UML, UNIX, Windows, WAR, XML
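As a small illustration of the metadata-publication step, the sketch below uses Python's rdflib (in place of the Jena/RDQL stack named above) to describe a hydrodynamic simulation run with a few ISO 19115-style fields and to query it with SPARQL, the modern counterpart of RDQL/RQL. The namespace, field names, and values are invented.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

WBS = Namespace("http://example.org/wbs#")       # hypothetical vocabulary for the WBS ontology
g = Graph()
run = URIRef("http://example.org/wbs/run/42")

# ISO 19115-style descriptive fields for one simulation run (illustrative only).
g.add((run, RDF.type, WBS.HydrodynamicSimulation))
g.add((run, WBS.title, Literal("Tidal circulation run, Delaware Bay")))
g.add((run, WBS.abstract, Literal("2D depth-averaged hydrodynamic simulation")))
g.add((run, WBS.spatialResolution, Literal("50 m")))
g.add((run, WBS.temporalExtent, Literal("2003-06-01/2003-06-30")))

print(g.serialize(format="turtle"))              # publishable RDF description

# Retrieve run titles with a SPARQL query.
for row in g.query("SELECT ?t WHERE { ?r <http://example.org/wbs#title> ?t }"):
    print(row.t)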
Smale, Melinda; Assima, Amidou; Kergna, Alpha; Thériault, Véronique; Weltzien, Eva
2018-01-01
Uptake of improved sorghum varieties in the Sudan Savanna of West Africa has been limited, despite the economic importance of the crop and long-term investments in sorghum improvement. One reason is that attaining substantial yield advantages has been difficult in this harsh, heterogeneous growing environment. The release in Mali of the first sorghum hybrids in Sub-Saharan Africa developed primarily from local germplasm has the potential to change this situation. Utilizing plot data collected in Mali, we explain the adoption of improved seed with an ordered logit model and apply a multivalued treatment effects model to measure impacts on farm families, differentiating between improved varieties and hybrids. Since farm families both consume and sell their sorghum, we consider effects on consumption patterns as well as productivity. Status within the household, conferred by gender combined with marital status, generation, and education, is strongly related to the improvement status of the sorghum seed planted in these extended-family households. Effects of hybrid use on yields are large, widening the range of food items consumed, reducing the share of sorghum in food purchases, and contributing to a greater share of the sorghum harvest sold. Use of improved seed appears to be associated with a shift toward consumption of other cereals, and also with greater sales shares. Findings support on-farm research concerning yield advantages, also suggesting that the use of well-adapted sorghum hybrids could contribute to diet diversification and the crop's commercialization by smallholders.
Using a knowledge-based planning solution to select patients for proton therapy.
Delaney, Alexander R; Dahele, Max; Tol, Jim P; Kuijper, Ingrid T; Slotman, Ben J; Verbakel, Wilko F A R
2017-08-01
Patient selection for proton therapy by comparing proton/photon treatment plans is time-consuming and prone to bias. RapidPlan™, a knowledge-based planning solution, uses plan libraries to model and predict organ-at-risk (OAR) dose-volume histograms (DVHs). We investigated whether RapidPlan, utilizing an algorithm based only on photon beam characteristics, could generate proton DVH predictions and whether these could correctly identify patients for proton therapy. Model_PROT and Model_PHOT comprised 30 head-and-neck cancer proton and photon plans, respectively. Proton and photon knowledge-based plans (KBPs) were made for ten evaluation patients. DVH-prediction accuracy was analyzed by comparing predicted vs. achieved mean OAR doses. KBPs and manual plans were compared using salivary gland and swallowing muscle mean doses. For illustration, patients were selected for protons if the predicted Model_PHOT mean dose minus the predicted Model_PROT mean dose (ΔPrediction) for combined OARs was ≥6 Gy, and benchmarked using achieved KBP doses. Achieved and predicted Model_PROT/Model_PHOT mean dose R² was 0.95/0.98. Generally, achieved mean dose for Model_PHOT/Model_PROT KBPs was respectively lower/higher than predicted. Comparing Model_PROT/Model_PHOT KBPs with manual plans, salivary and swallowing mean doses increased/decreased by <2 Gy, on average. ΔPrediction ≥6 Gy correctly selected 4 of 5 patients for protons. Knowledge-based DVH predictions can provide efficient, patient-specific selection for protons. A proton-specific RapidPlan solution could improve results. Copyright © 2017 Elsevier B.V. All rights reserved.
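The illustrative selection rule reported above reduces to a simple threshold on the difference between the two models' predicted mean OAR doses; a minimal sketch, with invented dose values, follows.

# Per-patient predicted mean OAR doses in Gy: (Model_PHOT, Model_PROT). Values are invented.
predicted_mean_dose = {
    "patient_01": (38.5, 29.1),
    "patient_02": (31.2, 28.9),
    "patient_03": (44.0, 35.7),
}
THRESHOLD_GY = 6.0   # selection threshold on the combined-OAR dose difference

for pid, (phot, prot) in predicted_mean_dose.items():
    delta_prediction = phot - prot
    choice = "select for protons" if delta_prediction >= THRESHOLD_GY else "keep photons"
    print(f"{pid}: dPrediction = {delta_prediction:.1f} Gy -> {choice}")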
ERIC Educational Resources Information Center
Domangue, Thomas J.; Mathews, Robert C.; Sun, Ron; Roussel, Lewis G.; Guidry, Claire E.
2004-01-01
Learners are able to use 2 different types of knowledge to perform a skill. One type is a conscious mental model, and the other is based on memories of instances. The authors conducted 3 experiments that manipulated training conditions designed to affect the availability of 1 or both types of knowledge about an artificial grammar. Participants…
NASA Astrophysics Data System (ADS)
Kuvich, Gary
2003-08-01
Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding that is an interpretation of visual information in terms of such knowledge models. The human brain is found to be able to emulate knowledge structures in the form of network-symbolic models. This implies an important paradigm shift in our knowledge about the brain, from neural networks to "cortical software". Symbols, predicates and grammars naturally emerge in such active multilevel hierarchical networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type decision structure created via multilevel hierarchical compression of visual information. Mid-level vision processes, like clustering, perceptual grouping, and separation of figure from ground, are special kinds of graph/network transformations. They convert the low-level image structure into a set of more abstract ones, which represent objects and the visual scene, making them easy to analyze by higher-level knowledge structures. Higher-level vision phenomena are results of such analysis. Composition of network-symbolic models works similarly to frames and agents, combining learning, classification, and analogy together with higher-level model-based reasoning into a single framework. Such models do not require supercomputers. Based on such principles, and using methods of computational intelligence, an Image Understanding system can convert images into network-symbolic knowledge models and effectively resolve uncertainty and ambiguity, providing a unifying representation for perception and cognition. That allows creating new intelligent computer vision systems for the robotic and defense industries.
An architecture for rule based system explanation
NASA Technical Reports Server (NTRS)
Fennel, T. R.; Johannes, James D.
1990-01-01
A system architecture is presented which incorporates both graphics and text into explanations provided by rule-based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of the user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.
Weaver, D; Sorrells-Jones, J
1999-09-01
Our economy is shifting from a hard goods and material products base to one in which knowledge is the primary mode of production. Organizations are experimenting with designs that support knowledge work by clustering individuals with different but complementary skills in focused teams. The goal is to increase applied knowledge that furthers the organization's strategic intent. The team-based knowledge work model holds promise for healthcare organizations that are under pressure to use knowledge to improve clinical care, integrate care across disciplines and settings, and accept accountability for costs. However, the shift from the traditional bureaucratic model to the flexible team-based design mandates changes in the design of the organization, the role of leadership, and the attributes of the teams and team members. In Part 2 of this three-part series, the authors explore the necessary design changes and the new roles for leadership, teams, and their members. Additionally, implications for healthcare clinicians, particularly nurses, are discussed.
Model-based diagnostics for Space Station Freedom
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.
1991-01-01
An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
Privacy preserving data anonymization of spontaneous ADE reporting system dataset.
Lin, Wen-Yang; Yang, Duen-Chuan; Wang, Jie-Teng
2016-07-18
To facilitate long-term safety surveillance of marketed drugs, many spontaneous reporting systems (SRSs) for ADR events have been established worldwide. Since the data collected by SRSs contain sensitive personal health information that should be protected to prevent the identification of individuals, this raises the issue of privacy-preserving data publishing (PPDP), that is, how to sanitize (anonymize) raw data before publishing. Although much work has been done on PPDP, very few studies have focused on protecting the privacy of SRS data, and none of the existing anonymization methods is well suited to SRS datasets, which exhibit characteristics such as rare events, multiple records per individual, and multi-valued sensitive attributes. We propose a new privacy model called MS(k, θ*)-bounding for protecting published spontaneous ADE reporting data from privacy attacks. Our model has the flexibility of varying privacy thresholds, i.e., θ*, for different sensitive values and takes the characteristics of SRS data into consideration. We also propose an anonymization algorithm for sanitizing the raw data to meet the requirements specified through the proposed model. Our algorithm adopts a greedy clustering strategy to group the records into clusters, conforming to an innovative anonymization metric that aims to minimize the privacy risk as well as maintain the data utility for ADR detection. An empirical study was conducted using the FAERS dataset from 2004Q1 to 2011Q4. We compared our model with four prevailing methods, including k-anonymity, (X, Y)-anonymity, multi-sensitive l-diversity, and (α, k)-anonymity, evaluated via two measures, Danger Ratio (DR) and Information Loss (IL), and considered three different scenarios of threshold setting for θ*: uniform, level-wise, and frequency-based. We also conducted experiments to inspect the impact of anonymized data on the strength of discovered ADR signals. With all three threshold settings for sensitive values, our method successfully prevents the disclosure of sensitive values (nearly all observed DRs are zero) without sacrificing too much data utility. With non-uniform threshold settings, level-wise or frequency-based, our MS(k, θ*)-bounding exhibits the best data utility and the least privacy risk among all the models. The experiments conducted on selected ADR signals from MedWatch show that only very small differences in signal strength (PRR or ROR) were observed. The results show that our method can effectively prevent the disclosure of patient-sensitive information without sacrificing data utility for ADR signal detection. We propose a new privacy model for protecting SRS data that possesses characteristics overlooked by contemporary models, and an anonymization algorithm to sanitize SRS data in accordance with the proposed model. Empirical evaluation on the real SRS dataset, i.e., FAERS, shows that our method can effectively solve the privacy problem in SRS data without influencing the ADR signal strength.
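To make the clustering idea concrete, the following is a minimal sketch of a greedy, cluster-based anonymization loop in the spirit of the description above; the record layout, distance function, and suppression policy are illustrative assumptions, not the authors' actual MS(k, θ*)-bounding algorithm.

```python
# Minimal sketch: greedily grow clusters of at least k records and keep only
# clusters whose sensitive-value frequencies respect the per-value thresholds
# theta. Record format and distance are assumptions for illustration.
from collections import Counter

def violates_theta(cluster, theta):
    """True if any sensitive value is over-represented in the cluster."""
    counts = Counter(v for rec in cluster for v in rec["sensitive"])
    return any(counts[v] / len(cluster) > theta.get(v, 1.0) for v in counts)

def greedy_anonymize(records, k, theta, distance):
    remaining = list(records)
    clusters, suppressed = [], []
    while remaining:
        seed = remaining.pop(0)
        remaining.sort(key=lambda r: distance(seed, r))   # nearest records first
        cluster = [seed] + [remaining.pop(0) for _ in range(min(k - 1, len(remaining)))]
        if len(cluster) >= k and not violates_theta(cluster, theta):
            clusters.append(cluster)
        else:
            suppressed.extend(cluster)   # a full algorithm would merge or generalize instead
    return clusters, suppressed
```

Here distance would compare quasi-identifiers (for example age, gender, and reported drug), and each retained cluster would then have its quasi-identifiers generalized before publication.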
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
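As a purely illustrative sketch of how such an ontology could be queried through a SPARQL service, the snippet below uses rdflib; the namespace, class and property names, and the local ontology file are hypothetical, since the abstract does not specify the HP ontology's actual schema.

```python
# Hypothetical query against a local copy of a hydrologic-process ontology.
from rdflib import Graph

g = Graph()
g.parse("hp_ontology.owl", format="xml")  # assumed RDF/XML file name

query = """
PREFIX hp: <http://example.org/hp#>
SELECT ?process ?equation WHERE {
    ?process a hp:HydrologicProcess ;
             hp:hasEquation ?equation .
}
"""
for process, equation in g.query(query):
    print(process, equation)
```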
An, Gary
2009-01-01
The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused on determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.
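As an illustration of how a qualitative statement can be turned into a dynamic representation, here is a minimal agent-based sketch for a made-up rule ("activated cells recruit their neighbours"); the grid size, recruitment probability, and the rule itself are assumptions, not a published biomedical model.

```python
# Minimal agent-based sketch: an activated agent may recruit its neighbours.
import random

SIZE, STEPS, P_RECRUIT = 20, 50, 0.3
grid = [[0] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = 1  # one initially activated agent

def neighbours(i, j):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < SIZE and 0 <= nj < SIZE:
            yield ni, nj

for _ in range(STEPS):
    new = [row[:] for row in grid]
    for i in range(SIZE):
        for j in range(SIZE):
            if grid[i][j]:
                for ni, nj in neighbours(i, j):
                    if not grid[ni][nj] and random.random() < P_RECRUIT:
                        new[ni][nj] = 1  # recruitment of a neighbouring agent
    grid = new

print("activated agents:", sum(map(sum, grid)))
```

Running such a model exposes the kind of nonintuitive, population-level behavior (saturation, patchy spread) that instantiated conceptual models are meant to reveal.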
Examining Collaborative Knowledge Construction in Microblogging-Based Learning Environments
ERIC Educational Resources Information Center
Luo, Tian; Clifton, Lacey
2017-01-01
Aim/Purpose: The purpose of the study is to provide foundational research to exemplify how knowledge construction takes place in microblogging-based learning environments, to understand learner interaction representing the knowledge construction process, and to analyze learner perception, thereby suggesting a model of delivery for microblogging.…
Pragmatic User Model Implementation in an Intelligent Help System.
ERIC Educational Resources Information Center
Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen
1998-01-01
Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…
Supporting Students' Knowledge Transfer in Modeling Activities
ERIC Educational Resources Information Center
Piksööt, Jaanika; Sarapuu, Tago
2014-01-01
This study investigates ways to enhance secondary school students' knowledge transfer in complex science domains by implementing question prompts. Two samples of students applied two web-based models to study molecular genetics--the model of genetic code (n = 258) and translation (n = 245). For each model, the samples were randomly divided into…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sazonov, S. V., E-mail: sazonov.sergey@gmail.com; Ustinov, N. V., E-mail: n-ustinov@mail.ru
The nonlinear propagation of extremely short electromagnetic pulses in a medium of symmetric and asymmetric molecules placed in static magnetic and electric fields is theoretically studied. Asymmetric molecules differ in that they have nonzero permanent dipole moments in stationary quantum states. A system of wave equations is derived for the ordinary and extraordinary components of pulses. It is shown that this system can be reduced in some cases to a system of coupled Ostrovsky equations and to an equation integrable by the inverse scattering transform method, including the vector version of the Ostrovsky–Vakhnenko equation. Different types of solutions of this system are considered. Only solutions representing the superposition of periodic solutions are single-valued, whereas soliton and breather solutions are multivalued.
Sistani, Masiar; Staudinger, Philipp; Greil, Johannes; Holzbauer, Martin; Detz, Hermann; Bertagnolli, Emmerich; Lugstein, Alois
2017-08-09
Conductance quantization at room temperature is a key requirement for utilizing ballistic transport in, e.g., high-performance, low-power-dissipation transistors operating at the upper limit of "on"-state conductance, or in multivalued logic gates. So far, studies of conductance quantization have been restricted to high-mobility materials at ultralow temperatures and have required sophisticated nanostructure formation techniques and precise lithography for contact formation. Utilizing a thermally induced exchange reaction between single-crystalline Ge nanowires and Al pads, we achieved monolithic Al-Ge-Al NW heterostructures with ultrasmall Ge segments contacted by self-aligned, quasi-one-dimensional crystalline Al leads. By integration in electrostatically modulated back-gated field-effect transistors, we demonstrate the first experimental observation of room-temperature quantum ballistic transport in Ge, favorable for integration in complementary metal-oxide-semiconductor platform technology.
Efficient level set methods for constructing wavefronts in three spatial dimensions
NASA Astrophysics Data System (ADS)
Cheng, Li-Tien
2007-10-01
Wavefront construction in geometrical optics has long faced the twin difficulties of dealing with multi-valued forms and resolution of wavefront surfaces. A recent change in viewpoint, however, has demonstrated that working in phase space on bicharacteristic strips using eulerian methods can bypass both difficulties. The level set method for interface dynamics makes a suitable choice for the eulerian method. Unfortunately, in three-dimensional space, the setting of interest for most practical applications, the advantages of this method are largely offset by a new problem: the high dimension of phase space. In this work, we present new types of level set algorithms that remove this obstacle and demonstrate their abilities to accurately construct wavefronts under high resolution. These results propel the level set method forward significantly as a competitive approach in geometrical optics under realistic conditions.
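For orientation, the following is a minimal one-dimensional sketch of the Eulerian level-set machinery the paper builds on (first-order upwind advection of a front at constant speed); the grid, speed, and time step are illustrative and do not reflect the paper's phase-space algorithms.

```python
# Minimal level-set sketch: advect phi so its zero level set (the "front")
# moves at constant speed, using a first-order upwind difference.
import numpy as np

N, steps = 200, 100
dx, dt = 1.0 / N, 0.5 / N
x = np.linspace(0.0, 1.0, N)
phi = x - 0.3          # zero level set starts at x = 0.3
speed = 1.0            # constant normal speed in the +x direction

for _ in range(steps):
    dphi = np.empty_like(phi)
    dphi[1:] = (phi[1:] - phi[:-1]) / dx   # upwind gradient for speed > 0
    dphi[0] = dphi[1]
    phi = phi - dt * speed * dphi

front = x[np.argmin(np.abs(phi))]
print(f"front position after {steps} steps: {front:.3f}")
```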
Graphical representation of robot grasping quality measures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varma, V.; Tasch, U.
1993-11-01
When an object is held by a multi-fingered hand, the values of the contact forces can be multivalued. An objective function, when used in conjunction with the frictional and geometric constraints of the grasp, can, however, give a unique set of finger force values. The selection of the objective function in determining the finger forces is dependent on the type of grasp required, the material properties of the object, and the limitations of the robot fingers. In this paper several optimization functions are studied and their merits highlighted. A graphical representation of the finger force values and the objective function is introduced that enables one to select and compare various grasping configurations. The impending motion of the object at different torque and finger force values is determined by observing the normalized coefficient-of-friction plots.
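A hedged sketch of the underlying idea, that an objective function plus equilibrium and friction constraints picks out a unique force set, is given below for a planar two-finger grasp using scipy; the geometry, friction coefficient, and objective are illustrative assumptions, not the paper's formulations.

```python
# Minimize squared normal forces subject to force balance and a friction cone,
# for a toy planar grasp with two opposing fingers.
import numpy as np
from scipy.optimize import minimize

mu = 0.4                          # assumed friction coefficient
W = np.array([0.0, -9.81])        # external force on a 1 kg object (planar)
normals = np.array([[1.0, 0.0], [-1.0, 0.0]])   # contact normals of the two fingers
tangents = np.array([[0.0, 1.0], [0.0, 1.0]])   # contact tangents

def net_force(x):
    n, t = x[0::2], x[1::2]       # x = [n1, t1, n2, t2]
    return normals.T @ n + tangents.T @ t

objective = lambda x: np.sum(x[0::2] ** 2)                    # squared normal forces
balance = {"type": "eq", "fun": lambda x: net_force(x) + W}   # static equilibrium
cones = [{"type": "ineq", "fun": lambda x, i=i: mu * x[2 * i] - abs(x[2 * i + 1])}
         for i in range(2)]

res = minimize(objective, x0=np.ones(4), constraints=[balance] + cones)
print("finger forces [n1, t1, n2, t2]:", res.x.round(2))
```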
A computational exploration of the McCoy-Tracy-Wu solutions of the third Painlevé equation
NASA Astrophysics Data System (ADS)
Fasondini, Marco; Fornberg, Bengt; Weideman, J. A. C.
2018-01-01
The method recently developed by the authors for the computation of the multivalued Painlevé transcendents on their Riemann surfaces (Fasondini et al., 2017) is used to explore families of solutions to the third Painlevé equation that were identified by McCoy et al. (1977) and which contain a pole-free sector. Limiting cases, in which the solutions are singular functions of the parameters, are also investigated and it is shown that a particular set of limiting solutions is expressible in terms of special functions. Solutions that are single-valued, logarithmically (infinitely) branched and algebraically branched, with any number of distinct sheets, are encountered. The algebraically branched solutions have multiple pole-free sectors on their Riemann surfaces that are accounted for by using asymptotic formulae and Bäcklund transformations.
Incorporating Resilience into Dynamic Social Models
2016-07-20
... solved by simply using the information provided by the scenario. Instead, additional knowledge is required from relevant fields that study these ... resilience function by leveraging Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network framework [5],[6]. BKBs allow for inferencing ... reasoning network framework based on Bayesian Knowledge Bases (BKBs). BKBs are central to our social resilience framework as they are used to ...
Methods and systems for detecting abnormal digital traffic
Goranson, Craig A [Kennewick, WA; Burnette, John R [Kennewick, WA
2011-03-22
Aspects of the present invention encompass methods and systems for detecting abnormal digital traffic by assigning characterizations of network behaviors according to knowledge nodes and calculating a confidence value based on the characterizations from at least one knowledge node and on weighting factors associated with the knowledge nodes. The knowledge nodes include a characterization model based on prior network information. At least one of the knowledge nodes should not be based on fixed thresholds or signatures. The confidence value includes a quantification of the degree of confidence that the network behaviors constitute abnormal network traffic.
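A minimal sketch of the described aggregation step follows; the knowledge-node names, scores, and weights are invented for illustration and are not from the patent.

```python
# Each knowledge node characterizes observed network behaviour with a score in
# [0, 1]; a weighted combination yields a confidence that traffic is abnormal.
def confidence(characterizations, weights):
    """Weighted average of per-node abnormality characterizations."""
    total = sum(weights.values())
    return sum(characterizations[n] * w for n, w in weights.items()) / total

nodes = {"flow_volume": 0.8, "port_entropy": 0.6, "peer_novelty": 0.9}
weights = {"flow_volume": 2.0, "port_entropy": 1.0, "peer_novelty": 1.5}
print(f"abnormal-traffic confidence: {confidence(nodes, weights):.2f}")
```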
Understanding Elementary Astronomy by Making Drawing-Based Models
NASA Astrophysics Data System (ADS)
van Joolingen, W. R.; Aukes, Annika V. A.; Gijlers, H.; Bollen, L.
2015-04-01
Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a drawing-based model of the solar system. The results show that children in the target age group are capable of creating a drawing-based model of the solar system and can use it to show the situations in which eclipses occur. Structural equation modeling predicting post-test knowledge scores based on learners' pre-test knowledge scores, the quality of their drawings and motivational aspects yielded some evidence that such drawing contributes to learning. Consequences for using modeling with young children are considered.
Use of occupancy models to evaluate expert knowledge-based species-habitat relationships
Iglecia, Monica N.; Collazo, Jaime A.; McKerrow, Alexa
2012-01-01
Expert knowledge-based species-habitat relationships are used extensively to guide conservation planning, particularly when data are scarce. Purported relationships describe the initial state of knowledge, but are rarely tested. We assessed support in the data for suitability rankings of vegetation types based on expert knowledge for three terrestrial avian species in the South Atlantic Coastal Plain of the United States. Experts used published studies, natural history, survey data, and field experience to rank vegetation types as optimal, suitable, and marginal. We used single-season occupancy models, coupled with land cover and Breeding Bird Survey data, to examine the hypothesis that patterns of occupancy conformed to species-habitat suitability rankings purported by experts. Purported habitat suitability was validated for two of three species. As predicted for the Eastern Wood-Pewee (Contopus virens) and Brown-headed Nuthatch (Sitta pusilla), occupancy was strongly influenced by vegetation types classified as “optimal habitat” by the species suitability rankings for nuthatches and wood-pewees. Contrary to predictions, Red-headed Woodpecker (Melanerpes erythrocephalus) models that included vegetation types as covariates received similar support by the data as models without vegetation types. For all three species, occupancy was also related to sampling latitude. Our results suggest that covariates representing other habitat requirements might be necessary to model occurrence of generalist species like the woodpecker. The modeling approach described herein provides a means to test expert knowledge-based species-habitat relationships, and hence, help guide conservation planning.
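For readers unfamiliar with the statistical machinery, the sketch below shows a minimal single-season occupancy likelihood (occupancy probability psi, detection probability p) fitted by maximum likelihood; the detection histories are invented toy data, not Breeding Bird Survey records, and no covariates such as vegetation type or latitude are included.

```python
# Minimal MacKenzie-style single-season occupancy likelihood.
import numpy as np
from scipy.optimize import minimize

histories = np.array([[1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 0]])  # sites x visits

def neg_log_lik(params):
    psi, p = 1 / (1 + np.exp(-params))        # logit -> probability
    ll = 0.0
    for h in histories:
        d, n = h.sum(), len(h)
        if d > 0:   # detected at least once: the site must be occupied
            ll += np.log(psi) + d * np.log(p) + (n - d) * np.log(1 - p)
        else:       # never detected: occupied-but-missed or truly unoccupied
            ll += np.log(psi * (1 - p) ** n + 1 - psi)
    return -ll

fit = minimize(neg_log_lik, x0=np.array([0.0, 0.0]))
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"psi = {psi_hat:.2f}, p = {p_hat:.2f}")
```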
Software-engineering challenges of building and deploying reusable problem solvers.
O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A
2009-11-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
Crowley, D Max; Greenberg, Mark T; Feinberg, Mark E; Spoth, Richard L; Redmond, Cleve R
2012-02-01
A substantial challenge in improving public health is how to facilitate the local adoption of evidence-based interventions (EBIs). To do so, an important step is to build local stakeholders' knowledge and decision-making skills regarding the adoption and implementation of EBIs. One EBI delivery system, called PROSPER (PROmoting School-community-university Partnerships to Enhance Resilience), has effectively mobilized community prevention efforts, implemented prevention programming with quality, and consequently decreased youth substance abuse. While these results are encouraging, another objective is to increase local stakeholder knowledge of best practices for adoption, implementation and evaluation of EBIs. Using a mixed methods approach, we assessed local stakeholder knowledge of these best practices over 5 years, in 28 intervention and control communities. Results indicated that the PROSPER partnership model led to significant increases in expert knowledge regarding the selection, implementation, and evaluation of evidence-based interventions. Findings illustrate the limited programming knowledge possessed by members of local prevention efforts, the difficulty of complete knowledge transfer, and highlight one method for cultivating that knowledge.
Developing Statistical Knowledge for Teaching during Design-Based Research
ERIC Educational Resources Information Center
Groth, Randall E.
2017-01-01
Statistical knowledge for teaching is not precisely equivalent to statistics subject matter knowledge. Teachers must know how to make statistics understandable to others as well as understand the subject matter themselves. This dual demand on teachers calls for the development of viable teacher education models. This paper offers one such model,…
García-Alonso, Carlos; Pérez-Naranjo, Leonor
2009-01-01
Introduction: Knowledge management, based on information transfer between experts and analysts, is crucial for the validity and usability of data envelopment analysis (DEA). Aim: To design and develop a methodology i) to assess the technical efficiency of small health areas (SHAs) in an uncertain environment, and ii) to transfer information between experts and operational models, in both directions, in order to improve expert knowledge. Method: A procedure derived from knowledge discovery from data (KDD) is used to select, interpret and weigh DEA inputs and outputs. Based on the KDD results, an expert-driven Monte-Carlo DEA model has been designed to assess the technical efficiency of SHAs in Andalusia. Results: In terms of probability, SHA 29 is the most efficient, whereas SHA 22 is very inefficient. 73% of the analysed SHAs have a probability of being efficient (Pe) >0.9 and 18% <0.5. Conclusions: Expert knowledge is necessary to design and validate any operational model. KDD techniques make the transfer of information from experts to any operational model easy, and the results obtained from the latter improve expert knowledge.
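To illustrate the kind of computation involved, the following is a hedged sketch of an input-oriented CCR DEA efficiency score solved as a linear program, wrapped in a small Monte Carlo loop to mimic input uncertainty; the data, noise model, and efficiency cut-off are illustrative assumptions, not the study's expert-driven model.

```python
# Input-oriented CCR DEA efficiency via linear programming, with a toy
# Monte Carlo loop over perturbed inputs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[5.0, 8.0, 7.0], [3.0, 1.0, 4.0]])   # inputs  (m x n DMUs)
Y = np.array([[4.0, 5.0, 6.0]])                     # outputs (s x n DMUs)

def efficiency(X, Y, o):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[:, [o]], X]                     # sum_j lam_j x_ij <= theta * x_io
    A_out = np.c_[np.zeros((s, 1)), -Y]             # sum_j lam_j y_rj >= y_ro
    res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

rng = np.random.default_rng(0)
scores = [efficiency(X * rng.normal(1.0, 0.05, X.shape), Y, o=0) for _ in range(200)]
print(f"P(DMU 0 efficient) ~ {np.mean(np.array(scores) >= 0.999):.2f}")
```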
Speech recognition: Acoustic-phonetic knowledge acquisition and representation
NASA Astrophysics Data System (ADS)
Zue, Victor W.
1988-09-01
The long-term research goal is to develop and implement speaker-independent continuous speech recognition systems. It is believed that the proper utilization of speech-specific knowledge is essential for such advanced systems. This research is thus directed toward the acquisition, quantification, and representation of acoustic-phonetic and lexical knowledge, and the application of this knowledge to speech recognition algorithms. In addition, we are exploring new speech recognition alternatives based on artificial intelligence and connectionist techniques. We developed a statistical model for predicting the acoustic realization of stop consonants in various positions in the syllable template. A unification-based grammatical formalism was developed for incorporating this model into the lexical access algorithm. We provided an information-theoretic justification for the hierarchical structure of the syllable template. We analyzed segmented duration for vowels and fricatives in continuous speech. Based on contextual information, we developed durational models for vowels and fricatives that account for over 70 percent of the variance, using data from multiple, unknown speakers. We rigorously evaluated the ability of human spectrogram readers to identify stop consonants spoken by many talkers and in a variety of phonetic contexts. Incorporating the declarative knowledge used by the readers, we developed a knowledge-based system for stop identification. We achieved system performance comparable to that of the readers.
Gebru, Kerstin; Willman, Ania
2003-01-01
As Sweden changes toward a multicultural society, scientific knowledge of transcultural nursing care becomes increasingly important. Earlier studies in Swedish nursing education have demonstrated a lack of knowledge base in transcultural nursing. Through an extensive review of the literature, a didactic model was developed to help facilitate the establishment of this body of knowledge in transcultural nursing. The article demonstrates how the model applies the content and structure of Leininger's theory of culture care diversity and universality and ethnonursing method in a 3-year nursing program in theory as well as clinical education. The model includes a written guide for faculty members, with references to scientific articles and documents to be used.
ERIC Educational Resources Information Center
Lee, Chia-Jung; Kim, ChanMin
2017-01-01
This paper presents the third version of a technological pedagogical content knowledge (TPACK) based instructional design model that incorporates the distinctive, transformative, and integrative views of TPACK into a comprehensive actionable framework. Strategies of relating TPACK domains to real-life learning experiences, role-playing, and…
VHBuild.com: A Web-Based System for Managing Knowledge in Projects.
ERIC Educational Resources Information Center
Li, Heng; Tang, Sandy; Man, K. F.; Love, Peter E. D.
2002-01-01
Describes an intelligent Web-based construction project management system called VHBuild.com which integrates project management, knowledge management, and artificial intelligence technologies. Highlights include an information flow model; time-cost optimization based on genetic algorithms; rule-based drawing interpretation; and a case-based…
Risk Management of New Microelectronics for NASA: Radiation Knowledge-base
NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.
2004-01-01
Contents include the following: NASA missions (implications for reliability and radiation constraints); approach to insertion of new technologies; technology knowledge-base development; technology model/tool development and validation; summary comments.
NASA Astrophysics Data System (ADS)
Nam, Younkyeong
2012-06-01
This review explores Ben-Zvi Assaraf, Eshach, Orion, and Alamour's paper titled "Cultural Differences and Students' Spontaneous Models of the Water Cycle: A Case Study of Jewish and Bedouin Children in Israel" by examining how the authors use the concept of spontaneous mental models to explain the cultural knowledge sources behind Bedouin children's mental models of water in nature compared with those of Jewish children. My response to Ben-Zvi Assaraf et al.'s work expands upon their explanations of the Bedouin children's cultural knowledge sources. Bedouin children's mental models are based on their culture, religion, place of living, and everyday life practices related to water. I suggest a different knowledge source for spontaneous mental models of water in nature based on the unique history and traditions of South Korea, where people think about water in nature in different ways. This forum also addresses how western science dominates the South Korean science curriculum and the ways of assessing students' conceptual understanding of scientific concepts. Additionally, I argue that western science curriculum models could diminish Korean students' understanding of the natural world, which is based on Korean cultural ways of thinking about it. Finally, I also suggest two different ways of considering this unique knowledge source for more culturally relevant teaching of Earth system education.
Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985
NASA Technical Reports Server (NTRS)
1986-01-01
The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.
A Framework for Understanding Physics Students' Computational Modeling Practices
NASA Astrophysics Data System (ADS)
Lunk, Brandon Robert
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature on programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, which can be influenced by the problem representation. Third, a set of solution approaches---many of which were identified in this study---describe what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices.
Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.
Knowledge-Based Hierarchies: Using Organizations to Understand the Economy
ERIC Educational Resources Information Center
Garicano, Luis; Rossi-Hansberg, Esteban
2015-01-01
Incorporating the decision of how to organize the acquisition, use, and communication of knowledge into economic models is essential to understand a wide variety of economic phenomena. We survey the literature that has used knowledge-based hierarchies to study issues such as the evolution of wage inequality, the growth and productivity of firms,…
ERIC Educational Resources Information Center
Guldberg, Karen; Parsons, Sarah; Porayska-Pomsta, Kaska; Keay-Bright, Wendy
2017-01-01
Experimental intervention studies constitute the current dominant research designs in the autism education field. Such designs are based on a "knowledge-transfer" model of evidence-based practice in which research is conducted by researchers, and is then "transferred" to practitioners to enable them to implement evidence-based…
An, Gary C
2010-01-01
The greatest challenge facing the biomedical research community is the effective translation of basic mechanistic knowledge into clinically effective therapeutics. This challenge is most evident in attempts to understand and modulate "systems" processes/disorders, such as sepsis, cancer, and wound healing. Formulating an investigatory strategy for these issues requires the recognition that these are dynamic processes. Representation of the dynamic behavior of biological systems can aid in the investigation of complex pathophysiological processes by augmenting existing discovery procedures by integrating disparate information sources and knowledge. This approach is termed Translational Systems Biology. Focusing on the development of computational models capturing the behavior of mechanistic hypotheses provides a tool that bridges gaps in the understanding of a disease process by visualizing "thought experiments" to fill those gaps. Agent-based modeling is a computational method particularly well suited to the translation of mechanistic knowledge into a computational framework. Utilizing agent-based models as a means of dynamic hypothesis representation will be a vital means of describing, communicating, and integrating community-wide knowledge. The transparent representation of hypotheses in this dynamic fashion can form the basis of "knowledge ecologies," where selection between competing hypotheses will apply an evolutionary paradigm to the development of community knowledge.
Knowledge modeling tool for evidence-based design.
Durmisevic, Sanja; Ciftcioglu, Ozer
2010-01-01
The aim of this study is to take evidence-based design (EBD) to the next level by activating available knowledge, integrating new knowledge, and combining them for more efficient use by the planning and design community. This article outlines a framework for a performance-based measurement tool that can provide the necessary decision support during the design or evaluation of a healthcare environment by estimating the overall design performance of multiple variables. New knowledge in EBD adds continuously to complexity (the "information explosion"), and it becomes impossible to consider all aspects (design features) at the same time, much less their impact on final building performance. How can existing knowledge and the information explosion in healthcare, specifically the domain of EBD, be rendered manageable? Is it feasible to create a computational model that considers many design features and deals with them in an integrated way, rather than one at a time? The evidence found is structured and readied for computation through a "fuzzification" process. The weights are calculated using an analytic hierarchy process. Actual knowledge modeling is accomplished through a fuzzy neural tree structure. The impact of all inputs on the outcome, in this case patient recovery, is calculated using sensitivity analysis. Finally, the added value of the model is discussed using a hypothetical case study of a patient room. The proposed model can deal with the complexities of various aspects and the relationships among variables in a coordinated way, allowing existing and new pieces of evidence to be integrated in a knowledge tree structure that facilitates understanding of the effects of various design interventions on overall design performance.
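As an illustration of the weighting step only, the snippet below derives criterion weights from a pairwise-comparison matrix with the analytic hierarchy process (principal-eigenvector method); the comparison values and criteria are invented, and the fuzzification and fuzzy-neural-tree stages are not shown.

```python
# AHP priority weights from a pairwise-comparison matrix (illustrative values).
import numpy as np

# Pairwise comparisons of three hypothetical design features
# (e.g. daylight, noise level, view to nature).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                      # normalized priority weights
print("AHP weights:", w.round(3))
```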
Modelling robot's behaviour using finite automata
NASA Astrophysics Data System (ADS)
Janošek, Michal; Žáček, Jaroslav
2017-07-01
This paper proposes a model of a robot's behaviour described by finite automata. We split the robot's knowledge into several knowledge bases, which are used by the inference mechanism of the robot's expert system to make logical deductions. Each knowledge base is dedicated to a particular behaviour domain, and the finite automaton helps us switch among these knowledge bases according to the actual situation. Our goal is to simplify and reduce the complexity of one big knowledge base by splitting it into several pieces. The advantage of this model is that we can easily add new behaviour by adding a new knowledge base, adding the behaviour to the finite automaton, and defining the necessary states and transitions.
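A minimal sketch of this idea follows: a finite automaton selects which knowledge base the inference step consults, and events trigger transitions between behaviour domains; the states, events, and knowledge-base contents are illustrative assumptions.

```python
# Finite automaton switching between behaviour-specific knowledge bases.
TRANSITIONS = {
    ("explore", "obstacle_seen"): "avoid",
    ("avoid", "path_clear"): "explore",
    ("explore", "battery_low"): "dock",
}

KNOWLEDGE_BASES = {
    "explore": {"goal": "map unknown area"},
    "avoid": {"goal": "steer around obstacle"},
    "dock": {"goal": "return to charger"},
}

state = "explore"
for event in ["obstacle_seen", "path_clear", "battery_low"]:
    state = TRANSITIONS.get((state, event), state)   # stay put if no transition
    print(event, "->", state, "| active KB goal:", KNOWLEDGE_BASES[state]["goal"])
```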
Research and application of knowledge resources network for product innovation.
Li, Chuan; Li, Wen-qiang; Li, Yan; Na, Hui-zhen; Shi, Qian
2015-01-01
In order to enhance the capabilities of knowledge services in a product innovation design service platform, a method of acquiring knowledge resources supporting product innovation from the Internet and providing active knowledge push is proposed. Through knowledge modeling for product innovation based on ontology, the integrated architecture of a knowledge resources network is put forward. The technology for the acquisition of network knowledge resources based on focused crawlers and web services is studied. Active knowledge push is provided for users through user behavior analysis and knowledge evaluation in order to improve users' enthusiasm for participation in the platform. Finally, an application example is presented to demonstrate the effectiveness of the method.
Formal Representations of Eligibility Criteria: A Literature Review
Weng, Chunhua; Tu, Samson W.; Sim, Ida; Richesson, Rachel
2010-01-01
Standards-based, computable knowledge representations for eligibility criteria are increasingly needed to provide computer-based decision support for automated research participant screening, clinical evidence application, and clinical research knowledge management. We surveyed the literature and identified five aspects of eligibility criteria knowledge representations that contribute to the various research and clinical applications: the intended use of computable eligibility criteria, the classification of eligibility criteria, the expression language for representing eligibility rules, the encoding of eligibility concepts, and the modeling of patient data. We consider three of them (expression language, codification of eligibility concepts, and patient data modeling) to be essential constructs of a formal knowledge representation for eligibility criteria. The requirements for each of the three knowledge constructs vary for different use cases, which therefore should inform the development and choice of the constructs toward cost-effective knowledge representation efforts. We discuss the implications of our findings for standardization efforts toward sharable knowledge representation of eligibility criteria. PMID:20034594
A Portal of Educational Resources: Providing Evidence for Matching Pedagogy with Technology
ERIC Educational Resources Information Center
Di Blas, Nicoletta; Fiore, Alessandro; Mainetti, Luca; Vergallo, Roberto; Paolini, Paolo
2014-01-01
The TPACK (Technology, Pedagogy and Content Knowledge) model presents the three types of knowledge that are necessary to implement a successful technology-based educational activity. It highlights how the intersections between TPK (Technological Pedagogical Knowledge), PCK (Pedagogical Content Knowledge) and TCK (Technological Content Knowledge)…
NASA Astrophysics Data System (ADS)
Yusoff, Mohd Zairol; Mahmuddin, Massudi; Ahmad, Mazida
2016-08-01
Knowledge and skill are necessary to develop the capability of knowledge workers. However, there is very little understanding of what the necessary knowledge work (KW) is and how it influences the quality of knowledge work, or knowledge work productivity (KWP), in the software development process, including in small and medium-sized enterprises (SMEs). SMEs constitute a major part of the economy, yet they have been relatively unsuccessful in developing KWP. Accordingly, this paper seeks to explore the dimensions of KWP that affect the quality of KW in the SME environment. First, based on an analysis of the existing literature, the key characteristics of KW productivity are defined. Second, a conceptual model is proposed that explores the dimensions of KWP and its quality. This study analyses data collected from 150 respondents (based on [1]) who are involved in SMEs in Malaysia and validates the model using structural equation modeling (SEM). The results provide an analysis of the effect of KWP on the quality of KW and business success, and have significant relevance for both research and practice in the SME sector.
Nonlinear multilayers as optical limiters
NASA Astrophysics Data System (ADS)
Turner-Valle, Jennifer Anne
1998-10-01
In this work we present a non-iterative technique for computing the steady-state optical properties of nonlinear multilayers and we examine nonlinear multilayer designs for optical limiters. Optical limiters are filters with intensity-dependent transmission designed to curtail the transmission of incident light above a threshold irradiance value in order to protect optical sensors from damage due to intense light. Thin film multilayers composed of nonlinear materials exhibiting an intensity-dependent refractive index are used as the basis for optical limiter designs in order to enhance the nonlinear filter response by magnifying the electric field in the nonlinear materials through interference effects. The nonlinear multilayer designs considered in this work are based on linear optical interference filter designs which are selected for their spectral properties and electric field distributions. Quarter wave stacks and cavity filters are examined for their suitability as sensor protectors and their manufacturability. The underlying non-iterative technique used to calculate the optical response of these filters derives from recognizing that the multi-valued calculation of output irradiance as a function of incident irradiance may be turned into a single-valued calculation of incident irradiance as a function of output irradiance. Finally, the benefits and drawbacks of using nonlinear multilayer for optical limiting are examined and future research directions are proposed.
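To illustrate the inversion described in the last sentences, the sketch below uses a generic textbook-style nonlinear cavity transfer relation (not the authors' multilayer designs): sweeping the output irradiance and evaluating the incident irradiance is single-valued, even where the forward output-versus-input curve is multi-valued.

```python
# Sweep the *output* irradiance and compute the incident irradiance from an
# Airy-type bistability relation; negative-slope samples mark the branches
# where the forward map (output vs. input) would be multi-valued.
import numpy as np

F, phi0, n2 = 30.0, -1.0, 0.5          # finesse-like factor, detuning, Kerr scale
I_out = np.linspace(1e-4, 6.0, 2000)
I_in = I_out * (1.0 + F * np.sin(phi0 + n2 * I_out) ** 2)

negative_slope_samples = np.sum(np.diff(I_in) < 0)
print("samples on a negative-slope (multi-valued) branch:", negative_slope_samples)
```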
Radial sets: interactive visual analysis of large overlapping sets.
Alsallakh, Bilal; Aigner, Wolfgang; Miksch, Silvia; Hauser, Helwig
2013-12-01
In many applications, data tables contain multi-valued attributes that often store the memberships of the table entities in multiple sets, such as which languages a person masters, which skills an applicant documents, or which features a product comes with. With a growing number of entities, the resulting element-set membership matrix becomes very rich in information about how these sets overlap. Many analysis tasks targeted at set-typed data are concerned with these overlaps as salient features of such data. This paper presents Radial Sets, a novel visual technique to analyze set memberships for a large number of elements. Our technique uses frequency-based representations to enable quickly finding and analyzing different kinds of overlaps between the sets, and relating these overlaps to other attributes of the table entities. Furthermore, it enables various interactions to select elements of interest, find out if they are over-represented in specific sets or overlaps, and if they exhibit a different distribution for a specific attribute compared to the rest of the elements. These interactions allow formulating highly expressive visual queries on the elements in terms of their set memberships and attribute values. As we demonstrate via two usage scenarios, Radial Sets enable revealing and analyzing a multitude of overlapping patterns between large sets, beyond the limits of state-of-the-art techniques.
Unbiased feature selection in learning random forests for high-dimensional data.
Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi
2015-01-01
Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This makes RFs have poor accuracy when working with high-dimensional data. Besides that, RFs have bias in the feature selection process where multivalued features are favored. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while allowing one to reduce dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets including image datasets. The experimental results have shown that RFs with the proposed approach outperformed the existing random forests in increasing the accuracy and the AUC measures.
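A hedged sketch of the two ingredients described, p-value-based removal of uninformative features and weighted sampling of candidate split features, is shown below on synthetic data; the scoring and weighting scheme are illustrative and not the xRF algorithm itself.

```python
# (i) drop features whose association with the class is not significant,
# (ii) sample candidate split features with weights tied to informativeness.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = rng.integers(0, 2, size=200)
X[:, 0] += y                        # make feature 0 informative

pvals = np.array([f_oneway(X[y == 0, j], X[y == 1, j]).pvalue for j in range(X.shape[1])])
keep = np.where(pvals < 0.05)[0]    # p-value filter

weights = 1.0 - pvals[keep]
weights /= weights.sum()
candidates = rng.choice(keep, size=min(5, len(keep)), replace=False, p=weights)
print("kept features:", keep, "| candidates at this node:", candidates)
```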
Parallel heat transport in integrable and chaotic magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del-Castillo-Negrete, Diego B; Chacon, Luis
2012-01-01
The study of transport in magnetized plasmas is a problem of fundamental interest in controlled fusion, space plasma, and astrophysics research. Three issues make this problem particularly challenging: (i) the extreme anisotropy between the parallel (i.e., along the magnetic field) and the perpendicular conductivities (their ratio may exceed 10^10 in fusion plasmas); (ii) magnetic field line chaos, which in general complicates (and may preclude) the construction of magnetic field line coordinates; and (iii) nonlocal parallel transport in the limit of small collisionality. Motivated by these issues, we present a Lagrangian Green's function method to solve the local and non-local parallel transport equation, applicable to integrable and chaotic magnetic fields in arbitrary geometry. The method avoids by construction the numerical pollution issues of grid-based algorithms. The potential of the approach is demonstrated with nontrivial applications to integrable (magnetic island chain), weakly chaotic (devil's staircase), and fully chaotic magnetic field configurations. For the latter, numerical solutions of the parallel heat transport equation show that the effective radial transport, with local and non-local closures, is non-diffusive, casting doubt on the applicability of quasilinear diffusion descriptions. General conditions for the existence of non-diffusive, multivalued flux-gradient relations in the temperature evolution are derived.
Modelling students' knowledge organisation: Genealogical conceptual networks
NASA Astrophysics Data System (ADS)
Koponen, Ismo T.; Nousiainen, Maija
2018-04-01
Learning scientific knowledge is largely based on understanding what its key concepts are and how they are related. The relational structure of concepts also affects how concepts are introduced in teaching scientific knowledge. We model here how students organise their knowledge when they represent their understanding of how physics concepts are related. The model is based on the assumptions that students use simple basic linking motifs in introducing new concepts and mostly relate them to concepts that were introduced a few steps earlier, i.e. following a genealogical ordering. The resulting genealogical networks have relatively high local clustering coefficients of nodes but otherwise resemble networks obtained with an identical degree distribution of nodes but with random linking between them (i.e. the configuration model). However, a few key nodes having a special structural role emerge, and these nodes have higher than average communicability betweenness centrality. These features agree with the empirically found properties of students' concept networks.
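The growth rule can be sketched as follows with networkx; the window size, number of links per new concept, and the uniform choice among earlier concepts are illustrative assumptions rather than the paper's fitted linking motifs.

```python
# Genealogical growth: each new concept links to a few recently introduced ones.
import random
import networkx as nx

random.seed(1)
G = nx.Graph()
G.add_node(0)
WINDOW, LINKS = 5, 2      # how far back new concepts look, links per new concept

for new in range(1, 60):
    recent = list(range(max(0, new - WINDOW), new))
    for target in random.sample(recent, min(LINKS, len(recent))):
        G.add_edge(new, target)

print("average clustering:", round(nx.average_clustering(G), 3))
centrality = nx.communicability_betweenness_centrality(G)
print("max communicability betweenness:", round(max(centrality.values()), 3))
```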
NED-IIS: An Intelligent Information System for Forest Ecosystem Management
W.D. Potter; S. Somasekar; R. Kommineni; H.M. Rauscher
1999-01-01
We view an Intelligent Information System (IIS) as composed of a unified knowledge base, database, and model base. The model base includes decision support models, forecasting models, and visualization models, for example. In addition, we feel that the model base should include domain-specific problem-solving modules as well as decision support models. This, then,...
ERIC Educational Resources Information Center
Leask, Marilyn; Younie, Sarah
2013-01-01
If teacher quality is the most critical factor in improving educational outcomes, then why is so little attention drawn to the knowledge and evidence base available to support teachers in improving the quality of their professional knowledge? This paper draws together findings from a range of sources to propose national models for continuing…
Momentum Concept in the Process of Knowledge Construction
ERIC Educational Resources Information Center
Ergul, N. Remziye
2013-01-01
Abstraction is one of the methods for learning, through mental processes, knowledge that cannot be obtained through experiment and observation. The RBC model, which is based on abstraction in the process of creating knowledge, is directly related to mental processes. In this study, the RBC model is used for the high school students' processes of…
ERIC Educational Resources Information Center
Rampai, Nattaphon; Sopeerak, Saroch
2011-01-01
This research explores the model of knowledge management and web technology for teachers' professional development, as well as its impact in the classroom on learning and teaching, especially on pre-service teachers' competency and practices that refer to the knowledge creating, analyzing, nurturing, disseminating, and optimizing process as part…
ERIC Educational Resources Information Center
Harlow, Danielle B.; Bianchini, Julie A.; Swanson, Lauren H.; Dwyer, Hilary A.
2013-01-01
We used a "knowledge in pieces" perspective on teacher learning to document undergraduates' pedagogical resources in a model-based physics course for potential teachers. We defined pedagogical resources as small, discrete ideas about teaching science that are applied appropriately or inappropriately in specific contexts. Neither…
Kunstaetter, Robert
1986-01-01
This presentation describes the design and implementation of a knowledge based physiologic modeling system (KBPMS) and a preliminary evaluation of its use as a learning resource within the context of an experimental medical curriculum -- the Harvard New Pathway. KBPMS possesses combined numeric and qualitative simulation capabilities and can provide explanations of its knowledge and behaviour. It has been implemented on a microcomputer with a user interface incorporating interactive graphics. The preliminary evaluation of KBPMS is based on anecdotal data which suggests that the system might have pedagogic potential. Much work remains to be done in enhancing and further evaluating KBPMS.
Lamprecht, Daniel; Strohmaier, Markus; Helic, Denis; Nyulas, Csongor; Tudorache, Tania; Noy, Natalya F.; Musen, Mark A.
2015-01-01
The need to examine the behavior of different user groups is a fundamental requirement when building information systems. In this paper, we present Ontology-based Decentralized Search (OBDS), a novel method to model the navigation behavior of users equipped with different types of background knowledge. Ontology-based Decentralized Search combines decentralized search, an established method for navigation in social networks, and ontologies to model navigation behavior in information networks. The method uses ontologies as an explicit representation of background knowledge to inform the navigation process and guide it towards navigation targets. By using different ontologies, users equipped with different types of background knowledge can be represented. We demonstrate our method using four biomedical ontologies and their associated Wikipedia articles. We compare our simulation results with base line approaches and with results obtained from a user study. We find that our method produces click paths that have properties similar to those originating from human navigators. The results suggest that our method can be used to model human navigation behavior in systems that are based on information networks, such as Wikipedia. This paper makes the following contributions: (i) To the best of our knowledge, this is the first work to demonstrate the utility of ontologies in modeling human navigation and (ii) it yields new insights and understanding about the mechanisms of human navigation in information networks. PMID:26568745
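The core of the method can be sketched as a greedy decentralized search in which the navigator only sees the current node's neighbours and ranks them by distance to the target in a background ontology. The code below is a toy illustration in the spirit of OBDS, not the authors' implementation; the example network and the tree standing in for an ontology are placeholders.

```python
# Toy sketch of ontology-guided decentralized (greedy) search: neighbours
# are ranked by their distance to the target in a background "ontology".
import networkx as nx

def decentralized_search(network, ontology, start, target, max_hops=50):
    def knowledge_distance(a, b):
        try:
            return nx.shortest_path_length(ontology, a, b)
        except nx.NetworkXNoPath:
            return float("inf")
    path, current = [start], start
    for _ in range(max_hops):
        if current == target:
            return path
        neighbours = list(network.neighbors(current))
        if not neighbours:
            break
        current = min(neighbours, key=lambda n: knowledge_distance(n, target))
        path.append(current)
    return path                               # may not reach the target within max_hops

# hypothetical example data: an information network and a small "ontology"
network = nx.karate_club_graph()
ontology = nx.balanced_tree(2, 5)             # stands in for background knowledge
print(decentralized_search(network, ontology, start=0, target=33))
```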
A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring
NASA Technical Reports Server (NTRS)
Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.
1992-01-01
In this system we develop a fuzzy case-based reasoner that can build a case representation for several past detected anomalies, and we develop case retrieval methods that use fuzzy sets to index a relevant case when a new problem (case) is presented. The choice of fuzzy sets is justified by the uncertainty in the data. The new problem can be solved using knowledge of the model along with the old cases. The system can then be used to generalize the knowledge from previous cases and use this generalization to refine the existing model definition. This in turn can help to detect failures using the model-based algorithms.
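The retrieval idea can be sketched with a fuzzy similarity measure over case features. The code below is illustrative only; the feature names, membership shape, and similarity measure are invented, not taken from the NASA tool.

```python
# A minimal sketch of fuzzy case retrieval over a small case base of
# past anomalies; features and thresholds are hypothetical.
def triangular(x, a, b, c):
    """Triangular fuzzy membership with peak at b and support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def similarity(new_case, old_case, spread=0.2):
    """Average fuzzy match over shared features (hypothetical measure)."""
    scores = []
    for feature, value in new_case.items():
        old = old_case.get(feature)
        if old is not None:
            scores.append(triangular(value, old * (1 - spread), old, old * (1 + spread)))
    return sum(scores) / len(scores) if scores else 0.0

case_base = [
    {"chamber_pressure": 0.92, "turbine_temp": 0.70},   # past anomaly A
    {"chamber_pressure": 0.55, "turbine_temp": 0.95},   # past anomaly B
]
new = {"chamber_pressure": 0.90, "turbine_temp": 0.72}
best = max(case_base, key=lambda c: similarity(new, c))
print("most similar past anomaly:", best)
```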
Theory-based Bayesian models of inductive learning and reasoning.
Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles
2006-07-01
Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
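The interplay of structured prior knowledge and statistical inference can be illustrated with a small generalization example. The hypotheses, priors, and size-principle likelihood below are stand-ins for exposition, not the authors' models.

```python
# Tiny illustration of Bayesian generalization: a structured prior over
# candidate concepts plus a size-principle likelihood over observed examples.
hypotheses = {
    "even numbers":      {2, 4, 6, 8, 10, 12, 14, 16},
    "powers of two":     {2, 4, 8, 16},
    "numbers under 10":  {1, 2, 3, 4, 5, 6, 7, 8, 9},
}
prior = {"even numbers": 0.5, "powers of two": 0.3, "numbers under 10": 0.2}

def posterior(observed):
    post = {}
    for name, members in hypotheses.items():
        if all(x in members for x in observed):
            post[name] = prior[name] * (1.0 / len(members)) ** len(observed)
        else:
            post[name] = 0.0
    z = sum(post.values())
    return {name: p / z for name, p in post.items()} if z else post

print(posterior([2, 8]))      # the smaller consistent hypothesis gains support
```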
Learning a Health Knowledge Graph from Electronic Medical Records.
Rotmensch, Maya; Halpern, Yoni; Tlimat, Abdulhakim; Horng, Steven; Sontag, David
2017-07-20
Demand for clinical decision support systems in medicine and self-diagnostic symptom checkers has substantially increased in recent years. Existing platforms rely on knowledge bases manually compiled through a labor-intensive process or automatically derived using simple pairwise statistics. This study explored an automated process to learn high quality knowledge bases linking diseases and symptoms directly from electronic medical records. Medical concepts were extracted from 273,174 de-identified patient records and maximum likelihood estimation of three probabilistic models was used to automatically construct knowledge graphs: logistic regression, naive Bayes classifier and a Bayesian network using noisy OR gates. A graph of disease-symptom relationships was elicited from the learned parameters and the constructed knowledge graphs were evaluated and validated, with permission, against Google's manually-constructed knowledge graph and against expert physician opinions. Our study shows that direct and automated construction of high quality health knowledge graphs from medical records using rudimentary concept extraction is feasible. The noisy OR model produces a high quality knowledge graph reaching precision of 0.85 for a recall of 0.6 in the clinical evaluation. Noisy OR significantly outperforms all tested models across evaluation frameworks (p < 0.01).
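The noisy-OR parameterisation at the heart of the best-performing model is simple to state: a symptom is absent only if every active cause, and a leak term, fails to produce it. The diseases, symptoms, and probabilities below are illustrative placeholders, not values from the paper.

```python
# Minimal noisy-OR sketch for a disease-symptom edge model.
leak = 0.01                                 # P(symptom) with no listed cause active
activation = {                              # P(symptom | single disease), hypothetical
    ("pneumonia", "fever"): 0.80,
    ("pneumonia", "cough"): 0.85,
    ("migraine",  "fever"): 0.05,
}

def p_symptom(symptom, present_diseases):
    """Noisy-OR: symptom absent only if every active cause (and the leak) fails."""
    p_absent = 1.0 - leak
    for disease in present_diseases:
        p_absent *= 1.0 - activation.get((disease, symptom), 0.0)
    return 1.0 - p_absent

for diseases in [set(), {"migraine"}, {"pneumonia"}, {"pneumonia", "migraine"}]:
    print(sorted(diseases), "-> P(fever) =", round(p_symptom("fever", diseases), 3))
```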
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.
2016-01-01
Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
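The abstract states that BiGG Models exposes an application programming interface; a hypothetical access sketch is shown below. The endpoint path and JSON fields are assumptions and should be checked against the site's API documentation before use.

```python
# Hypothetical sketch of fetching a model record from the BiGG Models web API;
# the URL path and field names below are assumed, not confirmed by the abstract.
import json
import urllib.request

BASE = "http://bigg.ucsd.edu/api/v2"                 # assumed API root
with urllib.request.urlopen(f"{BASE}/models/e_coli_core") as resp:
    model = json.load(resp)
print(model.get("model_bigg_id"), "-", model.get("organism"))
```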
Knowledge-Based Object Detection in Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Boochs, F.; Karmacharya, A.; Marbs, A.
2012-07-01
Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach, such as the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and algorithmic processing.
NASA Astrophysics Data System (ADS)
Mercan, Fatih C.
This study examines epistemological beliefs of physics undergraduate and graduate students and faculty in the context of solving a well-structured and an ill-structured problem. The data collection consisted of a think-aloud problem solving session followed by a semi-structured interview conducted with 50 participants, 10 at each of the freshman, senior, masters, PhD, and faculty levels. The data analysis involved (a) identification of the range of beliefs about knowledge in the context of the well-structured and the ill-structured problem solving, (b) construction of a framework that unites the individual beliefs identified in each problem context under the same conceptual base, and (c) comparisons of the problem contexts and expertise level groups using the framework. The results of the comparison of the contexts of the well-structured and the ill-structured problem showed that (a) authoritative beliefs about knowledge were expressed in the well-structured problem context, (b) relativistic and religious beliefs about knowledge were expressed in the ill-structured problem context, and (c) rational, empirical, and modeling beliefs about knowledge were expressed in both problem contexts. The results of the comparison of the expertise level groups showed that (a) undergraduates expressed authoritative beliefs about knowledge more than graduate students did, and faculty did not express authoritative beliefs, (b) faculty expressed modeling beliefs about knowledge more than graduate students did, and undergraduates did not express modeling beliefs, and (c) there were no differences in rational, empirical, experiential, relativistic, and religious beliefs about knowledge among the expertise level groups. As the expertise level increased, the number of participants who expressed authoritative beliefs about knowledge decreased and the number of participants who expressed modeling-based beliefs about knowledge increased. The results of this study implied that existing developmental and cognitive models of personal epistemology can explain personal epistemology in physics to a limited extent; however, these models cannot adequately account for the variation of epistemological beliefs across problem contexts. Modeling beliefs about knowledge emerged as a part of personal epistemology and an indicator of epistemological sophistication, which do not develop until after extensive experience in the field. Based on these findings, the researcher recommended providing students with opportunities to practice model construction.
Linking Earth Observations and Models to Societal Information Needs: The Case of Coastal Flooding
NASA Astrophysics Data System (ADS)
Buzzanga, B. A.; Plag, H. P.
2016-12-01
Coastal flooding is expected to increase in many areas due to sea level rise (SLR). Many societal applications such as emergency planning and designing public services depend on information on how the flooding spectrum may change as a result of SLR. To identify the societal information needs, a conceptual model is needed that identifies the key stakeholders, applications, and information and observation needs. In the context of the development of the Global Earth Observation System of Systems (GEOSS), which is implemented by the Group on Earth Observations (GEO), the Socio-Economic and Environmental Information Needs Knowledge Base (SEE-IN KB) is developed as part of the GEOSS Knowledge Base. A core function of the SEE-IN KB is to facilitate the linkage of societal information needs to observations, models, information and knowledge. To achieve this, the SEE-IN KB collects information on objects such as user types, observational requirements, societal goals, models, and datasets. Comprehensive information concerning the interconnections between instances of these objects is used to capture the connectivity and to establish a conceptual model as a network of networks. The captured connectivity can be used in searches to allow users to discover products and services for their information needs, and providers to search for users and applications benefiting from their products. It also allows users to answer "What if?" questions and supports knowledge creation. We have used the SEE-IN KB to develop a conceptual model capturing the stakeholders in coastal flooding and their information needs, and to link these elements to objects. We show how the knowledge base enables the transition of scientific data to usable information by connecting individuals such as city managers to flood maps. Within the knowledge base, these same users can request information that improves their ability to make specific planning decisions. These needs are linked to entities within research institutions that have the capabilities to meet them. Further, current research such as that investigating precipitation-induced flooding under different SLR scenarios is linked to the users who benefit from the knowledge, effectively creating a bi-directional channel between science and society that increases knowledge and improves foresight.
Stone, Vathsala I; Lane, Joseph P
2012-05-16
Government-sponsored science, technology, and innovation (STI) programs support the socioeconomic aspects of public policies, in addition to expanding the knowledge base. For example, beneficial healthcare services and devices are expected to result from investments in research and development (R&D) programs, which assume a causal link to commercial innovation. Such programs are increasingly held accountable for evidence of impact-that is, innovative goods and services resulting from R&D activity. However, the absence of comprehensive models and metrics skews evidence gathering toward bibliometrics about research outputs (published discoveries), with less focus on transfer metrics about development outputs (patented prototypes) and almost none on econometrics related to production outputs (commercial innovations). This disparity is particularly problematic for the expressed intent of such programs, as most measurable socioeconomic benefits result from the last category of outputs. This paper proposes a conceptual framework integrating all three knowledge-generating methods into a logic model, useful for planning, obtaining, and measuring the intended beneficial impacts through the implementation of knowledge in practice. Additionally, the integration of the Context-Input-Process-Product (CIPP) model of evaluation proactively builds relevance into STI policies and programs while sustaining rigor. The resulting logic model framework explicitly traces the progress of knowledge from inputs, following it through the three knowledge-generating processes and their respective knowledge outputs (discovery, invention, innovation), as it generates the intended socio-beneficial impacts. It is a hybrid model for generating technology-based innovations, where best practices in new product development merge with a widely accepted knowledge-translation approach. Given the emphasis on evidence-based practice in the medical and health fields and "bench to bedside" expectations for knowledge transfer, sponsors and grantees alike should find the model useful for planning, implementing, and evaluating innovation processes. High-cost/high-risk industries like healthcare require the market deployment of technology-based innovations to improve domestic society in a global economy. An appropriate balance of relevance and rigor in research, development, and production is crucial to optimize the return on public investment in such programs. The technology-innovation process needs a comprehensive operational model to effectively allocate public funds and thereby deliberately and systematically accomplish socioeconomic benefits.
A prototype knowledge-based simulation support system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, T.R.; Roberts, S.D.
1987-04-01
As a preliminary step toward the goal of an intelligent automated system for simulation modeling support, we explore the feasibility of the overall concept by generating and testing a prototypical framework. A prototype knowledge-based computer system was developed to support a senior level course in industrial engineering so that the overall feasibility of an expert simulation support system could be studied in a controlled and observable setting. The system behavior mimics the diagnostic (intelligent) process performed by the course instructor and teaching assistants, finding logical errors in INSIGHT simulation models and recommending appropriate corrective measures. The system was programmed in a non-procedural language (PROLOG) and designed to run interactively with students working on course homework and projects. The knowledge-based structure supports intelligent behavior, providing its users with access to an evolving accumulation of expert diagnostic knowledge. The non-procedural approach facilitates the maintenance of the system and helps merge the roles of expert and knowledge engineer by allowing new knowledge to be easily incorporated without regard to the existing flow of control. The background, features and design of the system are described and preliminary results are reported. Initial success is judged to demonstrate the utility of the reported approach and support the ultimate goal of an intelligent modeling system which can support simulation modelers outside the classroom environment. Finally, future extensions are suggested.
Knowledge Management through the Equilibrium Pattern Model for Learning
NASA Astrophysics Data System (ADS)
Sarirete, Akila; Noble, Elizabeth; Chikh, Azeddine
Contemporary students are characterized by having very applied learning styles and methods of acquiring knowledge. This behavior is consistent with the constructivist models where students are co-partners in the learning process. In the present work the authors developed a new model of learning based on the constructivist theory coupled with the cognitive development theory of Piaget. The model considers the level of learning in several stages, and the move from one stage to another requires challenging the learner. Each time a new concept is introduced, it creates a disequilibrium that needs to be worked out before returning to the equilibrium stage. This process of "disequilibrium/equilibrium" has been analyzed and validated using a course in computer networking, part of the Cisco Networking Academy Program at Effat College, a women's college in Saudi Arabia. The model provides a theoretical foundation for teaching, especially in a complex knowledge domain such as engineering, and can be used in a knowledge economy.
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
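The control structure described above is easy to caricature in code. The sketch below is a bare-bones blackboard loop with stand-in knowledge sources, intended only to illustrate the hypothesise-and-test cycle; it is not the cephalometric system, and the feature names and numbers are invented.

```python
# A bare-bones blackboard architecture: knowledge sources post and refine
# hypotheses on shared state under a trivial control loop.
class Blackboard:
    def __init__(self):
        self.hypotheses = {}                  # shared state visible to all sources

class KnowledgeSource:
    def can_contribute(self, bb):             # precondition
        raise NotImplementedError
    def contribute(self, bb):                 # action: post/refine hypotheses
        raise NotImplementedError

class LocateFeature(KnowledgeSource):
    def can_contribute(self, bb):
        return "sella" not in bb.hypotheses
    def contribute(self, bb):
        bb.hypotheses["sella"] = {"location": (120, 85), "confidence": 0.6}

class RefineWithModel(KnowledgeSource):
    def can_contribute(self, bb):
        h = bb.hypotheses.get("sella")
        return h is not None and h["confidence"] < 0.9
    def contribute(self, bb):
        bb.hypotheses["sella"]["confidence"] = 0.9   # pretend a model-based test passed

def control_loop(blackboard, sources, max_cycles=10):
    for _ in range(max_cycles):
        runnable = [s for s in sources if s.can_contribute(blackboard)]
        if not runnable:
            break
        runnable[0].contribute(blackboard)    # trivial scheduler: first applicable source
    return blackboard.hypotheses

print(control_loop(Blackboard(), [LocateFeature(), RefineWithModel()]))
```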
Health-Related Fitness Knowledge Development through Project-Based Learning
ERIC Educational Resources Information Center
Hastle, Peter A.; Chen, Senlin; Guarino, Anthony J.
2017-01-01
Purpose: The purpose of this study was to examine the process and outcome of an intervention using the project-based learning (PBL) model to increase students' health-related fitness (HRF) knowledge. Method: The participants were 185 fifth-grade students from three schools in Alabama (PBL group: n = 109; control group: n = 76). HRF knowledge was…
SAFOD Brittle Microstructure and Mechanics Knowledge Base (SAFOD BM2KB)
NASA Astrophysics Data System (ADS)
Babaie, H. A.; Hadizadeh, J.; di Toro, G.; Mair, K.; Kumar, A.
2008-12-01
We have developed a knowledge base to store and present the data collected by a group of investigators studying the microstructures and mechanics of brittle faulting using core samples from the SAFOD (San Andreas Fault Observatory at Depth) project. The investigations are carried out with a variety of analytical and experimental methods primarily to better understand the physics of strain localization in fault gouge. The knowledge base instantiates a specially designed brittle rock deformation ontology developed at Georgia State University. The inference rules embedded in the semantic web languages used in our ontology, such as OWL, RDF, and RDFS, allow the Pellet reasoner used in this application to derive additional truths about the ontology and knowledge of this domain. Access to the knowledge base is via a public website, which is designed to provide the knowledge acquired by all the investigators involved in the project. The stored data will be products of studies such as: experiments (e.g., high-velocity friction experiment), analyses (e.g., microstructural, chemical, mass transfer, mineralogical, surface, image, texture), microscopy (optical, HRSEM, FESEM, HRTEM), tomography, porosity measurement, microprobe, and cathodoluminescence. Data about laboratories, experimental conditions, methods, assumptions, equipment, and the mechanical properties and lithology of the studied samples will also be presented on the website per investigation. The ontology was modeled using the UML (Unified Modeling Language) in Rational Rose, and implemented in OWL-DL (Ontology Web Language) using the Protégé ontology editor. The UML model was converted to OWL-DL by first mapping it to Ecore (.ecore) and Generator model (.genmodel) with the help of the EMF (Eclipse Modeling Framework) plugin in Eclipse. The Ecore model was then mapped to a .uml file, which was later converted into an .owl file and subsequently imported into the Protégé ontology editing environment. The web interface was developed in Java using Eclipse as the IDE. The web interfaces to query and submit data were implemented using JSP, servlets, JavaScript, and AJAX. The Jena API, a Java framework for building Semantic Web applications, was used to develop the web interface. Jena provided a programmatic environment for RDF, RDFS, OWL, and a SPARQL query engine. Building web applications with AJAX helps retrieve data from the server asynchronously in the background without interfering with the display and behavior of the existing page. The application was deployed on an Apache Tomcat server at GSU. The SAFOD BM2KB website provides user-friendly search, submit, feedback, and other services. The General Search option allows users to search the knowledge base by selecting the classes (e.g., Experiment, Surface Analysis), their respective attributes (e.g., apparatus, date performed), and the relationships to other classes (e.g., Sample, Laboratory). The Search by Sample option allows users to search the knowledge base based on sample number. The Search by Investigator option lets users search the knowledge base by choosing an investigator who is involved in this project. The website also allows users to submit new data. The Submit Data option opens a page where users can submit the SAFOD data to our knowledge base by selecting specific classes and attributes. The submitted data then become available for query as part of the knowledge base. The SAFOD BM2KB can be accessed from the main SAFOD website.
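The "Search by Sample" idea can be illustrated with a SPARQL query. The described system uses the Java-based Jena framework; purely for illustration, the sketch below uses Python's rdflib instead, and the namespace, class, and property names are hypothetical placeholders rather than terms from the actual brittle-deformation ontology.

```python
# Illustrative "search by sample number" over a tiny RDF graph, using rdflib
# (the real system uses Jena); all ontology terms here are placeholders.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

SAFOD = Namespace("http://example.org/safod#")      # placeholder namespace
g = Graph()
exp = SAFOD.experiment_001
g.add((exp, RDF.type, SAFOD.Experiment))
g.add((exp, SAFOD.usesSample, Literal("SAFOD-CORE-42")))
g.add((exp, SAFOD.apparatus, Literal("high-velocity friction rig")))

query = """
PREFIX safod: <http://example.org/safod#>
SELECT ?experiment ?apparatus WHERE {
    ?experiment a safod:Experiment ;
                safod:usesSample "SAFOD-CORE-42" ;
                safod:apparatus ?apparatus .
}
"""
for row in g.query(query):
    print(row.experiment, row.apparatus)
```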
Pre-Service Physics Teachers' Knowledge of Models and Perceptions of Modelling
ERIC Educational Resources Information Center
Ogan-Bekiroglu, Feral
2006-01-01
One of the purposes of this study was to examine the differences between knowledge of pre-service physics teachers who experienced model-based teaching in pre-service education and those who did not. Moreover, it was aimed to determine pre-service physics teachers' perceptions of modelling. Posttest-only control group experimental design was used…
Hubble Space Telescope Design Engineering Knowledgebase (HSTDEK)
NASA Technical Reports Server (NTRS)
Johannes, James D.; Everetts, Clark
1989-01-01
The research covered here pays specific attention to the development of tools to assist knowledge engineers in acquiring knowledge and to assist other technical, engineering, and management personnel in automatically performing knowledge capture as part of their everyday work without adding any extra work to what they already do. Requirements for data products, the knowledge base, and methods for mapping knowledge in the documents onto the knowledge representations are discussed, as are some of the difficulties of capturing in the knowledge base the structure of the design process itself, along with a model of the system designed. The capture of knowledge describing the interactions of different components is also discussed briefly.
Knowledge assistant for robotic environmental characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feddema, J.; Rivera, J.; Tucker, S.
1996-08-01
A prototype sensor fusion framework called the "Knowledge Assistant" has been developed and tested on a gantry robot at Sandia National Laboratories. This Knowledge Assistant guides the robot operator during the planning, execution, and post analysis stages of the characterization process. During the planning stage, the Knowledge Assistant suggests robot paths and speeds based on knowledge of sensors available and their physical characteristics. During execution, the Knowledge Assistant coordinates the collection of data through a data acquisition "specialist." During execution and postanalysis, the Knowledge Assistant sends raw data to other "specialists," which include statistical pattern recognition software, a neural network, and model-based search software. After the specialists return their results, the Knowledge Assistant consolidates the information and returns a report to the robot control system where the sensed objects and their attributes (e.g., estimated dimensions, weight, material composition, etc.) are displayed in the world model. This report highlights the major components of this system.
Predicting Mycobacterium tuberculosis Complex Clades Using Knowledge-Based Bayesian Networks
Bennett, Kristin P.
2014-01-01
We develop a novel approach for incorporating expert rules into Bayesian networks for classification of Mycobacterium tuberculosis complex (MTBC) clades. The proposed knowledge-based Bayesian network (KBBN) treats sets of expert rules as prior distributions on the classes. Unlike prior knowledge-based support vector machine approaches which require rules expressed as polyhedral sets, KBBN directly incorporates the rules without any modification. KBBN uses data to refine rule-based classifiers when the rule set is incomplete or ambiguous. We develop a predictive KBBN model for 69 MTBC clades found in the SITVIT international collection. We validate the approach using two testbeds that model knowledge of the MTBC obtained from two different experts and large DNA fingerprint databases to predict MTBC genetic clades and sublineages. These models represent strains of MTBC using high-throughput biomarkers called spacer oligonucleotide types (spoligotypes), since these are routinely gathered from MTBC isolates of tuberculosis (TB) patients. Results show that incorporating rules into problems can drastically increase classification accuracy if data alone are insufficient. The SITVIT KBBN is publicly available for use on the World Wide Web. PMID:24864238
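The central idea, expert rules acting as priors that data can refine, can be sketched very simply. The example below is not the published KBBN: it reduces the idea to pseudo-counts on a single class-conditional table, and the rule, feature, and counts are invented for illustration.

```python
# Simplified sketch: an expert rule contributes prior pseudo-counts to a
# class-conditional table, and observed (synthetic) strains update it.
from collections import Counter

classes = ["East-Asian", "Euro-American"]

# hypothetical expert rule: "this spoligotype feature favours East-Asian"
rule_pseudo_counts = Counter({("spacers_33_36_deleted", "East-Asian"): 8,
                              ("spacers_33_36_deleted", "Euro-American"): 2})

# hypothetical labelled spoligotype observations (feature, observed clade)
data = [("spacers_33_36_deleted", "East-Asian")] * 5 + \
       [("spacers_33_36_deleted", "Euro-American")] * 1

counts = rule_pseudo_counts + Counter(data)

def p_class_given_feature(feature):
    totals = {c: counts[(feature, c)] for c in classes}
    z = sum(totals.values())
    return {c: n / z for c, n in totals.items()}

print(p_class_given_feature("spacers_33_36_deleted"))
```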
Knowledge-based vision and simple visual machines.
Cliff, D; Noble, J
1997-01-01
The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684
Common IED exploitation target set ontology
NASA Astrophysics Data System (ADS)
Russomanno, David J.; Qualls, Joseph; Wowczuk, Zenovy; Franken, Paul; Robinson, William
2010-04-01
The Common IED Exploitation Target Set (CIEDETS) ontology provides a comprehensive semantic data model for capturing knowledge about sensors, platforms, missions, environments, and other aspects of systems under test. The ontology also includes representative IEDs, modeled as explosives, camouflage, concealment objects, and other background objects, which together comprise an overall threat scene. The ontology is represented using the Web Ontology Language and the SPARQL Protocol and RDF Query Language, which ensures portability of the acquired knowledge base across applications. The resulting knowledge base is a component of the CIEDETS application, which is intended to support the end user sensor test and evaluation community. CIEDETS associates a system under test to a subset of cataloged threats based on the probability that the system will detect the threat. The associations between systems under test, threats, and the detection probabilities are established based on a hybrid reasoning strategy, which applies a combination of heuristics and simplified modeling techniques. Besides supporting the CIEDETS application, which is focused on efficient and consistent system testing, the ontology can be leveraged in a myriad of other applications, including serving as a knowledge source for mission planning tools.
ERIC Educational Resources Information Center
Pauleen, David J.; Corbitt, Brian; Yoong, Pak
2007-01-01
Purpose: To provide a conceptual model for the discovery and articulation of emergent organizational knowledge, particularly knowledge that develops when people work with new technologies. Design/methodology/approach: The model is based on two widely accepted research methods--action learning and grounded theory--and is illustrated using a case…
When craft and science collide: Improving therapeutic practices through evidence-based innovations.
Justice, Laura M
2010-04-01
Evidence-based practice (EBP) is a model of clinical decision-making that is increasingly being advocated for use in the field of speech-language pathology. With the increased emphasis on scientific evidence as a form of knowledge important to EBP, clinicians may wonder whether their craft-based knowledge (i.e., knowledge derived from theory and practice) remains a legitimate form of knowledge for use in clinical decisions. This article describes forms of knowledge that may be used to address clinical questions, including both craft and science. Additionally, the steps used when engaging in EBP are described so that clinicians understand when and how craft comes into play. The major premise addressed within this article is that craft is a legitimate form of knowledge and that engagement in EBP requires one to employ craft-based knowledge.
Applying Service-Oriented Architecture on The Development of Groundwater Modeling Support System
NASA Astrophysics Data System (ADS)
Li, C. Y.; WANG, Y.; Chang, L. C.; Tsai, J. P.; Hsiao, C. T.
2016-12-01
Groundwater simulation has become an essential step in groundwater resources management and assessment. There are many stand-alone pre- and post-processing software packages that alleviate the model simulation workload, but the stand-alone software does not offer centralized management of data and simulation results, nor does it provide network sharing functions. Hence, it is difficult to share and reuse the data and knowledge (simulation cases) systematically within or across companies. Therefore, this study develops a centralized, network-based groundwater modeling support system to assist model construction. The system is based on a service-oriented architecture and allows remote users to develop their modeling cases over the internet. The data and cases (knowledge) are thus easy to manage centrally. MODFLOW, the most widely used groundwater model in the world, is the modeling engine of the system. The system provides a data warehouse to store groundwater observations, along with a MODFLOW Support Service, MODFLOW Input File & Shapefile Convert Service, MODFLOW Service, and Expert System Service to assist researchers in building models. Since the system architecture is service-oriented, it is scalable and flexible. The system can be easily extended to include scenario analysis and knowledge management to facilitate the reuse of groundwater modeling knowledge.
Using texts in science education: cognitive processes and knowledge representation.
van den Broek, Paul
2010-04-23
Texts form a powerful tool in teaching concepts and principles in science. How do readers extract information from a text, and what are the limitations in this process? Central to comprehension of and learning from a text is the construction of a coherent mental representation that integrates the textual information and relevant background knowledge. This representation engenders learning if it expands the reader's existing knowledge base or if it corrects misconceptions in this knowledge base. The Landscape Model captures the reading process and the influences of reader characteristics (such as working-memory capacity, reading goal, prior knowledge, and inferential skills) and text characteristics (such as content/structure of presented information, processing demands, and textual cues). The model suggests factors that can optimize--or jeopardize--learning science from text.
Boegl, Karl; Adlassnig, Klaus-Peter; Hayashi, Yoichi; Rothenfluh, Thomas E; Leitich, Harald
2004-01-01
This paper describes the fuzzy knowledge representation framework of the medical computer consultation system MedFrame/CADIAG-IV as well as the specific knowledge acquisition techniques that have been developed to support the definition of knowledge concepts and inference rules. As in its predecessor system CADIAG-II, fuzzy medical knowledge bases are used to model the uncertainty and the vagueness of medical concepts and fuzzy logic reasoning mechanisms provide the basic inference processes. The elicitation and acquisition of medical knowledge from domain experts has often been described as the most difficult and time-consuming task in knowledge-based system development in medicine. It comes as no surprise that this is even more so when unfamiliar representations like fuzzy membership functions are to be acquired. From previous projects we have learned that a user-centered approach is mandatory in complex and ill-defined knowledge domains such as internal medicine. This paper describes the knowledge acquisition framework that has been developed in order to make easier and more accessible the three main tasks of: (a) defining medical concepts; (b) providing appropriate interpretations for patient data; and (c) constructing inferential knowledge in a fuzzy knowledge representation framework. Special emphasis is laid on the motivations for some system design and data modeling decisions. The theoretical framework has been implemented in a software package, the Knowledge Base Builder Toolkit. The conception and the design of this system reflect the need for a user-centered, intuitive, and easy-to-handle tool. First results gained from pilot studies have shown that our approach can be successfully implemented in the context of a complex fuzzy theoretical framework. As a result, this critical aspect of knowledge-based system development can be accomplished more easily.
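Two of the ingredients the paper describes, interpreting a raw patient value as a degree of membership and combining degrees in a fuzzy inference rule, can be sketched as follows. This is not CADIAG-IV or its knowledge base; the membership thresholds and the rule are invented for illustration.

```python
# Small sketch of fuzzy interpretation of patient data plus one fuzzy rule.
def fever_degree(temp_celsius):
    """Piecewise-linear membership for 'fever' (illustrative thresholds)."""
    if temp_celsius <= 37.0:
        return 0.0
    if temp_celsius >= 38.5:
        return 1.0
    return (temp_celsius - 37.0) / 1.5

def elevated_crp_degree(crp_mg_l):
    """Membership for 'elevated CRP' (illustrative thresholds)."""
    if crp_mg_l <= 5.0:
        return 0.0
    if crp_mg_l >= 50.0:
        return 1.0
    return (crp_mg_l - 5.0) / 45.0

# fuzzy rule: IF fever AND elevated CRP THEN suspicion of bacterial infection
# (min is used here as the AND-operator, a common choice in fuzzy inference)
patient = {"temp": 38.2, "crp": 30.0}
suspicion = min(fever_degree(patient["temp"]), elevated_crp_degree(patient["crp"]))
print("degree of suspicion:", round(suspicion, 2))
```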
Research and Application of Knowledge Resources Network for Product Innovation
Li, Chuan; Li, Wen-qiang; Li, Yan; Na, Hui-zhen; Shi, Qian
2015-01-01
In order to enhance the knowledge service capabilities of a product innovation design service platform, a method is proposed for acquiring knowledge resources that support product innovation from the Internet and for providing active knowledge push. Through ontology-based knowledge modeling for product innovation, an integrated architecture of the knowledge resources network is put forward. The technology for acquiring network knowledge resources based on a focused crawler and web services is studied. Active knowledge push is provided to users through user behavior analysis and knowledge evaluation in order to improve users' enthusiasm for participating in the platform. Finally, an application example is presented to demonstrate the effectiveness of the method. PMID:25884031
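The acquisition step can be sketched as a focused crawl that only expands links found on pages relevant to the innovation topic. The code below is a bare-bones illustration, not the authors' crawler; the seed URL, keyword list, and relevance threshold are placeholders.

```python
# Bare-bones focused crawler sketch: follow links only from relevant pages.
import re
import urllib.request
from collections import deque
from html.parser import HTMLParser

KEYWORDS = {"innovation", "design", "patent"}          # hypothetical topic terms

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs
                              if k == "href" and v and v.startswith("http"))

def relevance(text):
    words = re.findall(r"[a-z]+", text.lower())
    return sum(w in KEYWORDS for w in words) / (len(words) or 1)

def focused_crawl(seed, max_pages=5, threshold=0.01):
    queue, seen, relevant = deque([seed]), set(), []
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue
        if relevance(html) >= threshold:                 # only expand relevant pages
            relevant.append(url)
            parser = LinkParser()
            parser.feed(html)
            queue.extend(parser.links)
    return relevant

print(focused_crawl("https://example.org/"))             # placeholder seed URL
```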
Novel nonlinear knowledge-based mean force potentials based on machine learning.
Dong, Qiwen; Zhou, Shuigeng
2011-01-01
The prediction of 3D structures of proteins from amino acid sequences is one of the most challenging problems in molecular biology. An essential task for solving this problem with coarse-grained models is to deduce effective interaction potentials. The development and evaluation of new energy functions is critical to accurately modeling the properties of biological macromolecules. Knowledge-based mean force potentials are derived from statistical analysis of proteins of known structures. Current knowledge-based potentials are almost in the form of weighted linear sum of interaction pairs. In this study, a class of novel nonlinear knowledge-based mean force potentials is presented. The potential parameters are obtained by nonlinear classifiers, instead of relative frequencies of interaction pairs against a reference state or linear classifiers. The support vector machine is used to derive the potential parameters on data sets that contain both native structures and decoy structures. Five knowledge-based mean force Boltzmann-based or linear potentials are introduced and their corresponding nonlinear potentials are implemented. They are the DIH potential (single-body residue-level Boltzmann-based potential), the DFIRE-SCM potential (two-body residue-level Boltzmann-based potential), the FS potential (two-body atom-level Boltzmann-based potential), the HR potential (two-body residue-level linear potential), and the T32S3 potential (two-body atom-level linear potential). Experiments are performed on well-established decoy sets, including the LKF data set, the CASP7 data set, and the Decoys “R”Us data set. The evaluation metrics include the energy Z score and the ability of each potential to discriminate native structures from a set of decoy structures. Experimental results show that all nonlinear potentials significantly outperform the corresponding Boltzmann-based or linear potentials, and the proposed discriminative framework is effective in developing knowledge-based mean force potentials. The nonlinear potentials can be widely used for ab initio protein structure prediction, model quality assessment, protein docking, and other challenging problems in computational biology.
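The discriminative framework can be caricatured in a few lines: represent each structure by counts of residue-contact types, train a nonlinear classifier to separate natives from decoys, and read the negated decision value as an energy. The sketch below uses synthetic data and a generic RBF-kernel SVM; it is a schematic of the idea, not a reimplementation of the published potentials.

```python
# Schematic of a nonlinear knowledge-based potential learned by an SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pair_types = 210                          # e.g. 20x20 residue pairs, order-independent
natives = rng.poisson(lam=4.0, size=(40, n_pair_types)).astype(float)
decoys  = rng.poisson(lam=3.5, size=(40, n_pair_types)).astype(float)  # synthetic stand-ins

X = np.vstack([natives, decoys])
y = np.array([1] * 40 + [0] * 40)           # 1 = native, 0 = decoy
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

def pseudo_energy(contact_counts):
    """Lower is better: the negated decision value acts as an energy score."""
    return -float(svm.decision_function(contact_counts.reshape(1, -1))[0])

print("native-like energy:", round(pseudo_energy(natives[0]), 2))
print("decoy-like energy: ", round(pseudo_energy(decoys[0]), 2))
```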
Khan, Taimoor; De, Asok
2014-01-01
In the last decade, artificial neural networks have become very popular techniques for computing different performance parameters of microstrip antennas. The proposed work illustrates a knowledge-based neural networks model for predicting the appropriate shape and accurate size of the slot introduced on the radiating patch for achieving desired level of resonance, gain, directivity, antenna efficiency, and radiation efficiency for dual-frequency operation. By incorporating prior knowledge in neural model, the number of required training patterns is drastically reduced. Further, the neural model incorporated with prior knowledge can be used for predicting response in extrapolation region beyond the training patterns region. For validation, a prototype is also fabricated and its performance parameters are measured. A very good agreement is attained between measured, simulated, and predicted results.
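One common way to incorporate prior knowledge, feeding the output of a coarse analytic estimate to the network alongside the raw inputs so that the network only learns the correction, can be sketched as below. The closed-form "prior" and the training data are placeholders, not antenna-accurate models or the authors' dataset.

```python
# Sketch of a knowledge-based neural model: the prior estimate is an input feature.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def coarse_formula(length_mm):
    """Hypothetical closed-form prior: resonance roughly inverse to length."""
    return 150.0 / length_mm                      # "GHz", illustrative only

lengths = rng.uniform(10.0, 40.0, size=200)
true_resonance = 150.0 / lengths * (1.0 + 0.05 * np.sin(lengths))  # synthetic ground truth

X = np.column_stack([lengths, coarse_formula(lengths)])   # geometry + prior knowledge
y = true_resonance

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)
test_len = np.array([25.0])
print("prior formula      :", round(coarse_formula(25.0), 3))
print("knowledge-based NN :", round(float(model.predict(
    np.column_stack([test_len, coarse_formula(test_len)]))[0]), 3))
```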
Somekh, Judith; Choder, Mordechai; Dori, Dov
2012-01-01
We propose a Conceptual Model-based Systems Biology framework for qualitative modeling, executing, and eliciting knowledge gaps in molecular biology systems. The framework is an adaptation of Object-Process Methodology (OPM), a graphical and textual executable modeling language. OPM enables concurrent representation of the system's structure—the objects that comprise the system, and behavior—how processes transform objects over time. Applying a top-down approach of recursively zooming into processes, we model a case in point—the mRNA transcription cycle. Starting with this high level cell function, we model increasingly detailed processes along with participating objects. Our modeling approach is capable of modeling molecular processes such as complex formation, localization and trafficking, molecular binding, enzymatic stimulation, and environmental intervention. At the lowest level, similar to the Gene Ontology, all biological processes boil down to three basic molecular functions: catalysis, binding/dissociation, and transporting. During modeling and execution of the mRNA transcription model, we discovered knowledge gaps, which we present and classify into various types. We also show how model execution enhances a coherent model construction. Identification and pinpointing knowledge gaps is an important feature of the framework, as it suggests where research should focus and whether conjectures about uncertain mechanisms fit into the already verified model. PMID:23308089
Knowledge network model of the energy consumption in discrete manufacturing system
NASA Astrophysics Data System (ADS)
Xu, Binzi; Wang, Yan; Ji, Zhicheng
2017-07-01
Discrete manufacturing system generates a large amount of data and information because of the development of information technology. Hence, a management mechanism is urgently required. In order to incorporate knowledge generated from manufacturing data and production experience, a knowledge network model of the energy consumption in the discrete manufacturing system was put forward based on knowledge network theory and multi-granularity modular ontology technology. This model could provide a standard representation for concepts, terms and their relationships, which could be understood by both human and computer. Besides, the formal description of energy consumption knowledge elements (ECKEs) in the knowledge network was also given. Finally, an application example was used to verify the feasibility of the proposed method.
NASA Astrophysics Data System (ADS)
Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.
2015-10-01
The article presents research results on knowledge base development for an intelligent information system for enterprise bankruptcy risk assessment. The process of developing the knowledge base is analyzed; the main process stages, some problems, and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprise financial accounts. The basis of this connectionist model is a three-layer perceptron trained with the backpropagation of error algorithm. The knowledge base of the intelligent information system consists of processed information and the processing method represented as the connectionist model. The article presents the structure of the intelligent information system, the knowledge base, and the information processing algorithm for neural network training. The paper gives mean values of 10 indexes for industrial enterprises, with the help of which it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are reported for neural network testing on data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
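A minimal sketch of the described connectionist model is shown below: a three-layer perceptron trained by backpropagation on 10 financial indexes per enterprise. The index values, class labels, and network size are synthetic stand-ins, not the enterprise data or architecture from the study.

```python
# Sketch of a three-layer perceptron for bankruptcy risk on 10 financial indexes.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
n_firms, n_indexes = 200, 10                     # 10 financial ratios per enterprise

healthy  = rng.normal(loc=0.6, scale=0.15, size=(n_firms // 2, n_indexes))
troubled = rng.normal(loc=0.3, scale=0.15, size=(n_firms // 2, n_indexes))
X = np.vstack([healthy, troubled])
y = np.array([0] * (n_firms // 2) + [1] * (n_firms // 2))   # 1 = bankruptcy risk

mlp = MLPClassifier(hidden_layer_sizes=(8,),     # one hidden layer -> three-layer perceptron
                    solver="adam", max_iter=2000, random_state=0)
mlp.fit(X, y)
new_firm = rng.normal(loc=0.35, scale=0.1, size=(1, n_indexes))
print("estimated bankruptcy risk:", round(float(mlp.predict_proba(new_firm)[0, 1]), 2))
```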
Goossen, William T F
2014-07-01
This paper will present an overview of the developmental effort in harmonizing clinical knowledge modeling using the Detailed Clinical Models (DCMs), and will explain how it can contribute to the preservation of Electronic Health Records (EHR) data. Clinical knowledge modeling is vital for the management and preservation of EHR and data. Such modeling provides common data elements and terminology binding with the intention of capturing and managing clinical information over time and location independent from technology. Any EHR data exchange without an agreed clinical knowledge modeling will potentially result in loss of information. Many attempts exist from the past to model clinical knowledge for the benefits of semantic interoperability using standardized data representation and common terminologies. The objective of each project is similar with respect to consistent representation of clinical data, using standardized terminologies, and an overall logical approach. However, the conceptual, logical, and the technical expressions are quite different in one clinical knowledge modeling approach versus another. There currently are synergies under the Clinical Information Modeling Initiative (CIMI) in order to create a harmonized reference model for clinical knowledge models. The goal for the CIMI is to create a reference model and formalisms based on for instance the DCM (ISO/TS 13972), among other work. A global repository of DCMs may potentially be established in the future.
Anselmi, Pasquale; Stefanutti, Luca; de Chiusole, Debora; Robusto, Egidio
2017-11-01
The gain-loss model (GaLoM) is a formal model for assessing knowledge and learning. In its original formulation, the GaLoM assumes independence among the skills. Such an assumption is not reasonable in several domains, in which some preliminary knowledge is the foundation for other knowledge. This paper presents an extension of the GaLoM to the case in which the skills are not independent, and the dependence relation among them is described by a well-graded competence space. The probability of mastering skill s at the pretest is conditional on the presence of all skills on which s depends. The probabilities of gaining or losing skill s when moving from pretest to posttest are conditional on the mastery of s at the pretest, and on the presence at the posttest of all skills on which s depends. Two formulations of the model are presented, in which the learning path is allowed to change from pretest to posttest or not. A simulation study shows that models based on the true competence space obtain a better fit than models based on false competence spaces, and are also characterized by a higher assessment accuracy. An empirical application shows that models based on pedagogically sound assumptions about the dependencies among the skills obtain a better fit than models assuming independence among the skills. © 2017 The British Psychological Society.
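The dependence structure can be made concrete with a tiny simulation in which a skill can only be mastered or gained when its prerequisite skills are present. The skills, the dependency, and all probabilities below are invented for illustration and do not come from the paper.

```python
# Toy simulation of the extended gain-loss idea with one prerequisite relation.
import random

rng = random.Random(0)
prerequisites = {"fractions": [], "proportions": ["fractions"]}
p_pretest = {"fractions": 0.6, "proportions": 0.5}   # conditional on prerequisites present
p_gain    = {"fractions": 0.3, "proportions": 0.4}
p_loss    = {"fractions": 0.05, "proportions": 0.1}

def simulate_learner():
    pre, post = {}, {}
    for skill in ["fractions", "proportions"]:               # topological order
        ok = all(pre[req] for req in prerequisites[skill])
        pre[skill] = ok and rng.random() < p_pretest[skill]
    for skill in ["fractions", "proportions"]:
        ok = all(post.get(req, False) for req in prerequisites[skill])
        if pre[skill]:
            post[skill] = ok and rng.random() > p_loss[skill]   # retained unless lost
        else:
            post[skill] = ok and rng.random() < p_gain[skill]   # gained only if allowed
    return pre, post

for _ in range(3):
    print(simulate_learner())
```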
A new approach to the effect of sound on vortex dynamics
NASA Technical Reports Server (NTRS)
Lund, Fernando; Zabusky, Norman J.
1987-01-01
Analytical results are presented on the effect of acoustic radiation on three-dimensional vortex motions in a homogeneous, slightly compressible, inviscid fluid. The flow is considered as linear and irrotational everywhere except inside a very thin cylindrical core region around the vortex filament. In the outside region, a velocity potential is introduced that must be multivalued, and it is shown how to compute this scalar potential if the motion of the vortex filament is prescribed. To find the motion of this singularity in an external potential flow, a variational principle involving a volume integral that must exclude the singular region is considered. A functional of the external potential and vortex filament position is obtained whose extrema give equations to determine the sought-after evolution. Thus, a generalization of the Biot-Savart law to flows with constant sound speed at low Mach number is obtained.
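For context, the classical incompressible result that the paper generalizes is the Biot-Savart law for a closed vortex filament of circulation Γ, which gives the induced velocity at a point x as the following standard expression (reproduced here as background, not from the paper):

```latex
\mathbf{u}(\mathbf{x}) \;=\; \frac{\Gamma}{4\pi}\oint_{\mathcal{C}}
\frac{d\boldsymbol{\ell}' \times \left(\mathbf{x}-\mathbf{x}'\right)}
{\lvert \mathbf{x}-\mathbf{x}' \rvert^{3}}
```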
Extension of the root-locus method to a certain class of fractional-order systems.
Merrikh-Bayat, Farshad; Afshar, Mahdi; Karimi-Ghartemani, Masoud
2009-01-01
In this paper, the well-known root-locus method is developed for the special subset of linear time-invariant systems commonly known as fractional-order systems. Transfer functions of these systems are rational functions with polynomials of rational powers of the Laplace variable s. Such systems are defined on a Riemann surface because of their multi-valued nature. A set of rules for plotting the root loci on the first Riemann sheet is presented. The important features of the classical root-locus method such as asymptotes, roots condition on the real axis and breakaway points are extended to the fractional case. It is also shown that the proposed method can assess the closed-loop stability of fractional-order systems in the presence of a varying gain in the loop. Moreover, the effect of perturbation on the root loci is discussed. Three illustrative examples are presented to confirm the effectiveness of the proposed algorithm.
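As an illustration of the class of systems considered (an assumed example, not one of the paper's three), a fractional-order open-loop transfer function with rational powers of s, together with the closed-loop characteristic equation whose roots are traced on the first Riemann sheet as the gain K varies, might look like this:

```latex
% Assumed fractional-order open-loop transfer function and the
% closed-loop characteristic equation on the first Riemann sheet
\[
G(s) = \frac{1}{s^{3/2} + 2\,s^{1/2} + 1},
\qquad
1 + K\,G(s) = 0,
\qquad
-\pi < \arg(s) \le \pi .
\]
```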
Blocked Force and Loading Calculations for LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the performance of LaRC THUNDER actuators under load and under blocked conditions. The problem is treated with the von Karman non-linear analysis combined with a simple Rayleigh-Ritz calculation. From this, shape and displacement under combined load and voltage are calculated. A method is found to calculate the blocked force vs voltage and the spring force vs distance. It is found that under certain conditions, the blocked force and displacement are almost linear with voltage. It is also found that the spring force is multivalued and has at least one bifurcation point. This bifurcation point is where the device collapses under load and locks to a different bending solution. This occurs at a particular critical load. It is shown that this other bending solution has a reduced amplitude that is proportional to the original amplitude times the square of the aspect ratio.
Non-Market Values in a Cost-Benefit World: Evidence from a Choice Experiment.
Eppink, Florian V; Winden, Matthew; Wright, Will C C; Greenhalgh, Suzie
2016-01-01
In support of natural resource and ecosystem service policy, monetary value estimates are often presented to decision makers along with other types of information. There is some evidence that, presented with such 'mixed' information, people prioritise monetary over non-monetary information. We conduct a discrete choice experiment among New Zealand decision makers in which we manipulate the information presented to participants. We find that providing explicit monetary information strengthens the pursuit of economic benefits as well as the avoidance of environmental damage. Cultural impacts, of which we provided only qualitative descriptions, did not affect respondents' choices. Our study provides further evidence that concerns regarding the use of monetary information in decisions with complex, multi-value impacts are valid. Further research is needed to validate our results and find ways to reduce any bias in monetary and non-market information.
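Discrete choice experiments of this kind are conventionally analysed with a conditional (multinomial) logit model; the standard choice probability is shown below for orientation, though the study's exact specification may differ:

```latex
% Conditional-logit probability of choosing alternative i from choice set C,
% with observable utility V_i = \beta^{\top} x_i
\[
P(i \mid C) = \frac{\exp\!\left(\beta^{\top} x_i\right)}
                   {\sum_{j \in C} \exp\!\left(\beta^{\top} x_j\right)}
\]
```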
Integrated logic circuits using single-atom transistors
Mol, J. A.; Verduijn, J.; Levine, R. D.; Remacle, F.
2011-01-01
Scaling down the size of computing circuits is about to reach the limitations imposed by the discrete atomic structure of matter. Reducing the power requirements and thereby dissipation of integrated circuits is also essential. New paradigms are needed to sustain the rate of progress that society has become used to. Single-atom transistors, SATs, cascaded in a circuit are proposed as a promising route that is compatible with existing technology. We demonstrate the use of quantum degrees of freedom to perform logic operations in a complementary-metal–oxide–semiconductor device. Each SAT performs multilevel logic by electrically addressing the electronic states of a dopant atom. A single electron transistor decodes the physical multivalued output into the conventional binary output. A robust scalable circuit of two concatenated full adders is reported, where by utilizing charge and quantum degrees of freedom, the functionality of the transistor is pushed far beyond that of a simple switch. PMID:21808050
Background Knowledge in Learning-Based Relation Extraction
ERIC Educational Resources Information Center
Do, Quang Xuan
2012-01-01
In this thesis, we study the importance of background knowledge in relation extraction systems. We not only demonstrate the benefits of leveraging background knowledge to improve the systems' performance but also propose a principled framework that allows one to effectively incorporate knowledge into statistical machine learning models for…
Mathematical Learning Models that Depend on Prior Knowledge and Instructional Strategies
ERIC Educational Resources Information Center
Pritchard, David E.; Lee, Young-Jin; Bao, Lei
2008-01-01
We present mathematical learning models--predictions of student's knowledge vs amount of instruction--that are based on assumptions motivated by various theories of learning: tabula rasa, constructivist, and tutoring. These models predict the improvement (on the post-test) as a function of the pretest score due to intervening instruction and also…
NASA Astrophysics Data System (ADS)
Salloum, Sara
2017-06-01
This conceptual paper aims to characterize science teachers' practical knowledge utilizing a virtue-based theory of knowledge and the Aristotelian notion of phronesis/practical wisdom. The article argues that a greater understanding of the concept of phronesis and its relevance to science education would enrich our understandings of teacher knowledge, its development, and consequently models of teacher education. Views of teacher knowledge presented in this paper are informed by philosophical literature that questions normative views of knowledge and argues for a virtue-based epistemology rather than a belief-based one. The paper first outlines general features of phronesis/practical wisdom. Later, a virtue-based view of knowledge is described. A virtue-based view binds knowledge with moral concepts and suggests that knowledge development is motivated by intellectual virtues such as intellectual sobriety, perseverance, fairness, and humility. A virtue-based theory of knowledge gives prominence to the virtue of phronesis/practical wisdom, whose primary function is to mediate among virtues and theoretical knowledge into a line of action that serves human goods. The role of phronesis and its relevance to teaching science are explained accordingly. I also discuss differences among various characterizations of practical knowledge in science education and a virtue-based characterization. Finally, implications and further questions for teacher education are presented.
Ahlberg, Ernst; Amberg, Alexander; Beilke, Lisa D; Bower, David; Cross, Kevin P; Custer, Laura; Ford, Kevin A; Van Gompel, Jacky; Harvey, James; Honma, Masamitsu; Jolly, Robert; Joossens, Elisabeth; Kemper, Raymond A; Kenyon, Michelle; Kruhlak, Naomi; Kuhnke, Lara; Leavitt, Penny; Naven, Russell; Neilan, Claire; Quigley, Donald P; Shuey, Dana; Spirkl, Hans-Peter; Stavitskaya, Lidiya; Teasdale, Andrew; White, Angela; Wichard, Joerg; Zwickl, Craig; Myatt, Glenn J
2016-06-01
Statistical-based and expert rule-based models built using public domain mutagenicity knowledge and data are routinely used for computational (Q)SAR assessments of pharmaceutical impurities in line with the approach recommended in the ICH M7 guideline. Knowledge from proprietary corporate mutagenicity databases could be used to increase the predictive performance for selected chemical classes as well as expand the applicability domain of these (Q)SAR models. This paper outlines a mechanism for sharing knowledge without the release of proprietary data. Primary aromatic amine mutagenicity was selected as a case study because this chemical class is often encountered in pharmaceutical impurity analysis and mutagenicity of aromatic amines is currently difficult to predict. As part of this analysis, a series of aromatic amine substructures were defined and the number of mutagenic and non-mutagenic examples for each chemical substructure calculated across a series of public and proprietary mutagenicity databases. This information was pooled across all sources to identify structural classes that activate or deactivate aromatic amine mutagenicity. This structure activity knowledge, in combination with newly released primary aromatic amine data, was incorporated into Leadscope's expert rule-based and statistical-based (Q)SAR models where increased predictive performance was demonstrated. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Kim, Jonnathan H.
1995-01-01
Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor and time intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).
NASA Astrophysics Data System (ADS)
Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John
2005-04-01
To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes: knowledge acquisition and implementation, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge, based on the historical inflow data with a design objective function and on the operating rule curves, respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and then to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search for optimal input-output patterns, (2) the FRB can extract the knowledge from the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be more intelligent for reservoir operation if more information (or knowledge) is involved.
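A minimal sketch of the first step described above, with a hypothetical objective function, storage bounds, and inflow series (none taken from the paper): a genetic algorithm searches for release decisions that minimise a design objective, producing the kind of optimal input-output patterns later used to train the inference system.

```python
# Toy GA search for reservoir releases; objective, bounds and data are assumed.
import random

INFLOW = [120.0, 80.0, 60.0, 150.0]   # toy per-period inflows (arbitrary units)
TARGET = 100.0                         # toy demand per period

def objective(releases):
    """Penalise squared deviation from demand plus storage-bound violations."""
    storage, penalty = 200.0, 0.0
    for q_in, r in zip(INFLOW, releases):
        storage += q_in - r
        penalty += (r - TARGET) ** 2
        if storage < 0 or storage > 400:   # toy storage bounds
            penalty += 1e4
    return penalty

def evolve(pop_size=30, generations=200, mutation=10.0):
    pop = [[random.uniform(0, 200) for _ in INFLOW] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]          # keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(INFLOW))
            child = a[:cut] + b[cut:]           # one-point crossover
            child = [max(0.0, g + random.gauss(0, mutation)) for g in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=objective)

print(evolve())   # releases that approximately track the toy demand
```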
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometric models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search-based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur
2012-01-01
This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions.
ProbOnto: ontology and knowledge base of probability distributions.
Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala
2016-09-01
Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor any database allowing programmatic access. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability: http://probonto.org. Contact: mjswat@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
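An example of the kind of re-parameterization formula such a resource can record (a standard identity shown for illustration, not ProbOnto's own notation): the normal distribution expressed via the standard deviation or via the precision.

```latex
% Normal density parameterised by standard deviation, and the precision
% re-parameterisation tau = 1/sigma^2
\[
\mathcal{N}(x \mid \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}
\exp\!\left(-\frac{(x-\mu)^{2}}{2\sigma^{2}}\right),
\qquad
\tau = \frac{1}{\sigma^{2}} .
\]
```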
Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management
ERIC Educational Resources Information Center
Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez
2010-01-01
Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…
New Proposals for Generating and Exploiting Solution-Oriented Knowledge
ERIC Educational Resources Information Center
Gredig, Daniel; Sommerfeld, Peter
2008-01-01
The claim that professional social work should be based on scientific knowledge is many decades old with knowledge transfer usually moving in the direction from science to practice. The authors critique this model of knowledge transfer and support a hybrid one that places more of an emphasis on professional knowledge and action occurring in the…
Knowledge-based segmentation and feature analysis of hand and wrist radiographs
NASA Astrophysics Data System (ADS)
Efford, Nicholas D.
1993-07-01
The segmentation of hand and wrist radiographs for applications such as skeletal maturity assessment is best achieved by model-driven approaches incorporating anatomical knowledge. The reasons for this are discussed, and a particular frame-based or 'blackboard' strategy for the simultaneous segmentation of the hand and estimation of bone age via the TW2 method is described. The new approach is structured for optimum robustness and computational efficiency: features of interest are detected and analyzed in order of their size and prominence in the image, the largest and most distinctive being dealt with first, and the evidence generated by feature analysis is used to update a model of hand anatomy and hence guide later stages of the segmentation. Closed bone boundaries are formed by a hybrid technique combining knowledge-based, one-dimensional edge detection with model-assisted heuristic tree searching.
Knowledge-acquisition tools for medical knowledge-based systems.
Lanzola, G; Quaglini, S; Stefanelli, M
1995-03-01
Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required of both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European Project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge level counterparts are maintained. The KA tools we developed are illustrated with examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.
ERIC Educational Resources Information Center
Briand-Lamarche, Mélodie; Pinard, Renée; Thériault, Pascale; Dagenais, Christian
2016-01-01
To encourage the use of research-based information (RBI) in education in Quebec, the "Centre de transfert pour la réussite educative du Québec" CTREQ and the RENARD team, a knowledge transfer research team, developed the Competency Model for Knowledge Translation to Support Educational Achievement among Quebec Youth. They then developed…
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis-process knowledge and object knowledge. Then, the integrated knowledge model, based on an object-oriented method and a rule-based method, is described. The integrated reasoning process, based on CBR (case-based reasoning) and rule-based reasoning, is presented. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool's column is presented to demonstrate the validity of the system.
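A minimal sketch of the integrated reasoning described above, with hypothetical case features and rules (not taken from the paper): case-based retrieval of the most similar past FEA pre-processing case, refined by a simple rule.

```python
# Toy case base and similarity measure; feature names and values are assumed.
CASES = [  # (case features, recommended mesh size in mm)
    ({"part": "column", "length_mm": 1200, "load_kN": 50}, 20.0),
    ({"part": "column", "length_mm": 800,  "load_kN": 20}, 15.0),
    ({"part": "bed",    "length_mm": 2000, "load_kN": 80}, 30.0),
]

def similarity(query, case):
    """Crude similarity: exact match on part type, scaled numeric distances."""
    s = 1.0 if query["part"] == case["part"] else 0.0
    s -= abs(query["length_mm"] - case["length_mm"]) / 2000.0
    s -= abs(query["load_kN"] - case["load_kN"]) / 100.0
    return s

def recommend(query):
    # Case-based retrieval: pick the most similar stored case.
    best_case, mesh = max(CASES, key=lambda c: similarity(query, c[0]))
    # Rule-based refinement layered on top of the retrieved case.
    if query["load_kN"] > 60:          # heavy load -> finer mesh
        mesh *= 0.75
    return best_case, mesh

print(recommend({"part": "column", "length_mm": 1000, "load_kN": 70}))
```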
NASA Astrophysics Data System (ADS)
Bamberger, Yael M.; Davis, Elizabeth A.
2013-01-01
This paper focuses on students' ability to transfer modelling performances across content areas, taking into consideration their improvement of content knowledge as a result of a model-based instruction. Sixty-five sixth grade students of one science teacher in an urban public school in the Midwestern USA engaged in scientific modelling practices that were incorporated into a curriculum focused on the nature of matter. Concept-process models were embedded in the curriculum, as well as emphasis on meta-modelling knowledge and modelling practices. Pre-post test items that required drawing scientific models of smell, evaporation, and friction were analysed. The level of content understanding was coded and scored, as were the following elements of modelling performance: explanation, comparativeness, abstraction, and labelling. Paired t-tests were conducted to analyse differences in students' pre-post tests scores on content knowledge and on each element of the modelling performances. These are described in terms of the amount of transfer. Students significantly improved in their content knowledge for the smell and the evaporation models, but not for the friction model, which was expected as that topic was not taught during the instruction. However, students significantly improved in some of their modelling performances for all the three models. This improvement serves as evidence that the model-based instruction can help students acquire modelling practices that they can apply in a new content area.
Systematic analysis of signaling pathways using an integrative environment.
Visvanathan, Mahesh; Breit, Marc; Pfeifer, Bernhard; Baumgartner, Christian; Modre-Osprian, Robert; Tilg, Bernhard
2007-01-01
Understanding the biological processes of signaling pathways as a whole system requires an integrative software environment that has comprehensive capabilities. The environment should include tools for pathway design, visualization and simulation, together with a knowledge base concerning signaling pathways, as one. In this paper we introduce a new integrative environment for the systematic analysis of signaling pathways. This system includes environments for pathway design, visualization and simulation, and a knowledge base that combines biological and modeling information concerning signaling pathways, providing a basic understanding of the biological system, its structure and functioning. The system is designed with a client-server architecture. It contains a pathway designing environment and a simulation environment as upper layers, with a relational knowledge base as the underlying layer. The TNFa-mediated NF-kB signal transduction pathway model was designed and tested using our integrative framework; it was also useful for defining the structure of the knowledge base. Sensitivity analysis of this specific pathway was performed, providing simulation data. The model was then extended, showing promising initial results. The proposed system offers a holistic view of pathways containing biological and modeling data. It will help us perform biological interpretation of the simulation results and thus contribute to a better understanding of the biological system for drug identification.
The KATE shell: An implementation of model-based control, monitor and diagnosis
NASA Technical Reports Server (NTRS)
Cornell, Matthew
1987-01-01
The conventional control and monitor software currently used by the Space Center for Space Shuttle processing has many limitations, such as high maintenance costs, limited diagnostic capabilities and limited simulation support. These limitations have led to the development of a knowledge-based (or model-based) shell to generically control and monitor electro-mechanical systems. The knowledge base describes the system's structure and function and is used by a software shell to do real-time constraint checking, low-level control of components, diagnosis of detected faults, sensor validation, automatic generation of schematic diagrams and automatic recovery from failures. This approach is more versatile and more powerful than the conventional hard-coded approach and offers many advantages over it, although, for systems which require high-speed reaction times or are not well understood, knowledge-based control and monitor systems may not be appropriate.
Knowledge acquisition and learning process description in context of e-learning
NASA Astrophysics Data System (ADS)
Kiselev, B. G.; Yakutenko, V. A.; Yuriev, M. A.
2017-01-01
This paper investigates the problem of the design of e-learning and MOOC systems. It describes instructional design-based approaches to e-learning systems design: IMS Learning Design, MISA and TELOS. To address this problem we present the Knowledge Field of Educational Environment with Competence boundary conditions, an instructional engineering method for self-learning systems design. It is based on the simplified TELOS approach and enables users to create their individual learning paths by choosing prerequisite and target competencies. The paper provides the ontology model for the described instructional engineering method, real-life use cases and the classification of the presented model. The ontology model consists of 13 classes and 15 properties. Some of them are inherited from the Knowledge Field of Educational Environment and some are new, describing competence boundary conditions and knowledge validation objects. The ontology model uses logical constraints and is described using the OWL 2 standard. To give TELOS users a better understanding of our approach, we list the mapping between TELOS and KFEEC.
Organizational Learning through Transformational Leadership
ERIC Educational Resources Information Center
Imran, Muhammad Kashif; Ilyas, Muhammad; Aslam, Usman; Ubaid-Ur-Rahman
2016-01-01
Purpose: The transformation of firms from resource-based-view to knowledge-based-view has extended the importance of organizational learning. Thus, this study aims to develop an organizational learning model through transformational leadership with indirect effect of knowledge management process capability and interactive role of…
Knowledge-Based Information Retrieval.
ERIC Educational Resources Information Center
Ford, Nigel
1991-01-01
Discussion of information retrieval focuses on theoretical and empirical advances in knowledge-based information retrieval. Topics discussed include the use of natural language for queries; the use of expert systems; intelligent tutoring systems; user modeling; the need for evaluation of system effectiveness; and examples of systems, including…
Knowledge Management in healthcare libraries: the current picture.
Hopkins, Emily
2017-06-01
Knowledge management has seen something of a resurgence in attention amongst health librarians recently. Of course it has never ceased to exist, but now many library staff are becoming more involved in organisational knowledge management and positioning themselves as key players in the sphere. No single model of knowledge management predominates; rather, organisations adopt approaches that best fit their size, structure and culture, blending evidence-based practice and knowledge sharing. Whatever it is called and whatever models are used, it is clear that for librarians and information professionals, putting knowledge and evidence into practice, sharing knowledge well and capturing it effectively are what we will continue to do. © 2017 Health Libraries Group.
A Study of the Efficacy of Project-Based Learning Integrated with Computer-Based Simulation--STELLA
ERIC Educational Resources Information Center
Eskrootchi, Rogheyeh; Oskrochi, G. Reza
2010-01-01
Incorporating computer-simulation modelling into project-based learning may be effective but requires careful planning and implementation. Teachers, especially, need pedagogical content knowledge which refers to knowledge about how students learn from materials infused with technology. This study suggests that students learn best by actively…
ERIC Educational Resources Information Center
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2017-01-01
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in…
a Study on Satellite Diagnostic Expert Systems Using Case-Based Approach
NASA Astrophysics Data System (ADS)
Park, Young-Tack; Kim, Jae-Hoon; Park, Hyun-Soo
1997-06-01
Much research is ongoing into monitoring and diagnosing diverse malfunctions of satellite systems as the complexity and number of satellites increase. Currently, much of the monitoring and diagnosis is carried out by human experts, but there is a need to automate their routine work. Hence, it is worthwhile to study expert systems that can perform this routine work automatically, thereby allowing human experts to devote their expertise to more critical and important areas of monitoring and diagnosis. In this paper, we employ artificial intelligence techniques to model human experts' knowledge and to reason over the constructed knowledge. In particular, case-based approaches are used to construct a knowledge base that models human expert capabilities using previous typical exemplars. We have designed and implemented a prototype case-based system for diagnosing satellite malfunctions using cases. Our system remembers typical failure cases and diagnoses a current malfunction by indexing the case base. Diverse methods are used to build a more user-friendly interface that allows human experts to build a knowledge base easily.
Baumbusch, Jennifer L; Kirkham, Sheryl Reimer; Khan, Koushambhi Basu; McDonald, Heather; Semeniuk, Pat; Tan, Elsie; Anderson, Joan M
2008-04-01
There is an emerging discourse of knowledge translation that advocates a shift away from unidirectional research utilization and evidence-based practice models toward more interactive models of knowledge transfer. In this paper, we describe how our participatory approach to knowledge translation developed during an ongoing program of research concerning equitable care for diverse populations. At the core of our approach is a collaborative relationship between researchers and practitioners, which underpins the knowledge translation cycle, and occurs simultaneously with data collection/analysis/synthesis. We discuss lessons learned including: the complexities of translating knowledge within the political landscape of healthcare delivery, the need to negotiate the agendas of researchers and practitioners in a collaborative approach, and the kinds of resources needed to support this process.
Past and Future of Astronomy and SETI Cast in Maths
NASA Astrophysics Data System (ADS)
Maccone, C.
Assume that the history of Astronomy and SETI is the leading proof of the evolution of human knowledge on Earth over the last 3000 years. Then, human knowledge has increased greatly, although not at a uniform pace. A mathematical description of how much human knowledge has increased, however, is difficult to achieve. In this paper, we cast a mathematical model of the evolution of human knowledge over the last three thousand years that seems to reflect reasonably well both what is known from the past and what might be extrapolated into the future. Our model is based on two seminal books, by Sagan and by Finney and Jones, and on the use of two cubic curves representing the evolution of Astronomy and of SETI, respectively. We conclude by extrapolating these curves into the future and reach the conclusion that the "Star Trek" age of humankind might possibly begin by the end of this century.
C-Language Integrated Production System, Version 6.0
NASA Technical Reports Server (NTRS)
Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris
1995-01-01
C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides cohesive software tool for handling wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify set of actions performed in given situation. Object-oriented programming: modeling of complex systems comprised of modular components easily reused to model other systems or create new components. Procedural-programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. Version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.
Knowledge diffusion in the collaboration hypernetwork
NASA Astrophysics Data System (ADS)
Yang, Guang-Yong; Hu, Zhao-Long; Liu, Jian-Guo
2015-02-01
As knowledge constitutes a primary productive force, it is important to understand the performance of knowledge diffusion. In this paper, we present a knowledge diffusion model based on the local-world non-uniform hypernetwork, which introduces a preferential diffusion mechanism and the knowledge absorptive capability α_j, where α_j is correlated with the hyperdegree d_H(j) of node j. At each time step, we randomly select a node i as the sender; a receiver node is selected from the set of nodes that the sender i has published with previously, with probability proportional to the number of papers they have published together. Applying the average knowledge stock V̄(t), the variance σ²(t) and the variance coefficient c(t) of the knowledge stock to measure the growth and diffusion of knowledge and the adequacy of knowledge diffusion, we performed three groups of comparative experiments to investigate how different network structures, hypernetwork sizes and knowledge evolution mechanisms, respectively, affect knowledge diffusion. Because the diffusion mechanisms based on the hypernetwork incorporate the hyperdegree of nodes, the hypernetwork is well suited to investigating the performance of knowledge diffusion. Therefore, the proposed model could be helpful for deeply understanding the process of knowledge diffusion in the collaboration hypernetwork.
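A minimal sketch of one diffusion step described above, with toy data and a hypothetical absorption update (the paper's exact update rule is not reproduced): a random sender passes knowledge to a co-author chosen with probability proportional to their jointly published papers, moderated by the receiver's absorptive capability.

```python
# Toy collaboration data; names, weights and the update rule are assumed.
import random

knowledge = {"A": 5.0, "B": 2.0, "C": 1.0}            # knowledge stocks
joint_papers = {("A", "B"): 3, ("A", "C"): 1, ("B", "C"): 2}
alpha = {"A": 0.6, "B": 0.4, "C": 0.3}                # absorptive capabilities

def coauthors(i):
    """Co-authors of node i with the number of papers published together."""
    out = {}
    for (a, b), w in joint_papers.items():
        if a == i:
            out[b] = w
        elif b == i:
            out[a] = w
    return out

def diffusion_step():
    i = random.choice(list(knowledge))                # random sender
    partners = coauthors(i)
    names, weights = zip(*partners.items())
    j = random.choices(names, weights=weights)[0]     # preferential selection
    if knowledge[i] > knowledge[j]:                   # hypothetical absorption rule
        knowledge[j] += alpha[j] * (knowledge[i] - knowledge[j])

for _ in range(100):
    diffusion_step()
print(knowledge)   # stocks drift toward the best-connected, most knowledgeable nodes
```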
The Knowledge Building Paradigm: A Model of Learning for Net Generation Students
ERIC Educational Resources Information Center
Philip, Donald
2005-01-01
In this article Donald Philip describes Knowledge Building, a pedagogy based on the way research organizations function. The global economy, Philip argues, is driving a shift from older, industrial models to the model of the business as a learning organization. The cognitive patterns of today's Net Generation students, formed by lifetime exposure…
Acquiring, Representing, and Evaluating a Competence Model of Diagnostic Strategy.
ERIC Educational Resources Information Center
Clancey, William J.
This paper describes NEOMYCIN, a computer program that models one physician's diagnostic reasoning within a limited area of medicine. NEOMYCIN's knowledge base and reasoning procedure constitute a model of how human knowledge is organized and how it is used in diagnosis. The hypothesis is tested that such a procedure can be used to simulate both…
Nursing knowledge: hints from the placebo effect.
Zanotti, Renzo; Chiffi, Daniele
2017-07-01
Nursing knowledge stems from a dynamic interplay between population-based scientific knowledge (the general) and specific clinical cases (the particular). We compared the 'cascade model of knowledge translation', also known as 'classical biomedical model' in clinical practice (in which knowledge gained at population level may be applied directly to a specific clinical context), with an emergentist model of knowledge translation. The structure and dynamics of nursing knowledge are outlined, adopting the distinction between epistemic and non-epistemic values. Then, a (moderately) emergentist approach to nursing knowledge is proposed, based on the assumption of a two-way flow from the general to the particular and vice versa. The case of the 'placebo effect' is analysed as an example of emergentist knowledge. The placebo effect is usually considered difficult to be explained within the classical biomedical model, and we underscore its importance in shaping nursing knowledge. In fact, nurses are primarily responsible for administering placebo in the clinical setting and have an essential role in promoting the placebo effect and reducing the nocebo effect. The beliefs responsible for the placebo effect are as follows: (1) interactive, because they depend on the relationship between patients and health care professionals; (2) situated, because they occur in a given clinical context related to certain rituals; and (3) grounded on higher order beliefs concerning what an individual thinks about the beliefs of others. It is essential to know the clinical context and to understand other people's beliefs to make sense of the placebo effect. The placebo effect only works when the (higher order) beliefs of doctors, nurses and patients interact in a given setting. Finally, we argue for a close relationship between placebo effect and nursing knowledge. © 2016 John Wiley & Sons Ltd.
Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter
2014-11-28
Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6 000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide html and web browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD and combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews about clinical trends and connections while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.
Iyama, Yuji; Nakaura, Takeshi; Kidoh, Masafumi; Oda, Seitaro; Utsunomiya, Daisuke; Sakaino, Naritsugu; Tokuyasu, Shinichi; Osakabe, Hirokazu; Harada, Kazunori; Yamashita, Yasuyuki
2016-11-01
The purpose of this study was to evaluate the noise and image quality of images reconstructed with knowledge-based iterative model reconstruction (knowledge-based IMR) in ultra-low-dose cardiac computed tomography (CT). We performed submillisievert radiation dose coronary CT angiography on 43 patients. We also performed a phantom study to evaluate the influence of object size using the automatic exposure control phantom. We reconstructed the clinical and phantom studies with filtered back projection (FBP), hybrid iterative reconstruction (hybrid IR), and knowledge-based IMR. We measured the effective dose for each patient and compared the CT number, image noise, and contrast noise ratio in the ascending aorta for each reconstruction technique. We compared the relationship between image noise and body mass index for the clinical study, and between image noise and object size for the phantom study. The mean effective dose was 0.98 ± 0.25 mSv. The image noise of knowledge-based IMR images was significantly lower than that of FBP and hybrid IR images (knowledge-based IMR: 19.4 ± 2.8; FBP: 126.7 ± 35.0; hybrid IR: 48.8 ± 12.8, respectively) (P < .01). The contrast noise ratio of knowledge-based IMR images was significantly higher than that of FBP and hybrid IR images (knowledge-based IMR: 29.1 ± 5.4; FBP: 4.6 ± 1.3; hybrid IR: 13.1 ± 3.5, respectively) (P < .01). There were moderate correlations between image noise and body mass index in FBP (r = 0.57, P < .01) and hybrid IR techniques (r = 0.42, P < .01); however, these correlations were weak in knowledge-based IMR (r = 0.27, P < .01). Compared to FBP and hybrid IR, knowledge-based IMR offers significant noise reduction and improvement in image quality in submillisievert radiation dose cardiac CT. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
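For reference, the contrast noise ratio reported in CT image-quality studies of this kind is conventionally computed as follows (general definition; the study's exact region-of-interest placement is described in the paper):

```latex
% Conventional contrast-to-noise ratio from mean ROI attenuations and background noise
\[
\mathrm{CNR} = \frac{\mu_{\text{aorta}} - \mu_{\text{background}}}{\sigma_{\text{background}}}
\]
```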
ERIC Educational Resources Information Center
Floryan, Mark
2013-01-01
This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…
Methodological Developments in Geophysical Assimilation Modeling
NASA Astrophysics Data System (ADS)
Christakos, George
2005-06-01
This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to investigate critical issues related to knowledge reliability, such as uncertainty due to model structure error (conceptual uncertainty).
Reducing a Knowledge-Base Search Space When Data Are Missing
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
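The combinatorial growth the software addresses can be checked directly; a few values of the 1 + 2^n scenario count, matching the 16-rule example above:

```python
# Naive scenario count for a knowledge base of n rules when data may be missing.
for n in (4, 8, 16, 32):
    print(n, 1 + 2**n)   # n = 16 -> 65537, as quoted above
```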
Tacit Beginnings Towards a Model of Scientific Thinking
NASA Astrophysics Data System (ADS)
Glass, Rory J.
2013-10-01
The purpose of this paper is to provide an examination of the role tacit knowledge plays in understanding, and to provide a model to make such knowledge identifiable. To do this I first consider the needs of society, the ubiquity of information in our world and the future demands of the science classroom. I propose the use of more implicit or tacit understandings as foundational elements for the development of student knowledge. To justify this proposition I consider a wide range of philosophical and psychological perspectives on knowledge. I then develop a Model of Scientific Knowledge, based in large part on a similar model created by Paul Ernest (Social constructivism as a philosophy of mathematics, SUNY Press, Albany, NY, 1998a; Situated cognition and the learning of mathematics, University of Oxford Department of Educational Studies, Oxford, 1998b). Finally, I consider the work that has been done by those in fields beyond education and the ways in which tacit knowledge can be used as a starting point for knowledge building.
Electronic health records (EHRs): supporting ASCO's vision of cancer care.
Yu, Peter; Artz, David; Warner, Jeremy
2014-01-01
ASCO's vision for cancer care in 2030 is built on the expanding importance of panomics and big data, and envisions enabling better health for patients with cancer by the rapid transformation of systems biology knowledge into cancer care advances. This vision will be heavily dependent on the use of health information technology for computational biology and clinical decision support systems (CDSS). Computational biology will allow us to construct models of cancer biology that encompass the complexity of cancer panomics data and provide us with better understanding of the mechanisms governing cancer behavior. The Agency for Healthcare Research and Quality promotes CDSS based on clinical practice guidelines, which are knowledge bases that grow too slowly to match the rate of panomic-derived knowledge. CDSS that are based on systems biology models will be more easily adaptable to rapid advancements and translational medicine. We describe the characteristics of health data representation, a model for representing molecular data that supports data extraction and use for panomic-based clinical research, and argue for CDSS that are based on systems biology and are algorithm-based.
ERIC Educational Resources Information Center
Krauskopf, Karsten; Zahn, Carmen; Hesse, Friedrich W.
2012-01-01
Web-based digital video tools enable learners to access video sources in constructive ways. To leverage these affordances teachers need to integrate their knowledge of a technology with their professional knowledge about teaching. We suggest that this is a cognitive process, which is strongly connected to a teacher's mental model of the tool's…
ERIC Educational Resources Information Center
Naito, Eisuke
This paper discusses knowledge management (KM) in relation to a shared cataloging system in Japanese university libraries. The first section describes the Japanese scene related to knowledge management and the working environment, including the SECI (Socialization, Externalization, Combination, Internalization) model, the context of knowledge, and…
ERIC Educational Resources Information Center
Stender, Anita; Brückmann, Maja; Neumann, Knut
2017-01-01
This study investigates the relationship between two different types of pedagogical content knowledge (PCK): the topic-specific professional knowledge (TSPK) and practical routines, so-called teaching scripts. Based on the Transformation Model of Lesson Planning, we assume that teaching scripts originate from a transformation of TSPK during lesson…
A Process-Based Knowledge Management System for Schools: A Case Study in Taiwan
ERIC Educational Resources Information Center
Lee, Chi-Lung; Lu, Hsi-Peng; Yang, Chyan; Hou, Huei-Tse
2010-01-01
Knowledge management systems, or KMSs, have been widely adopted in business organizations, yet little research exists on the actual integration of the knowledge management model and the application of KMSs in secondary schools. In the present study, the common difficulties and limitations regarding the implementation of knowledge management into…
Analysis of Knowledge-Sharing Evolutionary Game in University Teacher Team
ERIC Educational Resources Information Center
Huo, Mingkui
2013-01-01
The knowledge-sharing activity is a major drive force behind the progress and innovation of university teacher team. Based on the evolutionary game theory, this article analyzes the knowledge-sharing process model of this team, studies the influencing mechanism of various factors such as knowledge aggregate gap, incentive coefficient and risk…
Knowledge assistant: A sensor fusion framework for robotic environmental characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feddema, J.T.; Rivera, J.J.; Tucker, S.D.
1996-12-01
A prototype sensor fusion framework called the "Knowledge Assistant" has been developed and tested on a gantry robot at Sandia National Laboratories. This Knowledge Assistant guides the robot operator during the planning, execution, and post-analysis stages of the characterization process. During the planning stage, the Knowledge Assistant suggests robot paths and speeds based on knowledge of the sensors available and their physical characteristics. During execution, the Knowledge Assistant coordinates the collection of data through a data acquisition "specialist." During execution and post-analysis, the Knowledge Assistant sends raw data to other "specialists," which include statistical pattern recognition software, a neural network, and model-based search software. After the specialists return their results, the Knowledge Assistant consolidates the information and returns a report to the robot control system, where the sensed objects and their attributes (e.g. estimated dimensions, weight, material composition, etc.) are displayed in the world model. This paper highlights the major components of this system.
Dutch Research on Knowledge-Based Instructional Systems: Introduction to the Special Issue.
ERIC Educational Resources Information Center
van Merrienboer, Jeroen J. G.
1994-01-01
Provides an overview of this issue that reviews Dutch research concerning knowledge-based instructional systems. Topics discussed include experimental research, conceptual models, design considerations, and guidelines; the design of student diagnostic modules, instructional modules, and interface modules; second-language teaching; intelligent…
Knowledge-based approach for generating target system specifications from a domain model
NASA Technical Reports Server (NTRS)
Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan
1992-01-01
Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.
Corporate knowledge repository: Adopting academic LMS into corporate environment
NASA Astrophysics Data System (ADS)
Bakar, Muhamad Shahbani Abu; Jalil, Dzulkafli
2017-10-01
The growth of the knowledge economy has made human capital the vital asset of business organizations in the 21st century. Arguably, due to its white-collar nature, knowledge-based industry is more favorable than traditional manufacturing business. However, over-dependency on human capital can also be a major challenge, as workers will inevitably leave the company or retire. This situation can create a knowledge gap that may impact the business continuity of the enterprise. Knowledge retention in the corporate environment has attracted much research interest. A Learning Management System (LMS) refers to a system that provides the delivery, assessment and management tools for an organization to handle its knowledge repository. Drawing on the experience of a proven LMS implemented in an academic environment, this paper proposes an LMS model that can be used to enable peer-to-peer knowledge capture and sharing in a knowledge-based organization. Cloud Enterprise Resource Planning (ERP), which refers to an ERP solution in the internet cloud environment, was chosen as the knowledge domain. The complexity of the Cloud ERP business and its knowledge make it very vulnerable to the knowledge retention problem. This paper discusses how a company's essential knowledge can be retained using an LMS derived from the academic environment and adapted to the corporate model.
Sentiment classification technology based on Markov logic networks
NASA Astrophysics Data System (ADS)
He, Hui; Li, Zhigang; Yao, Chongchong; Zhang, Weizhe
2016-07-01
With diverse online media emerging, there is growing concern about the sentiment classification problem. At present, text sentiment classification mainly utilizes supervised machine learning methods, which exhibit a certain domain dependency. On the basis of Markov logic networks (MLNs), this study proposed a cross-domain multi-task text sentiment classification method rooted in transfer learning. Through many-to-one knowledge transfer, labeled text sentiment classification knowledge was successfully transferred into other domains, and the precision of sentiment classification analysis in the text tendency domain was improved. The experimental results revealed the following: (1) the model based on an MLN demonstrated higher precision than the single individual learning plan model; (2) multi-task transfer learning based on Markov logic networks could acquire more knowledge than self-domain learning. The cross-domain text sentiment classification model could significantly improve the precision and efficiency of text sentiment classification.
Modeling Guru: Knowledge Base for NASA Modelers
NASA Astrophysics Data System (ADS)
Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.
2009-05-01
Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration are becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but users must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Users can also add "Tags" to their threads to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the same document, even allowing the author to select those who may edit and approve the document. To maintain knowledge integrity, all documents are moderated before they are visible to the public. Modeling Guru, running on Clearspace by Jive Software, has been an active resource for the NASA modeling and HEC communities for more than a year and currently has more than 100 active users. SIVO will soon install live instant messaging support, as well as a user-customizable homepage with social-networking features. In addition, SIVO plans to implement a large dataset/file storage capability so that users can quickly and easily exchange datasets and files with one another. Continued active community participation combined with periodic software updates and improved features will ensure that Modeling Guru remains a vibrant, effective, easy-to-use tool for the NASA scientific community.
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, comprising over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel WM methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time the electroplating industry can (i) use systematically available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
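As an illustration of the kind of first-principles process dynamic model described for rinsing operations, the following sketch treats a single rinse tank as a well-mixed vessel receiving drag-out from the plating bath. All parameter values and the single-tank layout are assumptions for demonstration, not figures from the dissertation.

```python
# a minimal sketch of a single rinse tank treated as a well-mixed CSTR;
# parameter values are illustrative assumptions
import numpy as np

V = 200.0        # rinse tank volume, L
Q = 8.0          # rinse water flow rate, L/min
D = 0.05         # drag-out flow carried in from the plating bath, L/min
C_drag = 120.0   # metal concentration of the drag-out, g/L

def simulate(minutes, dt=0.1, C0=0.0):
    """Integrate V*dC/dt = D*C_drag - (Q + D)*C with forward Euler."""
    steps = int(minutes / dt)
    C = np.empty(steps)
    C[0] = C0
    for k in range(1, steps):
        dCdt = (D * C_drag - (Q + D) * C[k - 1]) / V
        C[k] = C[k - 1] + dt * dCdt
    return C

C = simulate(240)
print(f"rinse water concentration after 4 h: {C[-1]:.3f} g/L")
```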
Temporal and contextual knowledge in model-based expert systems
NASA Technical Reports Server (NTRS)
Toth-Fejel, Tihamer; Heher, Dennis
1987-01-01
A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.
NASA Astrophysics Data System (ADS)
Ţîţu, M. A.; Pop, A. B.; Ţîţu, Ș
2017-06-01
This paper presents a study on the modelling and optimization of certain variables by using the Taguchi Method, with a view to modelling and optimizing the process of pressing tappets into anchors, a process conducted in an organization that promotes knowledge-based management. The paper promotes practical concepts of the Taguchi Method and describes the way in which the objective functions are obtained and used during the modelling and optimization of the process of pressing tappets into the anchors.
Value-based choice: An integrative, neuroscience-informed model of health goals.
Berkman, Elliot T
2018-01-01
Traditional models of health behaviour focus on the roles of cognitive, personality and social-cognitive constructs (e.g. executive function, grit, self-efficacy), and give less attention to the process by which these constructs interact in the moment that a health-relevant choice is made. Health psychology needs a process-focused account of how various factors are integrated to produce the decisions that determine health behaviour. I present an integrative value-based choice model of health behaviour, which characterises the mechanism by which a variety of factors come together to determine behaviour. This model imports knowledge from research on behavioural economics and neuroscience about how choices are made to the study of health behaviour, and uses that knowledge to generate novel predictions about how to change health behaviour. I describe anomalies in value-based choice that can be exploited for health promotion, and review neuroimaging evidence about the involvement of midline dopamine structures in tracking and integrating value-related information during choice. I highlight how this knowledge can bring insights to health psychology using the illustrative case of healthy eating. Value-based choice is a viable model for health behaviour and opens new avenues for mechanism-focused intervention.
NASA Technical Reports Server (NTRS)
Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt
1993-01-01
This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has successfully progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper continues by giving an overview of the graphical modeling objectives of the work, describing the three tools that now populate the KFP environment, briefly presenting a discussion of related work in the field, and indicating future directions for the KFP environment.
A knowledge representation of local pandemic influenza planning models.
Islam, Runa; Brandeau, Margaret L; Das, Amar K
2007-10-11
Planning for pandemic flu outbreak at the small-government level can be aided through the use of mathematical policy models. Formulating and analyzing policy models, however, can be a time- and expertise-expensive process. We believe that a knowledge-based system for facilitating the instantiation of locale- and problem-specific policy models can reduce some of these costs. In this work, we present the ontology we have developed for pandemic influenza policy models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau
2011-12-01
The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V) and Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure the safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including the Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not be duplicating existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.
Learning to Teach: Pedagogical Content Knowledge in Adventure-Based Learning
ERIC Educational Resources Information Center
Sutherland, Sue; Stuhr, Paul T.; Ayvazo, Shiri
2016-01-01
Background: Many alternative curricular models exist in physical education to better meet the needs of students than the multi-activity team sports curriculum that dominates in the USA. These alternative curricular models typically require different content knowledge (CK) and pedagogical CK (PCK) to implement successfully. One of the complexities…
ERIC Educational Resources Information Center
Stranieri, Andrew; Yearwood, John
2008-01-01
This paper describes a narrative-based interactive learning environment which aims to elucidate reasoning using interactive scenarios that may be used in training novices in decision-making. Its design is based on an approach to generating narrative from knowledge that has been modelled in specific decision/reasoning domains. The approach uses a…
A Method for Cognitive Task Analysis
1992-07-01
A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the...model for applied areas such as the development of knowledge-based systems and training, are discussed. Problem solving, Cognitive Task Analysis, Knowledge, Strategies.
ERIC Educational Resources Information Center
Sikorski, Eric G.; Johnson, Tristan E.; Ruscher, Paul H.
2012-01-01
The purpose of this study was to examine the effects of a shared mental model (SMM) based intervention on student team mental model similarity and ultimately team performance in an undergraduate meteorology course. The team knowledge sharing (TKS) intervention was designed to promote team reflection, communication, and improvement planning.…
Computer Model of the Empirical Knowledge of Physics Formation: Coordination with Testing Results
ERIC Educational Resources Information Center
Mayer, Robert V.
2016-01-01
The use of the method of imitational modeling to study the formation of empirical knowledge in the pupil's consciousness is discussed. The offered model is based on division of the physical facts into three categories: 1) the facts established in everyday life; 2) the facts which the pupil can experimentally establish at a physics lesson; 3) the facts which…
NASA Astrophysics Data System (ADS)
Okuzawa, Yuki; Kato, Shohei; Kanoh, Masayoshi; Itoh, Hidenori
A knowledge-based approach to imitation learning of motion generation for humanoid robots and an imitative motion generation system based on motion knowledge learning and modification are described. The system has three parts: recognizing, learning, and modifying parts. The first part recognizes an instructed motion, distinguishing it from the motion knowledge database by means of a continuous hidden Markov model. When the motion is recognized as unfamiliar, the second part learns it using locally weighted regression and acquires knowledge of the motion. When the robot recognizes the instructed motion as familiar, or judges that its acquired knowledge is applicable to the motion generation, the third part imitates the instructed motion by modifying a learned motion. This paper reports some performance results: the motion imitation of several radio gymnastics motions.
Strengthening community participation in reducing GHG emission from forest and peatland fire
NASA Astrophysics Data System (ADS)
Thoha, A. S.; Saharjo, B. H.; Boer, R.; Ardiansyah, M.
2018-02-01
Strengthening community participation is needed to find solutions that encourage communities to participate more in reducing greenhouse gas (GHG) emissions from forest and peatland fires. This research aimed to identify stakeholders that have a role in forest and peatland fire control and to formulate a strengthening model of community participation through community-based early fire warning. Stakeholder mapping and action research were used to determine the stakeholders with potential influence and interest and to formulate the strengthening model of community participation in reducing GHG from forest and peatland fire. It was found that the key players in the stakeholder mapping came from government institutions. The existence of community-based fire control groups can strengthen government institutions through collaboration with stakeholders having strong interest and influence. Moreover, several forms of local knowledge were found in Kapuas District about how communities predict drought, which have potential value for developing a community-based early warning fire system. The institutional model formulated in this research can also be further developed as a model institution for the preservation of natural resources based on local knowledge. In conclusion, local knowledge and community-based fire groups can be integrated within a strengthening model of community participation in reducing GHG from forest and peatland fire.
Towards an Age-Phenome Knowledge-base
2011-01-01
Background: Currently, data about age-phenotype associations are not systematically organized and cannot be studied methodically. Searching for scientific articles describing phenotypic changes reported as occurring at a given age is not possible for most ages. Results: Here we present the Age-Phenome Knowledge-base (APK), in which knowledge about age-related phenotypic patterns and events can be modeled and stored for retrieval. The APK contains evidence connecting specific ages or age groups with phenotypes, such as disease and clinical traits. Using a simple text mining tool developed for this purpose, we extracted instances of age-phenotype associations from journal abstracts related to non-insulin-dependent Diabetes Mellitus. In addition, links between age and phenotype were extracted from clinical data obtained from the NHANES III survey. The knowledge stored in the APK is made available to the relevant research community in the form of 'Age-Cards', each card holding the collection of all the information stored in the APK about a particular age. These Age-Cards are presented in a wiki, allowing community review, amendment and contribution of additional information. In addition to the wiki interaction, complex searches can also be conducted, which require the user to have some knowledge of database query construction. Conclusions: The combination of a knowledge-model-based repository with community participation in the evolution and refinement of the knowledge-base makes the APK a useful and valuable environment for collecting and curating existing knowledge of the connections between age and phenotypes. PMID:21651792
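A simple text mining step of the kind mentioned above might look like the following sketch, which pulls candidate age ranges out of a sentence with a regular expression; the pattern and the example sentence are illustrative assumptions, not the APK tool itself.

```python
# a minimal sketch of rule-based extraction of age mentions from abstract text;
# the regular expression and the sentence are illustrative placeholders
import re

pattern = re.compile(r"(?:aged?|age of)\s+(\d{1,3})(?:\s*(?:-|to)\s*(\d{1,3}))?\s*years?", re.I)

sentence = ("Impaired glucose tolerance was most frequent in subjects "
            "aged 60 to 74 years in the NHANES III survey.")

for match in pattern.finditer(sentence):
    low, high = match.group(1), match.group(2) or match.group(1)
    print(f"age group {low}-{high}: candidate age-phenotype association found")
```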
Knowledge represented using RDF semantic network in the concept of semantic web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lukasova, A., E-mail: alena.lukasova@osu.cz; Vajgl, M., E-mail: marek.vajgl@osu.cz; Zacek, M., E-mail: martin.zacek@osu.cz
The RDF(S) model has been declared as the basic model to capture knowledge of the semantic web. It provides a common and flexible way to decompose composed knowledge into elementary statements, which can be represented by RDF triples or by RDF graph vectors. From the logical point of view, elements of knowledge can be expressed using at most binary predicates, which can be converted to RDF triples or graph vectors. However, it is not able to capture implicit knowledge representable by logical formulas. This contribution shows how existing approaches (semantic networks and clausal form logic) can be combined together with RDF to obtain an RDF-compatible system with the ability to represent implicit knowledge and inference over a knowledge base.
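The decomposition of composed knowledge into elementary RDF triples can be illustrated with a short rdflib sketch; the namespace and the example statement are illustrative assumptions, not content from the contribution.

```python
# a minimal sketch: one composed statement broken into elementary RDF triples with rdflib;
# the namespace URI and the "bachelor" example are illustrative assumptions
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/kb#")
g = Graph()
g.bind("ex", EX)

# "A bachelor is an unmarried man" decomposed into elementary binary statements
g.add((EX.Bachelor, RDF.type, RDFS.Class))
g.add((EX.Bachelor, RDFS.subClassOf, EX.Man))
g.add((EX.Bachelor, EX.maritalStatus, Literal("unmarried")))

for s, p, o in g:
    print(s, p, o)
```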
The Shrinkage Model And Expert System Of Plastic Lens Formation
NASA Astrophysics Data System (ADS)
Chang, Rong-Seng
1988-06-01
Shrinkage causes both appearance and dimension defects in injected plastic lenses. We have built up a model of state equations, with the help of a finite element analysis program, to estimate the volume change (shrinkage and swelling) under combinations of injection variables such as pressure and temperature; a personal computer expert system has then been built to make that knowledge conveniently available to the user in model design, process planning, process operation and other work. The domain knowledge is represented by an R-graph (relationship graph) model which states the relationships between variables and equations. This model can be compared with other models in the expert system. If the user has a better model for solving the shrinkage problem, the program will evaluate it automatically and a learning file will be triggered by the expert system to teach the user to update the knowledge base and replace the old model with this better model. Rubin's model and Gilmore's model have been input to the expert system. Conflicts are resolved both from the user and from the deeper knowledge base. A cube prism and a convex lens example are shown in this paper. The program is written in the MULISP language on an IBM PC-AT. The natural language facility provides English explanations of know-why and know-how and automatic English translation of the equation rules and the production rules.
Consulting as a Strategy for Knowledge Transfer
Jacobson, Nora; Butterill, Dale; Goering, Paula
2005-01-01
Academic researchers who work on health policy and health services are expected to transfer knowledge to decision makers. Decision makers often do not, however, regard academics’ traditional ways of doing research and disseminating their findings as relevant or useful. This article argues that consulting can be a strategy for transferring knowledge between researchers and decision makers and is effective at promoting the “enlightenment” and “interactive” models of knowledge use. Based on three case studies, it develops a model of knowledge transfer–focused consulting that consists of six stages and four types of work. Finally, the article explores how knowledge is generated in consulting and identifies several classes of factors facilitating its use by decision makers. PMID:15960773
NASA Astrophysics Data System (ADS)
Pennington, D. D.; Vincent, S.
2017-12-01
The NSF-funded project "Employing Model-Based Reasoning in Socio-Environmental Synthesis (EMBeRS)" has developed a generic model for exchanging knowledge across disciplines that is based on findings from the cognitive, learning, social, and organizational sciences addressing teamwork in complex problem solving situations. Two ten-day summer workshops for PhD students from large, NSF-funded interdisciplinary projects working on a variety of water issues were conducted in 2016 and 2017, testing the model by collecting a variety of data, including surveys, interviews, audio/video recordings, material artifacts and documents, and photographs. This presentation will introduce the EMBeRS model, the design of workshop activities based on the model, and results from surveys and interviews with the participating students. Findings suggest that this approach is very effective for developing a shared, integrated research vision across disciplines, compared with the activities typically provided by most large research projects, and that students believe the skills developed in the EMBeRS workshops are unique and highly desirable.
Scalable Learning for Geostatistics and Speaker Recognition
2011-01-01
of prior knowledge of the model or due to improved robustness requirements). Both these methods have their own advantages and disadvantages. The use...application. If the data is well-correlated and low-dimensional, any prior knowledge available on the data can be used to build a parametric model. In the...absence of prior knowledge, non-parametric methods can be used. If the data is high-dimensional, PCA based dimensionality reduction is often the first
Learning Physics-based Models in Hydrology under the Framework of Generative Adversarial Networks
NASA Astrophysics Data System (ADS)
Karpatne, A.; Kumar, V.
2017-12-01
Generative adversarial networks (GANs), which have been highly successful in a number of applications involving large volumes of labeled and unlabeled data such as computer vision, offer huge potential for modeling the dynamics of physical processes that have been traditionally studied using simulations of physics-based models. While conventional physics-based models use labeled samples of input/output variables for model calibration (estimating the right parametric forms of relationships between variables) or data assimilation (identifying the most likely sequence of system states in dynamical systems), there is a greater opportunity to explore the full power of machine learning (ML) methods (e.g., GANs) for studying physical processes currently suffering from large knowledge gaps, e.g. ground-water flow. However, success in this endeavor requires a principled way of combining the strengths of ML methods with physics-based numerical models that are founded on a wealth of scientific knowledge. This is especially important in scientific domains like hydrology where the number of data samples is small (relative to Internet-scale applications such as image recognition where machine learning methods have found great success), and the physical relationships are complex (high-dimensional) and non-stationary. We will present a series of methods for guiding the learning of GANs using physics-based models, e.g., by using the outputs of physics-based models as input data to the generator-learner framework, and by using physics-based models as generators trained using validation data in the adversarial learning framework. These methods are being developed under the broad paradigm of theory-guided data science that we are developing to integrate scientific knowledge with data science methods for accelerating scientific discovery.
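One hedged reading of the first guiding strategy (feeding physics-based model outputs to the generator) is sketched below with PyTorch; the toy "physics model", network sizes and training settings are all illustrative assumptions rather than the authors' actual architecture.

```python
# a minimal sketch, assuming a physics-based simulator whose output is fed to the
# generator as an extra input channel; everything here is a toy stand-in
import torch
import torch.nn as nn

torch.manual_seed(0)
n_feat, n_obs = 4, 256                       # hypothetical feature and sample counts

phys_weights = torch.randn(n_feat, 1)        # stand-in for a numerical simulator
def physics_model(x):
    return x @ phys_weights

generator = nn.Sequential(nn.Linear(n_feat + 1, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

x = torch.randn(n_obs, n_feat)                              # forcing data
y_obs = physics_model(x) + 0.1 * torch.randn(n_obs, 1)      # "observed" responses

for epoch in range(200):
    # generator sees the physics-model output as an additional input
    y_fake = generator(torch.cat([x, physics_model(x)], dim=1))

    # discriminator update: real observations vs. generated responses
    opt_d.zero_grad()
    loss_d = bce(discriminator(y_obs), torch.ones(n_obs, 1)) + \
             bce(discriminator(y_fake.detach()), torch.zeros(n_obs, 1))
    loss_d.backward()
    opt_d.step()

    # generator update: try to fool the discriminator
    opt_g.zero_grad()
    loss_g = bce(discriminator(y_fake), torch.ones(n_obs, 1))
    loss_g.backward()
    opt_g.step()
```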
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
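A ready-to-adapt predictive model of the kind such a knowledge base would hold could be as simple as a log-linear inactivation curve; the sketch below is illustrative only, and the D-value is an assumed placeholder, not a reviewed ricin parameter.

```python
# a minimal sketch of a log-linear (first-order) inactivation model of the kind a
# food safety knowledge base might store; the D-value is an illustrative assumption
import math

def residual_fraction(time_min, D_value_min):
    """log10(N/N0) = -t/D  ->  surviving (active) fraction after t minutes of processing."""
    return 10 ** (-time_min / D_value_min)

D = 15.0   # hypothetical decimal reduction time at the process temperature
for t in (5, 15, 30, 60):
    print(f"t = {t:2d} min -> active toxin fraction {residual_fraction(t, D):.4f}")
```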
Gainotti, Guido
2011-04-01
In recent years, the anatomical and functional bases of conceptual activity have attracted growing interest. In particular, Patterson and Lambon-Ralph have proposed the existence, in the anterior parts of the temporal lobes, of a mechanism (the 'amodal semantic hub') supporting the interactive activation of semantic representations in all modalities and for all semantic categories. The aim of the present paper is to discuss this model, arguing against the notion of an 'amodal' semantic hub, because we maintain, in agreement with Damasio's construct of a 'higher-order convergence zone', that a continuum exists between perceptual information and conceptual representations, whereas the 'amodal' account views perceptual information only as a channel through which abstract semantic knowledge can be activated. According to our model, semantic organization can be better explained by two orthogonal higher-order convergence systems, concerning, on the one hand, the right vs. left hemisphere and, on the other hand, the ventral vs. dorsal processing pathways. This model posits that conceptual representations may be mainly based upon perceptual activities in the right hemisphere and upon verbal mediation in the left side of the brain. It also assumes that conceptual knowledge based on the convergence of highly processed visual information with other perceptual data (and mainly concerning living categories) may be bilaterally represented in the anterior parts of the temporal lobes, whereas knowledge based on the integration of visual data with action schemata (namely knowledge of actions, body parts and artefacts) may be more represented in the left fronto-temporo-parietal areas. Copyright © 2010 Elsevier Inc. All rights reserved.
SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.
Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi
2010-01-01
Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.
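SPARK itself is written in Java with its own modeling language, but the flavor of an agent-based model on continuous space can be conveyed with a minimal Python sketch; the agents, damage rule and parameters below are illustrative assumptions, not one of the published SPARK models.

```python
# a minimal agent-based sketch: cells random-walk on a continuous 2-D space and die
# near an inflamed wound centre; all rules and parameters are illustrative
import random

random.seed(1)

class Cell:
    def __init__(self):
        self.x, self.y = random.uniform(0, 10), random.uniform(0, 10)
        self.alive = True

    def step(self, inflammation_radius):
        # random walk in continuous space, clipped to the domain
        self.x = min(10, max(0, self.x + random.gauss(0, 0.2)))
        self.y = min(10, max(0, self.y + random.gauss(0, 0.2)))
        # crude damage rule: die inside the inflamed region around the centre (5, 5)
        if ((self.x - 5) ** 2 + (self.y - 5) ** 2) ** 0.5 < inflammation_radius:
            self.alive = False

cells = [Cell() for _ in range(200)]
for t in range(50):
    inflammation_radius = 2.0 * (1 - t / 50)   # inflammation decays over time
    for c in cells:
        if c.alive:
            c.step(inflammation_radius)

print("surviving cells:", sum(c.alive for c in cells))
```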
Towards an Intelligent Planning Knowledge Base Development Environment
NASA Technical Reports Server (NTRS)
Chien, S.
1994-01-01
This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) for image processing requests made to the JPL Multimission Image Processing Laboratory.
USDA-ARS?s Scientific Manuscript database
Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium...
Web-video-mining-supported workflow modeling for laparoscopic surgeries.
Liu, Rui; Zhang, Xiaoli; Zhang, Hao
2016-11-01
As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations are often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy surgery. The generated workflow was evaluated with 4 web-retrieved videos and 4 operating-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
Method of Testing and Predicting Failures of Electronic Mechanical Systems
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, Frances A.
1996-01-01
A method employing a knowledge base of human expertise comprising a reliability model analysis implemented for diagnostic routines is disclosed. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise information that is used to build automatic diagnostic routines and which provides a knowledge base that can be used to solve other artificial intelligence problems.
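A minimal sketch of digraph-based fault reasoning is given below using networkx; the component and event names are hypothetical, and the root-cause query is only one simple way such a digraph model could feed a diagnostic routine.

```python
# a minimal sketch of digraph fault propagation; node names and edges are hypothetical
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("valve_stuck_closed", "no_coolant_flow"),
    ("pump_failure", "no_coolant_flow"),
    ("no_coolant_flow", "high_core_temp"),
    ("sensor_bias", "high_core_temp_reading"),
    ("high_core_temp", "high_core_temp_reading"),
])

target_event = "high_core_temp_reading"
# candidate root causes: failure sources (no incoming edges) that can reach the target event
candidates = [n for n in g.nodes
              if g.in_degree(n) == 0 and g.out_degree(n) > 0
              and nx.has_path(g, n, target_event)]
print("possible root causes:", candidates)
```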
Ferrada, Evandro; Vergara, Ismael A; Melo, Francisco
2007-01-01
The correct discrimination between native and near-native protein conformations is essential for achieving accurate computer-based protein structure prediction. However, this has proven to be a difficult task, since currently available physical energy functions, empirical potentials and statistical scoring functions are still limited in achieving this goal consistently. In this work, we assess and compare the ability of different full atom knowledge-based potentials to discriminate between native protein structures and near-native protein conformations generated by comparative modeling. Using a benchmark of 152 near-native protein models and their corresponding native structures that encompass several different folds, we demonstrate that the incorporation of close non-bonded pairwise atom terms improves the discriminating power of the empirical potentials. Since the direct and unbiased derivation of close non-bonded terms from current experimental data is not possible, we obtained and used those terms from the corresponding pseudo-energy functions of a non-local knowledge-based potential. It is shown that this methodology significantly improves the discrimination between native and near-native protein conformations, suggesting that a proper description of close non-bonded terms is important to achieve a more complete and accurate description of native protein conformations. Some external knowledge-based energy functions that are widely used in model assessment performed poorly, indicating that the benchmark of models and the specific discrimination task tested in this work constitutes a difficult challenge.
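The general idea of a distance-dependent knowledge-based potential can be sketched with the inverse Boltzmann relation; the synthetic counts, uniform reference state and unit kT below are assumptions for illustration, not the potentials assessed in the paper.

```python
# a minimal sketch of a distance-dependent pairwise statistical potential via the
# inverse Boltzmann relation; counts and reference state are synthetic placeholders
import numpy as np

bins = np.arange(2.0, 12.0, 0.5)                                      # distance bins (Angstrom)
observed = np.random.default_rng(0).integers(5, 50, size=len(bins))   # pair counts in natives
reference = np.full(len(bins), observed.mean())                       # uniform reference state

# E(r) = -kT * ln( f_obs(r) / f_ref(r) ), with kT = 1 in arbitrary units
energy = -np.log(observed / observed.sum()) + np.log(reference / reference.sum())

def score(distances):
    """Sum the pseudo-energy over the non-bonded atom-pair distances of a model."""
    idx = np.clip(np.digitize(distances, bins) - 1, 0, len(bins) - 1)
    return energy[idx].sum()

print(score(np.array([3.1, 4.7, 6.2, 9.8])))   # lower score = more native-like
```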
Character and Local Wisdom-Based Instructional Model of Bahasa Indonesia in Vocational High Schools
ERIC Educational Resources Information Center
Anggraini, Purwati; Kusniarti, Tuti
2017-01-01
This research aimed at establishing a character and local wisdom-based instructional model of Bahasa Indonesia. The learning model based on local wisdom literature is very important to prepare, because this model can enrich the knowledge and develop the character of students. Meanwhile, the textbook can broaden the student teachers about the…
Enhancements to the KATE model-based reasoning system
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1994-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. This report describes two software efforts which enhance the functionality and usability of KATE. The first addition, a flow solver, adds to KATE a tool for modeling the flow of liquid in a pipe system. The second addition adds support for editing KATE knowledge base files to the Emacs editor. The body of this report discusses design and implementation issues having to do with these two tools. It will be useful to anyone maintaining or extending either the flow solver or the editor enhancements.
Demonopolizing medical knowledge.
Arora, Sanjeev; Thornton, Karla; Komaromy, Miriam; Kalishman, Summers; Katzman, Joanna; Duhigg, Daniel
2014-01-01
In the past 100 years, there has been an explosion of medical knowledge, and in the next 50 years more medical knowledge will be available than ever before. Regrettably, current medical practice has been unable to keep pace with this explosion of medical knowledge. Specialized medical knowledge has been confined largely to academic medical centers (i.e., teaching hospitals) and to specialists in major cities; it has been disconnected from primary care clinicians on the front lines of patient care. To bridge this disconnect, medical knowledge must be demonopolized, and a platform for collaborative practice amongst all clinicians needs to be created. A new model of health care and education delivery called Project ECHO (Extension for Community Healthcare Outcomes), developed by the first author, does just this. Using videoconferencing technology and case-based learning, ECHO's medical specialists provide training and mentoring to primary care clinicians working in rural and urban underserved areas so that the latter can deliver the best evidence-based care to patients with complex health conditions in their own communities. The ECHO model increases access to care in rural and underserved areas, and it demonopolizes specialized medical knowledge and expertise.
Chang, Hui-Chin; Wang, Ning-Yen; Ko, Wen-Ru; Yu, You-Tsz; Lin, Long-Yau; Tsai, Hui-Fang
2017-06-01
The effective education method for teaching medico-jurisprudence to medical students is unclear. This study was designed to evaluate the effectiveness of a problem-based learning (PBL) model for teaching medico-jurisprudence in a clinical setting on General Law Knowledge (GLK) for medical students. Senior medical students attending either a campus-based law curriculum or an Obstetrics/Gynecology (Ob/Gyn) clinical setting morning meeting from February to July 2015 were enrolled. A validated questionnaire comprising 45 questions was completed before and after the law education. The interns attending the clinical setting small-group improvisation medico-jurisprudence problem-based learning education had significantly better GLK scores than students attending the campus-based medical law education course after the period studied. The PBL teaching model of medico-jurisprudence is an ideal alternative pedagogical model for the medical law education curriculum. Copyright © 2017. Published by Elsevier B.V.
The knowledge-value chain: A conceptual framework for knowledge translation in health.
Landry, Réjean; Amara, Nabil; Pablos-Mendes, Ariel; Shademani, Ramesh; Gold, Irving
2006-01-01
This article briefly discusses knowledge translation and lists the problems associated with it. Then it uses knowledge-management literature to develop and propose a knowledge-value chain framework in order to provide an integrated conceptual model of knowledge management and application in public health organizations. The knowledge-value chain is a non-linear concept and is based on the management of five dyadic capabilities: mapping and acquisition, creation and destruction, integration and sharing/transfer, replication and protection, and performance and innovation. PMID:16917645
FGMReview: design of a knowledge management tool on female genital mutilation.
Martínez Pérez, Guillermo; Turetsky, Risa
2015-11-01
Web-based literature search engines may not be user-friendly for some readers searching for information on female genital mutilation. This is a traditional practice that has no health benefits, and about 140 million girls and women worldwide have undergone it. In 2012, the website FGMReview was created with the aim to offer a user-friendly, accessible, scalable, and innovative knowledge management tool specialized in female genital mutilation. The design of this website was guided by a conceptual model based on the use of benchmarking techniques and requirements engineering, an area of knowledge from the computer informatics field, influenced by the Transcultural Nursing model. The purpose of this article is to describe this conceptual model. Nurses and other health care providers can use this conceptual model to guide their methodological approach to design and launch other eHealth projects. © The Author(s) 2014.
A model for indexing medical documents combining statistical and symbolic knowledge.
Avillach, Paul; Joubert, Michel; Fieschi, Marius
2007-10-11
To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of cases. The use of several terminologies leads to more precise indexing. The improvement achieved in the model's performance as a result of using semantic relationships is encouraging.
Universities and the Knowledge-Based Economy: Perceptions from a Developing Country
ERIC Educational Resources Information Center
Bano, Shah; Taylor, John
2015-01-01
This paper considers the role of universities in the creation of a knowledge-based economy (KBE) in a developing country, Pakistan. Some developing countries have moved quickly to develop a KBE, but progress in Pakistan is much slower. Higher education plays a crucial role as part of the triple helix model for innovation. Based on the perceptions…
Corzo, Gerald; Solomatine, Dimitri
2007-05-01
Natural phenomena are multistationary and are composed of a number of interacting processes, so one single model handling all processes often suffers from inaccuracies. A solution is to partition data in relation to such processes using the available domain knowledge or expert judgment, to train separate models for each of the processes, and to merge them in a modular model (committee). In this paper a problem of water flow forecast in watershed hydrology is considered where the flow process can be presented as consisting of two subprocesses -- base flow and excess flow, so that these two processes can be separated. Several approaches to data separation techniques are studied. Two case studies with different forecast horizons are considered. Parameters of the algorithms responsible for data partitioning are optimized using genetic algorithms and global pattern search. It was found that modularization of ANN models using domain knowledge makes models more accurate, if compared with a global model trained on the whole data set, especially when forecast horizon (and hence the complexity of the modelled processes) is increased.
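One common way to perform the base flow / excess flow partitioning discussed above is a recursive digital filter; the sketch below uses a Lyne-Hollick-style filter with an assumed alpha and a synthetic hydrograph, and is only one of several separation options such a study might consider.

```python
# a minimal sketch of base flow separation with a Lyne-Hollick-style recursive filter;
# alpha and the synthetic hydrograph are illustrative assumptions
import numpy as np

def lyne_hollick(q, alpha=0.925):
    """One forward pass of the filter; returns (baseflow, quickflow)."""
    qf = np.zeros_like(q)                     # filtered quick-flow component
    for t in range(1, len(q)):
        qf[t] = alpha * qf[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        qf[t] = min(max(qf[t], 0.0), q[t])    # keep the component physically plausible
    return q - qf, qf

q = np.array([5, 5, 6, 20, 45, 30, 18, 11, 8, 6, 5, 5], dtype=float)   # synthetic hydrograph
qb, qf = lyne_hollick(q)
# each component would then be forecast by its own ANN in the modular (committee) model
print("baseflow: ", np.round(qb, 1))
print("quickflow:", np.round(qf, 1))
```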
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, SP; Quon, H; Cheng, Z
2015-06-15
Purpose: To extend the capabilities of knowledge-based treatment planning beyond simple dose queries by incorporating validated patient outcome models. Methods: From an analytic, relational database of 684 head and neck cancer patients, 372 patients were identified having dose data for both left and right parotid glands as well as baseline and follow-up xerostomia assessments. For each existing patient, knowledge-based treatment planning was simulated by querying the dose-volume histograms and geometric shape relationships (overlap volume histograms) for all other patients. Dose predictions were captured at normalized volume thresholds (NVT) of 0%, 10%, 20%, 30%, 40%, 50%, and 85% and were compared with the actual achieved doses using the Wilcoxon signed-rank test. Next, a logistic regression model was used to predict the maximum severity of xerostomia up to three months following radiotherapy. Baseline xerostomia scores were subtracted from follow-up assessments and were also included in the model. The relative risks from predicted doses and actual doses were computed and compared. Results: The predicted doses for both parotid glands were significantly less than the achieved doses (p < 0.0001), with differences ranging from 830 cGy ± 1270 cGy (0% NVT) to 1673 cGy ± 1197 cGy (30% NVT). The modelled risk of xerostomia ranged from 54% to 64% for achieved doses and from 33% to 51% for the dose predictions. Relative risks varied from 1.24 to 1.87, with the maximum relative risk occurring at 85% NVT. Conclusions: Data-driven generation of treatment planning objectives without consideration of the underlying normal tissue complication probability may result in inferior plans, even if quality metrics indicate otherwise. Inclusion of complication models in knowledge-based treatment planning is necessary in order to close the feedback loop between radiotherapy treatments and patient outcomes. Future work includes advancing and validating complication models in the context of knowledge-based treatment planning. This work is supported by Philips Radiation Oncology Systems.
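The coupling of a logistic outcome model to knowledge-based dose predictions can be sketched as follows; the coefficients and example doses are illustrative assumptions, not the fitted values from this study.

```python
# a minimal sketch of a logistic dose-response model driving a relative-risk comparison;
# all coefficients and doses are illustrative assumptions
import math

def xerostomia_risk(mean_parotid_dose_gy, baseline_score, b0=-3.0, b_dose=0.10, b_base=0.8):
    """P(xerostomia) = 1 / (1 + exp(-(b0 + b_dose*dose + b_base*baseline)))."""
    z = b0 + b_dose * mean_parotid_dose_gy + b_base * baseline_score
    return 1.0 / (1.0 + math.exp(-z))

achieved, predicted = 32.0, 22.0      # mean parotid doses in Gy (illustrative)
risk_achieved = xerostomia_risk(achieved, baseline_score=0)
risk_predicted = xerostomia_risk(predicted, baseline_score=0)
print(f"relative risk, achieved vs. knowledge-based plan: {risk_achieved / risk_predicted:.2f}")
```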
ERIC Educational Resources Information Center
Parnafes, Orit
2012-01-01
This article presents a theoretical model of the process by which students construct and elaborate explanations of scientific phenomena using visual representations. The model describes progress in the underlying conceptual processes in students' explanations as a reorganization of fine-grained knowledge elements based on the Knowledge in Pieces…
NASA Astrophysics Data System (ADS)
Ban, Sang-Woo; Lee, Minho
2008-04-01
Knowledge-based clustering and autonomous mental development remain high-priority research topics, in which neural network learning techniques are used to achieve optimal performance. In this paper, we present a new framework that can automatically generate a relevance map from sensory data that can represent knowledge regarding objects and infer new knowledge about novel objects. The proposed model is based on an understanding of the visual 'what' pathway in our brain. A stereo saliency map model can selectively decide salient object areas by additionally considering a local symmetry feature. The incremental object perception model makes clusters for the construction of an ontology map in the color and form domains in order to perceive an arbitrary object, which is implemented by the growing fuzzy topology adaptive resonance theory (GFTART) network. Log-polar transformed color and form features for a selected object are used as inputs of the GFTART. The clustered information is relevant for describing specific objects, and the proposed model can automatically infer an unknown object by using the learned information. Experimental results with real data have demonstrated the validity of this approach.
ERIC Educational Resources Information Center
Davidowitz, Bette; Potgieter, Marietjie
2016-01-01
Research has shown that a high level of content knowledge (CK) is necessary but not sufficient to develop the special knowledge base of expert teachers known as pedagogical content knowledge (PCK). This study contributes towards research to quantify the relationship between CK and PCK in science. In order to determine the proportion of the…
Inter-firm Networks, Organizational Learning and Knowledge Updating: An Empirical Study
NASA Astrophysics Data System (ADS)
Zhang, Su-rong; Wang, Wen-ping
In the era of the knowledge-based economy, in which information technology develops rapidly, the rate of knowledge updating has become a critical factor for enterprises in gaining competitive advantage. We therefore build an interactional theoretical model relating inter-firm networks, organizational learning and knowledge updating, and demonstrate it with an empirical study. The result shows that inter-firm networks and organizational learning are the sources of knowledge updating.
Comprehensible knowledge model creation for cancer treatment decision making.
Afzal, Muhammad; Hussain, Maqbool; Ali Khan, Wajahat; Ali, Taqdir; Lee, Sungyoung; Huh, Eui-Nam; Farooq Ahmad, Hafiz; Jamshed, Arif; Iqbal, Hassan; Irfan, Muhammad; Abbas Hydari, Manzar
2017-03-01
A wealth of clinical data exists in clinical documents in the form of electronic health records (EHRs). This data can be used for developing knowledge-based recommendation systems that can assist clinicians in clinical decision making and education. One of the big hurdles in developing such systems is the lack of automated mechanisms for knowledge acquisition to enable and educate clinicians in informed decision making. An automated knowledge acquisition methodology with a comprehensible knowledge model for cancer treatment (CKM-CT) is proposed. With the CKM-CT, clinical data are acquired automatically from documents. Quality of data is ensured by correcting errors and transforming various formats into a standard data format. Data preprocessing involves dimensionality reduction and missing value imputation. Predictive algorithm selection is performed on the basis of the ranking score of the weighted sum model. The knowledge builder prepares knowledge for knowledge-based services: clinical decisions and education support. Data is acquired from 13,788 head and neck cancer (HNC) documents for 3447 patients, including 1526 patients of the oral cavity site. In the data quality task, 160 staging values are corrected. In the preprocessing task, 20 attributes and 106 records are eliminated from the dataset. The Classification and Regression Trees (CRT) algorithm is selected and provides 69.0% classification accuracy in predicting HNC treatment plans, consisting of 11 decision paths that yield 11 decision rules. Our proposed methodology, CKM-CT, is helpful to find hidden knowledge in clinical documents. In CKM-CT, the prediction models are developed to assist and educate clinicians for informed decision making. The proposed methodology is generalizable to apply to data of other domains such as breast cancer with a similar objective to assist clinicians in decision making and education. Copyright © 2017 Elsevier Ltd. All rights reserved.
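The weighted sum model used for predictive algorithm selection can be sketched in a few lines; the candidate algorithms, criteria and weights below are illustrative assumptions rather than the values used in CKM-CT.

```python
# a minimal sketch of algorithm selection by a weighted sum model; the criteria,
# weights, and per-criterion scores are illustrative placeholders
criteria_weights = {"accuracy": 0.5, "interpretability": 0.3, "training_time": 0.2}

# normalized scores per criterion (higher is better), purely illustrative
candidates = {
    "CRT":        {"accuracy": 0.69, "interpretability": 0.9, "training_time": 0.8},
    "SVM":        {"accuracy": 0.72, "interpretability": 0.3, "training_time": 0.5},
    "NaiveBayes": {"accuracy": 0.63, "interpretability": 0.7, "training_time": 0.9},
}

def wsm_score(scores):
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

ranking = sorted(candidates, key=lambda a: wsm_score(candidates[a]), reverse=True)
print(ranking)   # the top-ranked algorithm is used to build the treatment knowledge model
```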
Habitat classification modeling with incomplete data: Pushing the habitat envelope
Zarnetske, P.L.; Edwards, T.C.; Moisen, Gretchen G.
2007-01-01
Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical techniques for HCMs. One option is to generate pseudo-absence points so that the many available statistical modeling tools can be used. Traditional techniques generate pseudo-absence points at random across broadly defined species ranges, often failing to include biological knowledge concerning the species-habitat relationship. We incorporated biological knowledge of the species-habitat relationship into pseudo-absence points by creating habitat envelopes that constrain the region from which points were randomly selected. We define a habitat envelope as an ecological representation of a species, or species feature's (e.g., nest) observed distribution (i.e., realized niche) based on a single attribute, or the spatial intersection of multiple attributes. We created HCMs for Northern Goshawk (Accipiter gentilis atricapillus) nest habitat during the breeding season across Utah forests with extant nest presence points and ecologically based pseudo-absence points using logistic regression. Predictor variables were derived from 30-m USDA Landfire and 250-m Forest Inventory and Analysis (FIA) map products. These habitat-envelope-based models were then compared to null envelope models which use traditional practices for generating pseudo-absences. Models were assessed for fit and predictive capability using metrics such as kappa, threshold-independent receiver operating characteristic (ROC) plots, adjusted deviance (D²adj), and cross-validation, and were also assessed for ecological relevance. For all cases, habitat-envelope-based models outperformed null envelope models and were more ecologically relevant, suggesting that incorporating biological knowledge into pseudo-absence point generation is a powerful tool for species habitat assessments. Furthermore, given some a priori knowledge of the species-habitat relationship, ecologically based pseudo-absence points can be applied to any species, ecosystem, data resolution, and spatial extent. © 2007 by the Ecological Society of America.
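A compact sketch of envelope-constrained pseudo-absence generation followed by a logistic-regression HCM is given below; the two synthetic predictors, the percentile envelope, and the outside-the-envelope sampling rule are illustrative assumptions, not the Landfire/FIA variables or the exact procedure of the study.

```python
# a minimal sketch: constrain pseudo-absence sampling with a habitat envelope built
# from presence points, then fit a logistic-regression HCM; all data are synthetic
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# presence points: (canopy_cover_pct, elevation_m) at nest sites (synthetic)
presence = np.column_stack([rng.normal(65, 8, 100), rng.normal(2400, 150, 100)])

# habitat envelope: 1st-99th percentile of each attribute over the presences
lo, hi = np.percentile(presence, 1, axis=0), np.percentile(presence, 99, axis=0)

# pseudo-absences drawn from candidate forested locations outside the envelope
candidates = np.column_stack([rng.uniform(10, 95, 2000), rng.uniform(1500, 3200, 2000)])
outside = ~np.all((candidates >= lo) & (candidates <= hi), axis=1)
absence = candidates[outside][:100]

X = np.vstack([presence, absence])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(absence))])
model = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", model.score(X, y))
```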
1994-09-30
relational versus object oriented DBMS, knowledge discovery, data models, metadata, data filtering, clustering techniques, and synthetic data. A secondary...The first was the investigation of AI/ES applications (knowledge discovery, data mining, and clustering). Here CAST collaborated with Dr. Fred Petry...knowledge discovery system based on clustering techniques; implemented an on-line data browser to the DBMS; completed preliminary efforts to apply object
NASA Astrophysics Data System (ADS)
Murtazina, M. Sh; Avdeenko, T. V.
2018-05-01
The state of the art and the progress in the application of semantic technologies in the field of scientific and research activity have been analyzed. Even an elementary empirical comparison has shown that semantic search engines are superior in all respects to conventional search technologies. However, semantic information technologies are insufficiently used in the field of scientific and research activity in Russia. In the present paper, an approach to the construction of an ontological model of a knowledge base is proposed. The ontological model is based on the upper-level ontology and the RDF mechanism for linking several domain ontologies. The ontological model is implemented in the Protégé environment.
Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris
2016-07-08
This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) BACKGROUND: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) METHODS: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) RESULTS: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) CONCLUSION: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database.
Engineers' Non-Scientific Models in Technology Education
ERIC Educational Resources Information Center
Norstrom, Per
2013-01-01
Engineers commonly use rules, theories and models that lack scientific justification. Examples include rules of thumb based on experience, but also models based on obsolete science or folk theories. Centrifugal forces, heat and cold as substances, and sucking vacuum all belong to the latter group. These models contradict scientific knowledge, but…
Elicitation of neurological knowledge with argument-based machine learning.
Groznik, Vida; Guid, Matej; Sadikov, Aleksander; Možina, Martin; Georgiev, Dejan; Kragelj, Veronika; Ribarič, Samo; Pirtošek, Zvezdan; Bratko, Ivan
2013-02-01
The paper describes the use of an expert's knowledge in practice and the efficiency of a recently developed technique called argument-based machine learning (ABML) in the knowledge elicitation process. We are developing a neurological decision support system to help the neurologists differentiate between three types of tremors: Parkinsonian, essential, and mixed tremor (comorbidity). The system is intended to act as a second opinion for the neurologists, and most importantly to help them reduce the number of patients in the "gray area" that require a very costly further examination (DaTSCAN). We strive to elicit comprehensible and medically meaningful knowledge in such a way that it does not come at the cost of diagnostic accuracy. To alleviate the difficult problem of knowledge elicitation from data and domain experts, we used ABML. ABML guides the expert to explain critical special cases which cannot be handled automatically by machine learning. This very efficiently reduces the expert's workload, and combines the expert's knowledge with learning data. 122 patients were enrolled in the study. The classification accuracy of the final model was 91%. Equally important, the initial and the final models were also evaluated for their comprehensibility by the neurologists. All 13 rules of the final model were deemed appropriate, allowing the system to support its decisions with good explanations. The paper demonstrates ABML's advantage in combining machine learning and expert knowledge. The accuracy of the system is very high with respect to the current state-of-the-art in clinical practice, and the system's knowledge base is assessed to be very consistent from a medical point of view. This opens up the possibility to use the system also as a teaching tool. Copyright © 2012 Elsevier B.V. All rights reserved.
Neural networks for satellite remote sensing and robotic sensor interpretation
NASA Astrophysics Data System (ADS)
Martens, Siegfried
Remote sensing of forests and robotic sensor fusion can be viewed, in part, as supervised learning problems, mapping from sensory input to perceptual output. This dissertation develops ARTMAP neural networks for real-time category learning, pattern recognition, and prediction tailored to remote sensing and robotics applications. Three studies are presented. The first two use ARTMAP to create maps from remotely sensed data, while the third uses an ARTMAP system for sensor fusion on a mobile robot. The first study uses ARTMAP to predict vegetation mixtures in the Plumas National Forest based on spectral data from the Landsat Thematic Mapper satellite. While most previous ARTMAP systems have predicted discrete output classes, this project develops new capabilities for multi-valued prediction. On the mixture prediction task, the new network is shown to perform better than maximum likelihood and linear mixture models. The second remote sensing study uses an ARTMAP classification system to evaluate the relative importance of spectral and terrain data for map-making. This project has produced a large-scale map of remotely sensed vegetation in the Sierra National Forest. Network predictions are validated with ground truth data, and maps produced using the ARTMAP system are compared to a map produced by human experts. The ARTMAP Sierra map was generated in an afternoon, while the labor intensive expert method required nearly a year to perform the same task. The robotics research uses an ARTMAP system to integrate visual information and ultrasonic sensory information on a B14 mobile robot. The goal is to produce a more accurate measure of distance than is provided by the raw sensors. ARTMAP effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The maps produced point to specific problems of raw sensory information processing and demonstrate the benefits of using a neural network system for sensor fusion.
Cognitive foundations for model-based sensor fusion
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.; Weijers, Bertus; Mutz, Chris W.
2003-08-01
Target detection, tracking, and sensor fusion are complicated problems, which usually are performed sequentially. First detecting targets, then tracking, then fusing multiple sensors reduces computations. This procedure, however, is inapplicable to difficult targets which cannot be reliably detected using individual sensors, on individual scans or frames. In such more complicated cases one has to perform the functions of fusing, tracking, and detecting concurrently. This often has led to prohibitive combinatorial complexity and, as a consequence, to sub-optimal performance as compared to the information-theoretic content of all the available data. It is well appreciated that in this task the human mind is by far superior qualitatively to existing mathematical methods of sensor fusion; however, the human mind is limited in the amount of information and speed of computation it can cope with. Therefore, research efforts have been devoted toward incorporating "biological lessons" into smart algorithms, yet success has been limited. Why is this so, and how can existing limitations be overcome? The fundamental reasons for current limitations are analyzed and a potentially breakthrough research and development effort is outlined. We utilize the way our mind combines emotions and concepts in the thinking process and present a mathematical approach to accomplishing this on current-technology computers. The presentation will summarize the difficulties encountered by intelligent systems over the last 50 years related to combinatorial complexity, analyze the fundamental limitations of existing algorithms and neural networks, and relate them to the type of logic underlying the computational structure: formal, multivalued, and fuzzy logic. A new concept of dynamic logic will be introduced along with algorithms capable of pulling together all the available information from multiple sources. This new mathematical technique, like our brain, combines conceptual understanding with emotional evaluation and overcomes the combinatorial complexity of concurrent fusion, tracking, and detection. The presentation will discuss examples of performance, where computational speedups of many orders of magnitude were attained, leading to performance improvements of up to 10 dB (and better).
Developing Guided Inquiry-Based Student Lab Worksheet for Laboratory Knowledge Course
NASA Astrophysics Data System (ADS)
Rahmi, Y. L.; Novriyanti, E.; Ardi, A.; Rifandi, R.
2018-04-01
The course of laboratory knowledge is an introductory course that prepares biology students for the various practical classes in the biology laboratory. At present, learning activities in the laboratory knowledge course at the Biology Department, Universitas Negeri Padang, are not supported by learning media such as a student lab worksheet. The guided inquiry learning model is one of the learning models that can be integrated into laboratory activity. The study aimed to produce a student lab worksheet based on guided inquiry for the laboratory knowledge course and to determine the validity of the lab worksheet. The research was conducted using the research and development (R&D) model. The instruments used for data collection in this research were a questionnaire for student needs analysis and a questionnaire to measure the validity of the student lab worksheet. The quantitative data were obtained from several validators, consisting of three lecturers. The validity percentage of the student lab worksheet was 94.18, which can be categorized as very good.
Sketching for Knowledge Capture: A Progress Report
2002-01-16
understanding, qualitative modeling, knowledge acquisition, analogy, diagrammatic reasoning, spatial reasoning. INTRODUCTION Sketching is often used...main limits of sKEA's expressivity are (a) the predicate vocabulary in its knowledge base and (b) how natural it is to express a piece of information...
Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon
2018-01-01
Background With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. Objective This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. Methods We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. Results The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. Conclusions In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last Sentence (BMKC_LS) (together referred to as BioMedical Knowledge Comprehension) using the PubMed corpus. The experimental results showed that the performance of our model is much higher than that of humans. We observed that our model performed consistently better regardless of the degree of difficulty of a text, whereas humans have difficulty when performing biomedical literature comprehension tasks that require expert level knowledge. PMID:29305341
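The BMKC datasets described above are cloze-style: an entity is masked in the title (or last sentence) and must be recovered from the abstract. A toy sketch of that construction idea follows; the entity list and example text are invented, and the paper's actual pipeline (entity typing, large-scale PubMed processing) is not reproduced here.

```python
# Toy cloze-question generator: mask one known entity in a title so the answer
# must be recovered from the abstract body. The entity list is illustrative only.
KNOWN_ENTITIES = ["EGFR", "gefitinib", "lung cancer"]

def make_cloze(title, abstract):
    for entity in KNOWN_ENTITIES:
        if entity in title and entity in abstract:
            return {"question": title.replace(entity, "_____"),
                    "context": abstract,
                    "answer": entity}
    return None  # no maskable entity shared by title and abstract

example = make_cloze(
    "Response to gefitinib in EGFR-mutant lung cancer",
    "We studied EGFR mutations and response to gefitinib in lung cancer patients.",
)
print(example["question"], "->", example["answer"])
```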
A Knowledge Management Model for Firms in the Financial Services Industry
ERIC Educational Resources Information Center
Held, Carsten; Duncan, Glen; Yanamandram, Venkat
2013-01-01
The financial services industry faces many demanding challenges. Firms within this industry are predominantly knowledge-based, as are most of the industry's products, processes and services. The application of knowledge management represents a clear opportunity for financial services firms to confront challenges. However, no industry specific…
PBL and the Postmodern Condition--Knowledge Production in University Education
ERIC Educational Resources Information Center
Ravn, Ole; Jensen, Annie Aarup
2016-01-01
In this article we discuss the contemporary conditions for running the Aalborg Problem-Based Learning (PBL) model. We try to pinpoint key characteristics of these conditions, emphasising Lyotard's conception of knowledge production, referred to as the move towards a postmodern condition for knowledge. Through discussions of this alleged condition…
NASA Technical Reports Server (NTRS)
Hill, Randall W., Jr.
1990-01-01
The issues of knowledge representation and control in hypermedia-based training environments are discussed. The main objective is to integrate the flexible presentation capability of hypermedia with a knowledge-based approach to lesson discourse management. The instructional goals and their associated concepts are represented in a knowledge representation structure called a 'concept network'. Its functional usages are many: it is used to control the navigation through a presentation space, generate tests for student evaluation, and model the student. This architecture was implemented in HyperCLIPS, a hybrid system that creates a bridge between HyperCard, a popular hypertext-like system used for building user interfaces to data bases and other applications, and CLIPS, a highly portable government-owned expert system shell.
Knowledge-based diagnosis for aerospace systems
NASA Technical Reports Server (NTRS)
Atkinson, David J.
1988-01-01
The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.
Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B
2013-01-01
The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point-of-care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method that allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE--a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model, and interacts with hospital information systems. Proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.
Jiang, Y; Dou, Y L; Cai, A J; Zhang, Z; Tian, T; Dai, J H; Huang, A L
2016-02-01
A knowledge-motivation-psychological model was set up and tested with a structural equation model to provide evidence for HIV prevention strategies among men who have sex with men (MSM). Snowball sampling was used to recruit a total of 550 MSM volunteers from two MSM non-governmental organizations in Urumqi, Xinjiang. HIV prevention related information on MSM was collected through a questionnaire survey, and 477 volunteers provided complete information. An HIV prevention related knowledge-motivation-psychological model was built on the basis of prior experience and the literature. Relations between knowledge, motivation and psychological effects were studied using a structural equation model fitted to the questionnaire data, with subsequent modification of the model. The structural equation model showed good fitting results: after revising, the fitting indices were RMSEA 0.035, NFI 0.965 and RFI 0.920. The exogenous latent variables were knowledge, motivation and psychological effects; the endogenous latent variable was prevention related behavior. The standardized total effects of motivation, knowledge and psychological effects on prevention behavior were 0.44, 0.41 and 0.17, respectively. The correlation coefficient between motivation and psychological effects was 0.16, and that between knowledge and psychological effects was -0.17 (P<0.05); the correlation between knowledge and motivation was not statistically significant. Knowledge of HIV and motivation for HIV prevention were thus not in accordance in this MSM population, indicating that it is necessary to increase awareness and to improve the motivation for HIV prevention among MSM.
EXPECT: Explicit Representations for Flexible Acquisition
NASA Technical Reports Server (NTRS)
Swartout, Bill; Gil, Yolanda
1995-01-01
To create more powerful knowledge acquisition systems, we not only need better acquisition tools, but we need to change the architecture of the knowledge-based systems we create so that their structure will provide better support for acquisition. Current acquisition tools permit users to modify factual knowledge, but they provide limited support for modifying problem-solving knowledge. In this paper, the authors argue that this limitation (and others) stems from the use of incomplete models of problem-solving knowledge and inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture, which addresses these problems by providing an explicit representation for problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself, EXPECT supports more flexible and powerful knowledge acquisition.
Disentangling the Role of Domain-Specific Knowledge in Student Modeling
NASA Astrophysics Data System (ADS)
Ruppert, John; Duncan, Ravit Golan; Chinn, Clark A.
2017-08-01
This study explores the role of domain-specific knowledge in students' modeling practice and how this knowledge interacts with two domain-general modeling strategies: use of evidence and developing a causal mechanism. We analyzed models made by middle school students who had a year of intensive model-based instruction. These models were made to explain a familiar but unstudied biological phenomenon: late onset muscle pain. Students were provided with three pieces of evidence related to this phenomenon and asked to construct a model to account for this evidence. Findings indicate that domain-specific resources play a significant role in the extent to which the models accounted for provided evidence. On the other hand, familiarity with the situation appeared to contribute to the mechanistic character of models. Our results indicate that modeling strategies alone are insufficient for the development of a mechanistic model that accounts for provided evidence and that, while learners can develop a tentative model with a basic familiarity of the situation, scaffolding certain domain-specific knowledge is necessary to assist students with incorporating evidence in modeling tasks.
Marco-Ruiz, Luis; Maldonado, J Alberto; Karlsen, Randi; Bellika, Johan G
2015-01-01
Clinical Decision Support Systems (CDSS) help to improve health care and reduce costs. However, the lack of knowledge management and modelling hampers their maintenance and reuse. Current EHR standards and terminologies can allow the semantic representation of the data and knowledge of CDSS, boosting their interoperability, reuse and maintenance. This paper presents the modelling process of respiratory conditions' symptoms and signs by a multidisciplinary team of clinicians and information architects with the help of openEHR, SNOMED and clinical information modelling tools for a CDSS. The information model of the CDSS was defined by means of an archetype, and the knowledge model was implemented by means of a SNOMED-CT-based ontology.
Choi, Se Y; Ahn, Seung H; Choi, Jae D; Kim, Jung H; Lee, Byoung-Il; Kim, Jeong-In
2016-01-01
Objective: The purpose of this study was to compare CT image quality for evaluating urolithiasis using filtered back projection (FBP), statistical iterative reconstruction (IR) and knowledge-based iterative model reconstruction (IMR) according to various scan parameters and radiation doses. Methods: A 5 × 5 × 5 mm3 uric acid stone was placed in a physical human phantom at the level of the pelvis. 3 tube voltages (120, 100 and 80 kV) and 4 current–time products (100, 70, 30 and 15 mAs) were implemented in 12 scans. Each scan was reconstructed with FBP, statistical IR (Levels 5–7) and knowledge-based IMR (soft-tissue Levels 1–3). The radiation dose, objective image quality and signal-to-noise ratio (SNR) were evaluated, and subjective assessments were performed. Results: The effective doses ranged from 0.095 to 2.621 mSv. Knowledge-based IMR showed better objective image noise and SNR than did FBP and statistical IR. The subjective image noise of FBP was worse than that of statistical IR and knowledge-based IMR. The subjective assessment scores deteriorated after a break point of 100 kV and 30 mAs. Conclusion: At the setting of 100 kV and 30 mAs, the radiation dose can be decreased by approximately 84% while keeping the subjective image assessment. Advances in knowledge: Patients with urolithiasis can be evaluated with ultralow-dose non-enhanced CT using a knowledge-based IMR algorithm at a substantially reduced radiation dose with the imaging quality preserved, thereby minimizing the risks of radiation exposure while providing clinically relevant diagnostic benefits for patients. PMID:26577542
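The SNR figures compared above are typically computed from regions of interest in the reconstructed image. A minimal sketch under one common convention (mean attenuation in the stone ROI divided by the noise in a uniform background ROI) is shown below on synthetic data; the study's exact measurement protocol may differ.

```python
import numpy as np

def roi_snr(image, roi_mask, background_mask):
    """SNR = mean signal inside the ROI / standard deviation (noise) in a uniform background ROI.
    One common convention; the exact protocol in the study may differ."""
    return float(image[roi_mask].mean() / image[background_mask].std())

# Synthetic 2D slice: uniform background with a bright "stone"
img = np.random.normal(40.0, 5.0, size=(64, 64))
img[28:36, 28:36] = 400.0
roi = np.zeros_like(img, dtype=bool); roi[28:36, 28:36] = True
bg = np.zeros_like(img, dtype=bool);  bg[:10, :10] = True
print(round(roi_snr(img, roi, bg), 1))
```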
From Data to Knowledge: GEOSS experience and the GEOSS Knowledge Base contribution to the GCI
NASA Astrophysics Data System (ADS)
Santoro, M.; Nativi, S.; Mazzetti, P., Sr.; Plag, H. P.
2016-12-01
According to systems theory, data is raw: it simply exists and has no significance beyond its existence, while information is data that has been given meaning by way of relational connection. The appropriate collection of information, such that it contributes to understanding, is a process of knowledge creation. The Global Earth Observation System of Systems (GEOSS) developed by the Group on Earth Observations (GEO) is a set of coordinated, independent Earth observation, information and processing systems that interact and provide access to diverse information for a broad range of users in both public and private sectors. GEOSS links these systems to strengthen the monitoring of the state of the Earth. In the past ten years, the development of GEOSS has taught several lessons dealing with the need to move from (open) data to information and knowledge sharing. Advanced user-focused services require moving from a data-driven framework to a knowledge-sharing platform. Such a platform needs to manage information and knowledge, in addition to datasets linked to them. For this scope, GEO has launched a specific task called "GEOSS Knowledge Base", which deals with resources like user requirements, Sustainable Development Goals (SDGs), observation and processing ontologies, publications, guidelines, best practices, business processes/algorithms, and definitions of advanced concepts like Essential Variables (EVs), indicators, and strategic goals. In turn, information and knowledge (e.g. guidelines, best practices, user requirements, business processes, algorithms, etc.) can be used to generate additional information and knowledge from shared datasets. To fully utilize and leverage the GEOSS Knowledge Base, the current GEOSS Common Infrastructure (GCI) model will be extended and advanced to consider important concepts and implementation artifacts, such as data processing services and environmental/economic models as well as EVs, Primary Indicators, and SDGs. The new GCI model will link these concepts to the present dataset, observation and sensor concepts, enabling a set of very important new capabilities to be offered to GEOSS users.
Knowledge diffusion of dynamical network in terms of interaction frequency.
Liu, Jian-Guo; Zhou, Qing; Guo, Qiang; Yang, Zhen-Hua; Xie, Fei; Han, Jing-Ti
2017-09-07
In this paper, we present a knowledge diffusion (SKD) model for dynamic networks that takes into account the interaction frequency, which is commonly used to measure social closeness. A set of agents, which are initially interconnected to form a random network, either exchange knowledge with their neighbors or move toward a new location through an edge-rewiring procedure. The activity of knowledge exchange between agents is determined by a knowledge transfer rule: with probability p, the target node preferentially selects one neighbor node to exchange knowledge with according to their interaction frequency rather than the knowledge distance; otherwise, with probability 1 - p, the target node preferentially builds a new link with a second-order neighbor or randomly selects one node in the system. The simulation results show that, compared with the null model defined by a random selection mechanism and the traditional knowledge diffusion (TKD) model driven by knowledge distance, knowledge spreads faster under the interaction-frequency-driven SKD model. In particular, the network structure under SKD evolves into an assortative one, which is a fundamental feature of social networks. This work is helpful for deeply understanding the coevolution of knowledge diffusion and network structure.
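A minimal agent-based sketch of the transfer rule described in this abstract: with probability p the target exchanges knowledge with the neighbour it interacts with most frequently, and otherwise rewires toward a second-order neighbour (or a random node). The network size, probability, and the "copy the larger knowledge value" update are illustrative assumptions, not the paper's exact specification.

```python
import random
import networkx as nx

def skd_step(G, knowledge, freq, p=0.7):
    """One update of a simplified SKD-style diffusion model (illustrative parameters and update rule)."""
    target = random.choice(list(G.nodes))
    neighbours = list(G.neighbors(target))
    if neighbours and random.random() < p:
        # Transfer: prefer the neighbour interacted with most often (interaction frequency).
        partner = max(neighbours, key=lambda n: freq.get((target, n), 0))
        knowledge[target] = max(knowledge[target], knowledge[partner])
        freq[(target, partner)] = freq.get((target, partner), 0) + 1
    else:
        # Rewire: preferentially link to a second-order neighbour, else a random node.
        second = {m for n in neighbours for m in G.neighbors(n)} - set(neighbours) - {target}
        new = random.choice(sorted(second)) if second else random.choice(list(G.nodes))
        if new != target and not G.has_edge(target, new):
            G.add_edge(target, new)

G = nx.erdos_renyi_graph(50, 0.08, seed=1)
knowledge = {n: random.random() for n in G.nodes}
freq = {}
for _ in range(2000):
    skd_step(G, knowledge, freq)
print(sum(knowledge.values()) / G.number_of_nodes())  # average knowledge stock after diffusion
```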
Knowledge Diffusion on Networks through the Game Strategy
NASA Astrophysics Data System (ADS)
Sun, Shu; Wu, Jiangning; Xuan, Zhaoguo
In this paper, we develop a knowledge diffusion model in which agents decide whether to give their knowledge to others according to exchange strategies. A typical network, namely a small-world network, is used for modeling: agents with knowledge are viewed as the nodes of the network, and the edges are viewed as the social relationships along which knowledge is transmitted. Agents are permitted to interact repeatedly with the neighbors to whom they are directly connected and accordingly change their strategies by choosing the most beneficial neighbors for diffusing knowledge. Two kinds of knowledge transmission strategies are proposed for the theoretical model based on game theory and are then used in different simulations to examine the effect of the network structure on knowledge diffusion. The analyses yield two main observations: first, the simulation results run contrary to the intuition that agents would prefer only to accept and not to share knowledge in order to maximize their benefit; second, the number of agents that acquire knowledge and the corresponding knowledge stock turn out to be independent of the percentage of agents who choose to contribute their knowledge.
ERIC Educational Resources Information Center
Fazio, Xavier; Gallagher, Tiffany L.
2018-01-01
We offer insights for using design-based research (DBR) as a model for constructing professional development that supports curriculum and instructional knowledge regarding science and literacy integration. We spotlight experiences in the DBR process from data collected from a sample of four elementary teachers. Findings from interviews, focus…
ERIC Educational Resources Information Center
Weizman, Ayelet; Covitt, Beth A.; Koehler, Matthew J.; Lundeberg, Mary A.; Oslund, Joy A.; Low, Mark R.; Eberhardt, Janet; Urban-Lurain, Mark
2008-01-01
In this study we measured changes in science teachers' conceptual science understanding (content knowledge) and pedagogical content knowledge (PCK) while participating in a problem-based learning (PBL) model of professional development. Teachers participated in a two-week long workshop followed by nine monthly meetings during one academic year…
Experienced Teachers' Pedagogical Content Knowledge of Teaching Acid-Base Chemistry
ERIC Educational Resources Information Center
Drechsler, Michal; Van Driel, Jan
2008-01-01
We investigated the pedagogical content knowledge (PCK) of nine experienced chemistry teachers. The teachers took part in a teacher training course on students' difficulties and the use of models in teaching acid-base chemistry, electrochemistry, and redox reactions. Two years after the course, the teachers were interviewed about their PCK of (1)…
In silico model-based inference: a contemporary approach for hypothesis testing in network biology
Klinke, David J.
2014-01-01
Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900’s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
A knowledge based software engineering environment testbed
NASA Technical Reports Server (NTRS)
Gill, C.; Reedy, A.; Baker, L.
1985-01-01
The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.
Developing an ontological explosion knowledge base for business continuity planning purposes.
Mohammadfam, Iraj; Kalatpour, Omid; Golmohammadi, Rostam; Khotanlou, Hasan
2013-01-01
Industrial accidents are among the best-known challenges to business continuity. Many organisations have lost their reputation following devastating accidents. To manage the risks of such accidents, it is necessary to accumulate sufficient knowledge regarding their roots, causes and preventive techniques. The required knowledge might be obtained through various approaches, including databases. Unfortunately, many databases are hampered by (among other things) static data presentations, a lack of semantic features, and the inability to present accident knowledge as discrete domains. This paper proposes the use of Protégé software to develop a knowledge base for the domain of explosion accidents. Such a structure has a higher capability to improve information retrieval compared with common accident databases. To accomplish this goal, a knowledge management process model was followed. The ontological explosion knowledge base (EKB) was built for further applications, including process accident knowledge retrieval and risk management. The paper will show how the EKB has a semantic feature that enables users to overcome some of the search constraints of existing accident databases.
Data to knowledge: how to get meaning from your result.
Berman, Helen M; Gabanyi, Margaret J; Groom, Colin R; Johnson, John E; Murshudov, Garib N; Nicholls, Robert A; Reddy, Vijay; Schwede, Torsten; Zimmerman, Matthew D; Westbrook, John; Minor, Wladek
2015-01-01
Structural and functional studies require the development of sophisticated 'Big Data' technologies and software to increase the knowledge derived and ensure reproducibility of the data. This paper presents summaries of the Structural Biology Knowledge Base, the VIPERdb Virus Structure Database, evaluation of homology modeling by the Protein Model Portal, the ProSMART tool for conformation-independent structure comparison, the LabDB 'super' laboratory information management system and the Cambridge Structural Database. These techniques and technologies represent important tools for the transformation of crystallographic data into knowledge and information, in an effort to address the problem of non-reproducibility of experimental results.
Hasegawa, Takanori; Yamaguchi, Rui; Nagasaki, Masao; Miyano, Satoru; Imoto, Seiya
2014-01-01
Comprehensive understanding of gene regulatory networks (GRNs) is a major challenge in the field of systems biology. Currently, there are two main approaches in GRN analysis using time-course observation data, namely an ordinary differential equation (ODE)-based approach and a statistical model-based approach. The ODE-based approach can generate complex dynamics of GRNs according to biologically validated nonlinear models. However, it cannot be applied to ten or more genes to simultaneously estimate system dynamics and regulatory relationships due to the computational difficulties. The statistical model-based approach uses highly abstract models to simply describe biological systems and to infer relationships among several hundreds of genes from the data. However, the high abstraction generates false regulations that are not permitted biologically. Thus, when dealing with several tens of genes of which the relationships are partially known, a method that can infer regulatory relationships based on a model with low abstraction and that can emulate the dynamics of ODE-based models while incorporating prior knowledge is urgently required. To accomplish this, we propose a method for inference of GRNs using a state space representation of a vector auto-regressive (VAR) model with L1 regularization. This method can estimate the dynamic behavior of genes based on linear time-series modeling constructed from an ODE-based model and can infer the regulatory structure among several tens of genes maximizing prediction ability for the observational data. Furthermore, the method is capable of incorporating various types of existing biological knowledge, e.g., drug kinetics and literature-recorded pathways. The effectiveness of the proposed method is shown through a comparison of simulation studies with several previous methods. For an application example, we evaluated mRNA expression profiles over time upon corticosteroid stimulation in rats, thus incorporating corticosteroid kinetics/dynamics, literature-recorded pathways and transcription factor (TF) information. PMID:25162401
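The estimation idea above can be illustrated by fitting a first-order vector auto-regression to the time-course data with an L1 penalty, so that only a sparse set of regulator-to-target coefficients survives. The sketch below is a simplification of the paper's state-space formulation; the synthetic data and penalty strength are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic time-course expression data: T time points x G genes, driven by a sparse "true" network.
rng = np.random.default_rng(0)
T, G = 50, 8
A_true = np.zeros((G, G))
A_true[1, 0], A_true[2, 1] = 0.8, -0.6          # gene 0 -> gene 1, gene 1 -| gene 2
X = np.zeros((T, G))
X[0] = rng.normal(size=G)
for t in range(1, T):
    X[t] = X[t - 1] @ A_true.T + 0.1 * rng.normal(size=G)

# VAR(1) with L1 regularization: regress x_t on x_{t-1}, one Lasso fit per target gene.
A_hat = np.zeros((G, G))
for g in range(G):
    A_hat[g] = Lasso(alpha=0.01, max_iter=10000).fit(X[:-1], X[1:, g]).coef_
print(np.round(A_hat, 2))  # nonzero entries are candidate regulator -> target edges
```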
Rule-based mechanisms of learning for intelligent adaptive flight control
NASA Technical Reports Server (NTRS)
Handelman, David A.; Stengel, Robert F.
1990-01-01
How certain aspects of human learning can be used to characterize learning in intelligent adaptive control systems is investigated. Reflexive and declarative memory and learning are described. It is shown that model-based systems-theoretic adaptive control methods exhibit attributes of reflexive learning, whereas the problem-solving capabilities of knowledge-based systems of artificial intelligence are naturally suited for implementing declarative learning. Issues related to learning in knowledge-based control systems are addressed, with particular attention given to rule-based systems. A mechanism for real-time rule-based knowledge acquisition is suggested, and utilization of this mechanism within the context of failure diagnosis for fault-tolerant flight control is demonstrated.
NASA Astrophysics Data System (ADS)
Patwari, Puneet; Choudhury, Subhrojyoti R.; Banerjee, Amar; Swaminathan, N.; Pandey, Shreya
2016-07-01
Model Driven Engineering (MDE) as a key driver to reduce development cost of M&C systems is beginning to find acceptance across scientific instruments such as Radio Telescopes and Nuclear Reactors. Such projects are adopting it to reduce time to integrate, test and simulate their individual controllers and increase reusability and traceability in the process. The creation and maintenance of models is still a significant challenge to realizing MDE benefits. Creating domain-specific modelling environments reduces the barriers, and we have been working along these lines, creating a domain-specific language and environment based on an M&C knowledge model. However, large projects involve several such domains, and there is still a need to interconnect the domain models, in order to ensure modelling completeness. This paper presents a knowledge-centric approach to doing that, by creating a generic system model that underlies the individual domain knowledge models. We present our vision for M&C Domain Map Maker, a set of processes and tools that enables explication of domain knowledge in terms of domain models with mutual consistency relationships to aid MDE.
Basic self-knowledge and transparency.
Borgoni, Cristina
2018-01-01
Cogito-like judgments, a term coined by Burge (1988), comprise thoughts such as 'I am now thinking', 'I [hereby] judge that Los Angeles is at the same latitude as North Africa', or 'I [hereby] intend to go to the opera tonight'. It is widely accepted that we form cogito-like judgments in an authoritative and not merely empirical manner. We have privileged self-knowledge of the mental state that is self-ascribed in a cogito-like judgment. Thus, models of self-knowledge that aim to explain privileged self-knowledge should have the resources to explain the special self-knowledge involved in cogito judgments. My objective in this paper is to examine whether a transparency model of self-knowledge (i.e., models based on Evans' 1982 remarks) can provide such an explanation: granted that cogito judgments are paradigmatic cases of privileged self-knowledge, does the transparency procedure explain why this is so? The paper advances a negative answer, arguing that the transparency procedure cannot generate the type of thought constitutive of cogito judgments.
Hively, Lee M [Philadelphia, TN
2011-07-12
The invention relates to a method and apparatus for simultaneously processing different sources of test data into informational data and then processing different categories of informational data into knowledge-based data. The knowledge-based data can then be communicated between nodes in a system of multiple computers according to rules for a type of complex, hierarchical computer system modeled on a human brain.
Situation Model for Situation-Aware Assistance of Dementia Patients in Outdoor Mobility
Yordanova, Kristina; Koldrack, Philipp; Heine, Christina; Henkel, Ron; Martin, Mike; Teipel, Stefan; Kirste, Thomas
2017-01-01
Background: Dementia impairs spatial orientation and route planning, thus often affecting the patient's ability to move outdoors and maintain social activities. Situation-aware deliberative assistive technology devices (ATD) can substitute impaired cognitive function in order to maintain one's level of social activity. To build such a system, one needs domain knowledge about the patient's situation and needs. We call this collection of knowledge a situation model. Objective: To construct a situation model for the outdoor mobility of people with dementia (PwD). The model serves two purposes: 1) as a knowledge base from which to build an ATD describing the mobility of PwD; and 2) as a codebook for the annotation of the recorded behavior. Methods: We perform systematic knowledge elicitation to obtain the relevant knowledge. The OBO Edit tool is used for implementing and validating the situation model. The model is evaluated by using it as a codebook for annotating the behavior of PwD during a mobility study, and interrater agreement is computed. In addition, clinical experts perform manual evaluation and curation of the model. Results: The situation model consists of 101 concepts with 11 relation types between them. The results from the annotation showed substantial agreement between the two annotators (Cohen's kappa of 0.61). Conclusion: The situation model is a first attempt to systematically collect and organize information related to the outdoor mobility of PwD for the purposes of situation-aware assistance. The model is the base for building an ATD able to provide situation-aware assistance and to potentially improve the quality of life of PwD. PMID:29060937
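The interrater agreement reported above (Cohen's kappa of 0.61) is a standard computation over paired annotation labels; a minimal sketch with invented labels of the kind a mobility codebook might contain:

```python
from sklearn.metrics import cohen_kappa_score

# Two annotators labelling the same behaviour segments with situation-model concepts
# (segment labels are invented for illustration).
annotator_a = ["walk", "wait", "cross_street", "walk", "orientate", "walk"]
annotator_b = ["walk", "wait", "cross_street", "orientate", "orientate", "walk"]
print(round(cohen_kappa_score(annotator_a, annotator_b), 2))
```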
Gradient-based reliability maps for ACM-based segmentation of hippocampus.
Zarpalas, Dimitrios; Gkontra, Polyxeni; Daras, Petros; Maglaveras, Nicos
2014-04-01
Automatic segmentation of deep brain structures, such as the hippocampus (HC), in MR images has attracted considerable scientific attention due to the widespread use of MRI and to the principal role of some structures in various mental disorders. In this literature, there exists a substantial amount of work relying on deformable models incorporating prior knowledge about structures' anatomy and shape information. However, shape priors capture global shape characteristics and thus fail to model boundaries of varying properties; HC boundaries present rich, poor, and missing gradient regions. On top of that, shape prior knowledge is blended with image information in the evolution process, through global weighting of the two terms, again neglecting the spatially varying boundary properties, causing segmentation faults. An innovative method is hereby presented that aims to achieve highly accurate HC segmentation in MR images, based on the modeling of boundary properties at each anatomical location and the inclusion of appropriate image information for each of those, within an active contour model framework. Hence, blending of image information and prior knowledge is based on a local weighting map, which mixes gradient information, regional and whole brain statistical information with a multi-atlas-based spatial distribution map of the structure's labels. Experimental results on three different datasets demonstrate the efficacy and accuracy of the proposed method.
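The central idea above, blending image information with the prior through a spatially varying weight rather than a single global weight, can be sketched as follows. The mixing rule is deliberately simplified (gradient strength only); the paper's actual weighting map also combines regional and whole-brain statistics with the multi-atlas label distribution.

```python
import numpy as np

def local_weight_map(gradient_mag, eps=1e-6):
    """Per-voxel weight in [0, 1]: high where image gradients are strong (rich boundary),
    low where the boundary is weak or missing so the prior term takes over.
    Simplified illustration, not the exact formulation in the paper."""
    return gradient_mag / (gradient_mag.max() + eps)

def blended_force(image_force, prior_force, w):
    # Spatially varying blend of the image-driven and prior-driven contour forces.
    return w * image_force + (1.0 - w) * prior_force

grad = np.random.rand(32, 32)
w = local_weight_map(grad)
force = blended_force(np.random.rand(32, 32), np.random.rand(32, 32) - 0.5, w)
print(force.shape)  # (32, 32)
```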
A Cognitive Architecture for Human Performance Process Model Research
1992-11-01
individually defined, updatable world representation which is a description of the world as the operator knows it. It contains rules for decisions, an...operate it), and rules of engagement (knowledge about the operator’s expected behavior). The HPP model works in the following way. Information enters...based models depict the problem-solving processes of experts. The experts’ knowledge is represented in symbol structures, along with rules for
Khajouei, Hamid; Khajouei, Reza
2017-12-01
Appropriate knowledge, correct information, and relevant data are vital in medical diagnosis and treatment systems. Knowledge Management (KM) through its tools/techniques provides a pertinent framework for decision-making in healthcare systems. The objective of this study was to identify and prioritize the KM tools/techniques that apply to the hospital setting. This is a descriptive-survey study. Data were collected using a researcher-made questionnaire that was developed based on experts' opinions to select the appropriate tools/techniques from the 26 tools/techniques of the Asian Productivity Organization (APO) model. Questions were categorized into the five steps of KM (identifying, creating, storing, sharing, and applying the knowledge) according to this model. The study population consisted of middle and senior managers of hospitals and managing directors of the Vice-Chancellor for Curative Affairs at Kerman University of Medical Sciences in Kerman, Iran. The data were analyzed in SPSS v.19 using a one-sample t-test. Twelve out of 26 tools/techniques of the APO model were identified as tools applicable in hospitals. "Knowledge café" and "APO knowledge management assessment tool", with respective means of 4.23 and 3.7, were the most and the least applicable tools in the knowledge identification step. "Mentor-mentee scheme" and "voice and Voice over Internet Protocol (VOIP)", with respective means of 4.20 and 3.52, were the most and the least applicable tools/techniques in the knowledge creation step. "Knowledge café" and "voice and VOIP", with respective means of 3.85 and 3.42, were the most and the least applicable tools/techniques in the knowledge storage step. "Peer assist" and "voice and VOIP", with respective means of 4.14 and 3.38, were the most and the least applicable tools/techniques in the knowledge sharing step. Finally, "knowledge worker competency plan" and "knowledge portal", with respective means of 4.38 and 3.85, were the most and the least applicable tools/techniques in the knowledge application step. The results showed that 12 out of 26 tools in the APO model are appropriate for hospitals, of which 11 are significantly applicable and "storytelling" is marginally applicable. In this study, the preferred tools/techniques for implementation of each of the five KM steps in hospitals are introduced. Copyright © 2017 Elsevier B.V. All rights reserved.
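The prioritisation above rests on one-sample t-tests of each tool's mean applicability rating; a minimal sketch of that test follows, with invented 5-point ratings and the scale midpoint (3) as the test value, since the abstract does not state the exact comparison value.

```python
from scipy import stats

# Hypothetical 5-point applicability ratings for one KM tool from hospital managers.
knowledge_cafe_scores = [5, 4, 4, 5, 4, 3, 5, 4, 4, 5]

# One-sample t-test against the scale midpoint (an assumed test value of 3 = "neutral").
t_stat, p_value = stats.ttest_1samp(knowledge_cafe_scores, popmean=3)
print(round(t_stat, 2), round(p_value, 4))
```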
Personalising e-learning modules: targeting Rasmussen levels using XML.
Renard, J M; Leroy, S; Camus, H; Picavet, M; Beuscart, R
2003-01-01
The development of Internet technologies has made it possible to increase the number and the diversity of on-line resources for teachers and students. Initiatives like the French-speaking Virtual Medical University Project (UMVF) try to organise access to these resources. But both teachers and students are working on a partly redundant subset of knowledge. From the analysis of some French courses we propose a model for knowledge organisation derived from Rasmussen's stepladder. In the context of decision-making, Rasmussen identified skill-based, rule-based and knowledge-based levels of the mental process. In the medical context of problem-solving, we apply these three levels to the definition of three student levels: beginners, intermediate-level learners, and experts. Based on our model, we build a representation of the hierarchical structure of the data using the XML language. We use the XSLT transformation language to filter relevant data according to student level and to propose an appropriate display on the student's terminal. The model and the XML implementation we define help to design tools for building personalised e-learning modules.
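A minimal sketch of the level-targeted filtering described above, selecting only the course units tagged with a given Rasmussen level. The element and attribute names are hypothetical, and the original work performs this transformation with XSLT rather than Python.

```python
import xml.etree.ElementTree as ET

# Hypothetical course fragment: each unit is tagged with the Rasmussen level it targets.
COURSE = """
<course title="Acid-base disorders">
  <unit level="skill">Recognise a normal arterial blood gas pattern.</unit>
  <unit level="rule">Apply compensation rules to classify the disorder.</unit>
  <unit level="knowledge">Reason about mixed disorders from first principles.</unit>
</course>
"""

def units_for_level(xml_text, rasmussen_level):
    """Return the text of units matching the requested level (stand-in for the XSLT filter)."""
    root = ET.fromstring(xml_text)
    return [u.text for u in root.iter("unit") if u.get("level") == rasmussen_level]

print(units_for_level(COURSE, "rule"))  # units targeting rule-based reasoning
```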
Extending TOPS: Ontology-driven Anomaly Detection and Analysis System
NASA Astrophysics Data System (ADS)
Votava, P.; Nemani, R. R.; Michaelis, A.
2010-12-01
The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies, which are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early in the processing, either by failing to verify them from other sources or by matching them directly with other observable events, without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended with specific instances of datasets and models. The information is stored in a Sesame server and is accessible through both a Java API and web services using the SeRQL and SPARQL query languages. Inference is provided by the OWLIM component integrated with Sesame.
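To illustrate how such a knowledge base might be queried over the standard SPARQL protocol, the sketch below posts a query to a Sesame-style repository endpoint. The endpoint URL and the ontology terms (tops:Anomaly, tops:observedIn, tops:compatibleWith) are hypothetical placeholders, not the project's actual schema.

# Post a SPARQL query to a (placeholder) Sesame repository endpoint and parse JSON results.
import json
import urllib.parse
import urllib.request

ENDPOINT = "http://example.org/openrdf-sesame/repositories/tops"  # placeholder URL

QUERY = """
PREFIX tops: <http://example.org/tops#>
SELECT ?dataset ?other WHERE {
  ?anomaly a tops:Anomaly ;
           tops:observedIn ?dataset .
  ?dataset tops:compatibleWith ?other .
}
"""

def run_query(endpoint, query):
    """Send the query via SPARQL-over-HTTP and return the parsed JSON bindings."""
    data = urllib.parse.urlencode({"query": query}).encode()
    req = urllib.request.Request(endpoint, data=data,
                                 headers={"Accept": "application/sparql-results+json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]

# run_query(ENDPOINT, QUERY)  # requires a live endpoint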
Assessing the Dynamic Behavior of Online Q&A Knowledge Markets: A System Dynamics Approach
ERIC Educational Resources Information Center
Jafari, Mostafa; Hesamamiri, Roozbeh; Sadjadi, Jafar; Bourouni, Atieh
2012-01-01
Purpose: The objective of this paper is to propose a holistic dynamic model for understanding the behavior of a complex and internet-based kind of knowledge market by considering both social and economic interactions. Design/methodology/approach: A system dynamics (SD) model is formulated in this study to investigate the dynamic characteristics of…
ERIC Educational Resources Information Center
Lintean, Mihai; Rus, Vasile; Azevedo, Roger
2012-01-01
This article describes the problem of detecting students' mental models, i.e., their knowledge states, during the self-regulatory activity of prior knowledge activation in MetaTutor, an intelligent tutoring system that teaches students self-regulation skills while learning complex science topics. The article presents several approaches to…
The role of familiarity in binary choice inferences.
Honda, Hidehito; Abe, Keiga; Matsuka, Toshihiko; Yamagishi, Kimihiko
2011-07-01
In research on the recognition heuristic (Goldstein & Gigerenzer, Psychological Review, 109, 75-90, 2002), objects have been categorized as "recognized" or "unrecognized" without regard to the degree of familiarity of the recognized object. In the present article, we propose a new inference model: familiarity-based inference. We hypothesize that when the subjective knowledge levels (familiarity) of recognized objects differ, the degree of familiarity of the recognized objects will influence inferences. Specifically, people are predicted to infer that the more familiar object in a pair has a higher criterion value on the to-be-judged dimension. In two experiments, using a binary choice task, we examined inferences about the populations of pairs of cities. Results support the predictions of familiarity-based inference. Participants inferred that the more familiar city in a pair was more populous. Statistical modeling showed that individual differences in familiarity-based inference lie in the sensitivity to differences in familiarity. In addition, we found that familiarity-based inference can generally be regarded as an ecologically rational inference. Furthermore, when cue knowledge about the inference criterion was available, participants made inferences based on the cue knowledge about population instead of familiarity. Implications of the role of familiarity in psychological processes are discussed.
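A minimal sketch of the familiarity-based choice rule described above; the cities and familiarity scores are illustrative assumptions, not the experimental stimuli.

# Infer that the more familiar of two recognized objects has the higher criterion value.
familiarity = {"Tokyo": 0.95, "Sendai": 0.40}  # hypothetical subjective familiarity ratings

def infer_more_populous(city_a, city_b):
    """Familiarity-based inference: pick the city with the higher familiarity score."""
    return city_a if familiarity[city_a] >= familiarity[city_b] else city_b

print(infer_more_populous("Tokyo", "Sendai"))  # -> Tokyo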
The DAB model of drawing processes
NASA Technical Reports Server (NTRS)
Hochhaus, Larry W.
1989-01-01
The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward the solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.
NASA Astrophysics Data System (ADS)
Santucci, F.; Santini, P. M.
2016-10-01
We study the generalization of the dispersionless Kadomtsev-Petviashvili (dKP) equation in n+1 dimensions and with nonlinearity of degree m+1, a model equation describing the propagation of weakly nonlinear, quasi one-dimensional waves in the absence of dispersion and dissipation, and arising in several physical contexts, like acoustics, plasma physics, hydrodynamics and nonlinear optics. In 2 + 1 dimensions and with quadratic nonlinearity, this equation is integrable through a novel inverse scattering transform, and it has been recently shown to be a prototype model equation in the description of the two-dimensional wave breaking of localized initial data. In higher dimensions and with higher nonlinearity, the generalized dKP equations are not integrable, but their invariance under motions on the paraboloid allows one to construct in this paper a family of exact solutions describing waves constant on their paraboloidal wave front and breaking simultaneously in all points of it, developing after breaking either multivaluedness or single-valued discontinuous profiles (shocks). Then such exact solutions are used to build the longtime behavior of the solutions of the Cauchy problem, for small and localized initial data, showing that wave breaking of small initial data takes place in the longtime regime if and only if m(n-1) ≤ 2. Lastly, the analytic aspects of such wave breaking are investigated in detail in terms of the small initial data, in both cases in which the solution becomes multivalued after breaking or it develops a shock. These results, contained in the 2012 master's thesis of one of the authors (FS) [1], generalize those obtained in [2] for the dKP equation in n+1 dimensions with quadratic nonlinearity, and are obtained following the same strategy.
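For reference, a common way to write this family of equations (the abstract does not fix the normalization, so the exact form below is an assumption) is

\[
\bigl(u_t + u^m u_x\bigr)_x = \Delta_{\perp} u, \qquad \Delta_{\perp} = \sum_{i=1}^{n-1} \partial_{y_i}^{2},
\]

with wave breaking of small localized data in the longtime regime occurring if and only if \( m(n-1) \le 2 \); the case \( m = 1 \), \( n = 2 \) recovers the integrable dKP equation.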
Experimental Study of the Richtmyer-Meshkov Instability of Incompressible Fluids
NASA Technical Reports Server (NTRS)
Niederhaus, Charles; Jacobs, Jeffrey W.
2002-01-01
The Richtmyer-Meshkov instability of a low Atwood number, miscible, two-liquid system is investigated experimentally. The initially stratified fluids are contained within a rectangular tank mounted to a sled that rides on a vertical set of rails. The instability is generated by dropping the sled onto a coil spring, producing a nearly impulsive upward acceleration. The subsequent freefall that occurs as the container travels upward and then downward on the rails allows the instability to evolve in the absence of gravity. The interface separating the two liquids initially has a well-defined, sinusoidal perturbation that quickly inverts and then grows in amplitude after undergoing the impulsive acceleration. Disturbance amplitudes are measured and compared to theoretical predictions. Linear stability theory gives excellent agreement with the measured initial growth rate, a₀, for single-mode perturbations with the predicted amplitudes differing by less than 10% from experimental measurements up to a nondimensional time ka₀t = 0.7, where k is the wavenumber. Linear stability theory also provides excellent agreement for the individual mode amplitudes of multi-mode initial perturbations up until the interface becomes multi-valued. Comparison with previously published weakly nonlinear single-mode models shows good agreement up to ka₀t = 3, while published nonlinear single-mode models provide good agreement up to ka₀t = 30. The effects of Reynolds number on the vortex core evolution and overall growth rate of the interface are also investigated. Measurements of the overall amplitude are found to be unaffected by the Reynolds number for the range of values studied here. However, experiments carried out at lower values of Reynolds numbers were found to have decreased vortex core rotation rates. In addition, an instability in the vortex cores is observed.
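The linear-theory growth rate against which these measurements are compared is not spelled out in the abstract; the standard impulsive-model (Richtmyer) estimate, written here with an assumed symbol \( \eta_0 \) for the pre-impulse perturbation amplitude, is

\[
a_0 = k\,\eta_0\,A\,\Delta V,
\]

where \( k \) is the wavenumber, \( A \) the Atwood number, and \( \Delta V \) the velocity change imparted by the impulsive acceleration; \( k a_0 t \) is then the nondimensional time used above.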
dc properties of series-parallel arrays of Josephson junctions in an external magnetic field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewandowski, S.J.
1991-04-01
A detailed dc theory of superconducting multijunction interferometers has previously been developed by several authors for the case of parallel junction arrays. The theory is now extended to cover the case of a loop containing several junctions connected in series. The problem is closely associated with high-T_c superconductors and their clusters of intrinsic Josephson junctions. These materials exhibit spontaneous interferometric effects, and there is no reason to assume that the intrinsic junctions form only parallel arrays. A simple formalism of phase states is developed in order to express the superconducting phase differences across the junctions forming a series array as functions of the phase difference across the weakest junction of the system, and to relate the differences in critical currents of the junctions to gaps in the allowed ranges of their phase functions. This formalism is used to investigate the energy states of the array, which in the case of different junctions are split and separated by energy barriers of height depending on the phase gaps. Modifications of the washboard model of a single junction are shown. Next a superconducting inductive loop containing a series array of two junctions is considered, and this model is used to demonstrate the transitions between phase states and the associated instabilities. Finally, the critical current of a parallel connection of two series arrays is analyzed and shown to be a multivalued function of the externally applied magnetic flux. The instabilities caused by the presence of intrinsic serial junctions in granular high-T_c materials are pointed out as a potential source of additional noise.
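For orientation (an assumption added for context, not taken from the abstract): in the textbook case of two identical single junctions in parallel with negligible loop inductance, the critical current is the single-valued function

\[
I_c(\Phi_e) = 2 I_0 \left| \cos\!\left( \frac{\pi \Phi_e}{\Phi_0} \right) \right|,
\]

where \( I_0 \) is the critical current of one junction, \( \Phi_e \) the applied flux, and \( \Phi_0 \) the flux quantum; the analysis above shows that replacing each branch by a series array makes \( I_c(\Phi_e) \) multivalued in the applied flux.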
NASA Astrophysics Data System (ADS)
van Aalst, Jan; Sioux Truong, Mya
2011-03-01
The phrase 'knowledge creation' refers to the practices by which a community advances its collective knowledge. Experience with a model of knowledge creation could help students to learn about the nature of science. This research examined how much progress a teacher and 16 Primary Five (Grade 4) students in the International Baccalaureate Primary Years Programme could make towards the discourse needed for Bereiter and Scardamalia's model of knowledge creation. The study consisted of two phases: a five-month period focusing on the development of the classroom ethos and skills needed for this model (Phase 1), followed by a two-month inquiry into life cycles (Phase 2). In Phase 1, we examined the classroom practices that are thought to support knowledge creation and the early experiences of the students with a web-based inquiry environment, Knowledge Forum®. In Phase 2, we conducted a summative evaluation of the students' work in Knowledge Forum in the light of the model. The data sources included classroom video recordings, artefacts of the in-class work, the Knowledge Forum database, a science content test, questionnaires, and interviews. The findings indicate that the students made substantial progress towards the knowledge creation discourse, particularly regarding the social structure of this kind of discourse and, to a lesser extent, its idea-centred nature. They also made acceptable advances in scientific knowledge and appeared to enjoy this way of learning. The study provides one of the first accounts in the literature of how a teacher new to the knowledge creation model enacted it in an Asian primary classroom.
ERIC Educational Resources Information Center
Moutinho, Sara; Moura, Rui; Vasconcelos, Clara
2017-01-01
Model-based learning is a methodology that facilitates students' construction of scientific knowledge, which sometimes includes restructuring their mental models. Taking students' learning process into consideration, its aim is to promote a deeper understanding of the dynamics of phenomena through the manipulation of models. Our aim was to ascertain…
Thinking Outside the Box: Agile Business Models for CNOs
NASA Astrophysics Data System (ADS)
Loss, Leandro; Crave, Servane
This paper introduces the idea of an agile Business Model for CNOs, grounded in a new model of innovation based on the effects of globalization and of the Knowledge Economy. The agile Business Model considers the resources that are spread out and available worldwide, as well as the need for each customer to receive a unique customer experience. It aims at reinforcing, in the context of the Knowledge Economy, the different business model approaches developed so far. The paper also identifies the levers and the barriers of Agile Business Model Innovation in CNOs.
Knowledge Discovery from Posts in Online Health Communities Using Unified Medical Language System.
Chen, Donghua; Zhang, Runtong; Liu, Kecheng; Hou, Lei
2018-06-19
Patient-reported posts in Online Health Communities (OHCs) contain a wealth of valuable information that can help establish knowledge-based online support for patients. However, utilizing these posts to improve online patient services in the absence of appropriate medical and healthcare expert knowledge is difficult. Thus, we propose a comprehensive knowledge discovery method based on the Unified Medical Language System for the analysis of narrative posts in OHCs. First, we propose a domain-knowledge support framework for OHCs to provide a basis for post analysis. Second, we develop a Knowledge-Involved Topic Modeling (KI-TM) method to extract and expand explicit knowledge within the text. We propose four metrics, namely explicit knowledge rate, latent knowledge rate, knowledge correlation rate, and perplexity, for the evaluation of the KI-TM method. Our experimental results indicate that the proposed method outperforms existing methods in terms of knowledge support. Our method enhances knowledge support for online patients and can help develop intelligent OHCs in the future.
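Of the four metrics, perplexity is the standard held-out measure; a minimal sketch of its computation is shown below, with illustrative per-token log-likelihoods standing in for the values a fitted topic model would produce.

# Held-out perplexity: exp(-average per-token log-likelihood); lower is better.
import math

def perplexity(token_log_likelihoods):
    n = len(token_log_likelihoods)
    return math.exp(-sum(token_log_likelihoods) / n)

print(perplexity([-6.2, -5.8, -7.1, -6.5]))  # illustrative values only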
Modeling Research Project Risks with Fuzzy Maps
ERIC Educational Resources Information Center
Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana
2009-01-01
The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…
Building a Knowledge to Action Program in Stroke Rehabilitation.
Janzen, Shannon; McIntyre, Amanda; Richardson, Marina; Britt, Eileen; Teasell, Robert
2016-09-01
The knowledge to action (KTA) process proposed by Graham et al. (2006) is a framework to facilitate the development and application of research evidence in clinical practice. The KTA process consists of the knowledge creation cycle and the action cycle. The Evidence Based Review of Stroke Rehabilitation is a foundational part of the knowledge creation cycle and has helped guide the development of best practice recommendations in stroke. The Rehabilitation Knowledge to Action Project is an audit-feedback process for the clinical implementation of best practice guidelines, which follows the action cycle. The objective of this review was to: (1) contextualize the Evidence Based Review of Stroke Rehabilitation and Rehabilitation Knowledge to Action Project within the KTA model and (2) show how this process led to improved evidence-based practice in stroke rehabilitation. Through this process, a single centre was able to change clinical practice and promote a culture that supports the use of evidence-based practices in stroke rehabilitation.
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
2016-01-01
Observations of individual organisms (data) can be combined with expert ecological knowledge of species, especially causal knowledge, to model and extract from flower-visiting data useful information about behavioral interactions between insect and plant organisms, such as nectar foraging and pollen transfer. We describe and evaluate a method to elicit and represent such expert causal knowledge of behavioral ecology, and discuss the potential for wider application of this method to the design of knowledge-based systems for knowledge discovery in biodiversity and ecosystem informatics.
Conceptual information processing: A robust approach to KBS-DBMS integration
NASA Technical Reports Server (NTRS)
Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond
1987-01-01
Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.
Knowledge transmission model with differing initial transmission and retransmission process
NASA Astrophysics Data System (ADS)
Wang, Haiying; Wang, Jun; Small, Michael
2018-10-01
Knowledge transmission is a cyclic, dynamic diffusion process. The rate at which knowledge is accepted differs depending on whether or not the recipient has previously held the knowledge. In this paper, the knowledge transmission process is divided into an initial and a retransmission procedure, each with its own transmission and self-learning parameters. Based on an epidemic spreading model, we propose a naive-evangelical-agnostic (VEA) knowledge transmission model and derive mean-field equations to describe the dynamics of knowledge transmission in homogeneous networks. Theoretical analysis identifies a criterion for the persistence of knowledge: the reproduction number R0 depends on the smaller of the effective parameters of the initial and retransmission processes. Moreover, the final size of the evangelical population is related only to the retransmission process parameters. Numerical simulations validate the theoretical analysis. Furthermore, the simulations indicate that increasing the initial-transmission parameters, including the first-transmission and self-learning rates of naive individuals, can efficiently accelerate the velocity of knowledge transmission but has no effect on the final size of the evangelical population. In contrast, the retransmission parameters, including the retransmission and self-learning rates of agnostic individuals, have a significant effect on the rate of knowledge transmission: the larger these parameters, the greater the final density of evangelical individuals.
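As a rough illustration only (the abstract does not give the paper's mean-field equations), the sketch below integrates a generic naive (V) / evangelical (E) / agnostic (A) system with separate initial-transmission and retransmission parameters; the structure and all rate symbols are assumptions, not the published model.

# Generic VEA-style mean-field sketch, integrated with a simple Euler scheme.
def simulate(beta1=0.4, alpha1=0.05,   # initial transmission / self-learning (naive -> evangelical)
             beta2=0.2, alpha2=0.02,   # retransmission / self-learning (agnostic -> evangelical)
             gamma=0.1,                # forgetting (evangelical -> agnostic)
             v=0.99, e=0.01, a=0.0, dt=0.01, steps=20000):
    for _ in range(steps):
        first_exposure = beta1 * v * e + alpha1 * v
        re_exposure    = beta2 * a * e + alpha2 * a
        forgetting     = gamma * e
        v -= first_exposure * dt
        e += (first_exposure + re_exposure - forgetting) * dt
        a += (forgetting - re_exposure) * dt
    return v, e, a

print(simulate())  # final densities of naive, evangelical, agnostic individuals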
NASA Astrophysics Data System (ADS)
Willmes, C.
2017-12-01
In the frame of the Collaborative Research Centre 806 (CRC 806), an interdisciplinary research project that needs to manage data, information and knowledge from heterogeneous domains such as archeology, cultural sciences, and the geosciences, a collaborative internal knowledge base system was developed. The system is based on the open-source MediaWiki software, well known as the software behind Wikipedia, which provides a web-based collaborative knowledge and information management platform. This software is enhanced with the Semantic MediaWiki (SMW) extension, which allows structured data to be stored and managed within the Wiki platform and provides complex query and API interfaces to the structured data stored in the SMW database. Using an additional open-source tool called mobo, it is possible to improve the data model development process as well as to automate data imports, from small spreadsheets to large relational databases. Mobo is a command-line tool that helps build and deploy SMW structure in an agile, schema-driven development way, and allows the data model formalizations, written in JSON-Schema format, to be managed and developed collaboratively using version control systems like git. The combination of a well-equipped collaborative web platform provided by MediaWiki, the ability to store and query structured data in this collaborative database provided by SMW, and the automated data import and data model development enabled by mobo results in a powerful yet flexible system for building and developing a collaborative knowledge base. Furthermore, SMW allows the application of Semantic Web technology: the structured data can be exported into RDF, so a triple store with a SPARQL endpoint can be set up on top of the database. The JSON-Schema-based data models can be extended to JSON-LD in order to profit from the possibilities of Linked Data technology.
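As a rough sketch of the schema-driven approach, the snippet below defines a hypothetical mobo-style JSON-Schema model as a Python dictionary; the model name and properties are invented for illustration and are not the CRC 806 data model.

# A hypothetical JSON-Schema model of the kind mobo manages under version control.
import json

site_model = {
    "$schema": "http://json-schema.org/draft-04/schema#",
    "title": "Site",                       # hypothetical SMW model/form name
    "type": "object",
    "properties": {
        "siteName":       {"type": "string"},
        "excavationYear": {"type": "integer"},
        "coordinates":    {"type": "string"},
    },
    "required": ["siteName"],
}

print(json.dumps(site_model, indent=2))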
Error-associated behaviors and error rates for robotic geology
NASA Technical Reports Server (NTRS)
Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin
2004-01-01
This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill-based decisions require the least cognitive effort, and knowledge-based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.
ERIC Educational Resources Information Center
Jing, Tang; Dancheng, Luo; Ye, Zhao
2016-01-01
Purpose: Entrepreneurship is a process of constantly gaining knowledge from failure and generating positive energy. The entrepreneur's psychological resilience is the key to gaining knowledge (positive energy) from failure (negative energy). Undergraduate entrepreneurship education is one of today's priorities. Educators shall…