Science.gov

Sample records for a priori knowledge

  1. The consequences of learnability for the a priori knowledge in a world

    NASA Astrophysics Data System (ADS)

    Sommer, Hanns

    1998-07-01

The precondition for the evolution of intelligent beings in a world (modelled by anticipatory systems) is the learnability of some regularities in that world. The consequences that can be deduced from the learnability property of a world form the a priori knowledge. This a priori knowledge is independent of the empirical facts of any particular world. It is shown that the a priori knowledge consists not only of logical tautologies: some well-known, nontrivial theorems of physics and psychology also form part of it. The a priori knowledge obtained in different worlds is necessarily organised by the same structures. Examples from physics and psychology illustrate the use of separating a priori knowledge from empirical knowledge.

  2. Algorithms for magnetic tomography—on the role of a priori knowledge and constraints

    NASA Astrophysics Data System (ADS)

    Hauer, Karl-Heinz; Potthast, Roland; Wannert, Martin

    2008-08-01

Magnetic tomography investigates the reconstruction of currents from their magnetic fields. Here, we will study a number of projection methods in combination with Tikhonov regularization for stabilization for the solution of the Biot-Savart integral equation Wj = H, with the Biot-Savart integral operator W: (L²(Ω))³ → (L²(∂G))³, where Ω̄ ⊂ G. In particular, we study the role of a priori knowledge when incorporated into the choice of the projection spaces X_n ⊂ (L²(Ω))³, n ∈ ℕ, for example the condition div j = 0 or the use of the full boundary value problem div(σ grad φ_E) = 0 in Ω, ν · (σ grad φ_E) = g on ∂Ω, with some known function g, where j = σ grad φ_E and σ is an anisotropic matrix-valued conductivity. We will discuss and compare these schemes, investigating the ill-posedness of each algorithm in terms of the behaviour of the singular values of the corresponding operators, both when a priori knowledge is incorporated and when the geometrical setting is modified. Finally, we will numerically evaluate the stability constants in the practical setup of magnetic tomography for fuel cells and, thus, calculate usable error bounds for this important application area.
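The role of Tikhonov regularization in stabilizing such an ill-posed equation can be sketched on a generic discretization; the operator, data, and regularization parameter below are synthetic stand-ins, not the paper's Biot-Savart discretization:

```python
import numpy as np

def tikhonov_solve(W, h, alpha):
    """Solve min_j ||W j - h||^2 + alpha ||j||^2 via the normal equations."""
    n = W.shape[1]
    return np.linalg.solve(W.T @ W + alpha * np.eye(n), W.T @ h)

# Toy ill-conditioned forward operator with rapidly decaying singular values.
rng = np.random.default_rng(0)
U, _, Vt = np.linalg.svd(rng.standard_normal((40, 40)))
W = U @ np.diag(np.logspace(0, -8, 40)) @ Vt
j_true = rng.standard_normal(40)
h = W @ j_true + 1e-6 * rng.standard_normal(40)   # slightly noisy data

j_naive = np.linalg.solve(W, h)        # unstabilized: noise is amplified
j_reg = tikhonov_solve(W, h, alpha=1e-8)
print(np.linalg.norm(j_naive - j_true) > np.linalg.norm(j_reg - j_true))  # True
```

The regularization damps the directions associated with small singular values, which is exactly where the unstabilized solve blows up the measurement noise.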

  3. Do we use a priori knowledge of gravity when making elbow rotations?

    PubMed

    Pinter, Ilona J; van Soest, Arthur J; Bobbert, Maarten F; Smeets, Jeroen B J

    2012-03-01

    In this study, we aim to investigate whether motor commands, emanating from movement planning, are customized to movement orientation relative to gravity from the first trial on. Participants made fast point-to-point elbow flexions and extensions in the transverse plane. We compared movements that had been practiced in reclined orientation either against or with gravity with the same movement relative to the body axis made in the upright orientation (neutral compared to gravity). For each movement type, five rotations from reclined to upright orientation were made. For each rotation, we analyzed the first trial in upright orientation and the directly preceding trial in reclined orientation. Additionally, we analyzed the last five trials of a 30-trial block in upright position and compared these trials with the first trials in upright orientation. Although participants moved fast, gravitational torques were substantial. The change in body orientation affected movement planning: we found a decrease in peak angular velocity and a decrease in amplitude for the first trials made in the upright orientation, regardless of whether the previous movements in reclined orientation were made against or with gravity. We found that these decreases disappeared after participants familiarized themselves with moving in upright position in a 30-trial block. These results indicate that participants used a general strategy, corresponding to the strategy observed in situations with unreliable or limited information on external conditions. From this, we conclude that during movement planning, a priori knowledge of gravity was not used to specifically customize motor commands for the neutral gravity condition.

  4. Novel post-Doppler STAP with a priori knowledge information for traffic monitoring applications: basic idea and first results

    NASA Astrophysics Data System (ADS)

    da Silva, André B. C.; Baumgartner, Stefan V.

    2017-09-01

    This paper presents a novel a priori knowledge-based algorithm for traffic monitoring applications. The powerful post-Doppler space-time adaptive processing (PD STAP) is combined with a known road network obtained from the freely available OpenStreetMap (OSM) database. The road information is applied after the PD STAP for recognizing and rejecting false detections, and moreover, for repositioning the vehicles detected in the vicinity of the roads. The algorithm presents great potential for real-time processing, decreased hardware complexity and low costs compared to state-of-the-art systems. The processor was tested using real multi-channel data acquired by DLR's airborne system F-SAR. The experimental results are shown and discussed, and the novelties are highlighted (e.g., the benefits of using a priori knowledge information).

  5. A Computationally Efficient, Exploratory Approach to Brain Connectivity Incorporating False Discovery Rate Control, A Priori Knowledge, and Group Inference

    PubMed Central

    Liu, Aiping; Li, Junning; Wang, Z. Jane; McKeown, Martin J.

    2012-01-01

Graphical models appear well suited for inferring brain connectivity from fMRI data, as they can distinguish between direct and indirect brain connectivity. Nevertheless, biological interpretation requires not only that the multivariate time series are adequately modeled, but also that there is accurate error control of the inferred edges. The PCfdr algorithm, developed by Li and Wang, provides a computationally efficient means to control the false discovery rate (FDR) of computed edges asymptotically. The original PCfdr algorithm was unable to accommodate a priori information about connectivity and was designed to infer connectivity from a single subject rather than a group of subjects. Here we extend the original PCfdr algorithm and propose a multisubject, error-rate-controlled brain connectivity modeling approach that allows incorporation of prior knowledge of connectivity. In simulations, we show that the two proposed extensions can still control the FDR around or below a specified threshold. When the proposed approach is applied to fMRI data in a Parkinson's disease study, we find robust group evidence of the disease-related changes, the compensatory changes, and the normalizing effect of L-dopa medication. The proposed method provides a robust, accurate, and practical method for the assessment of brain connectivity patterns from functional neuroimaging data. PMID:23251232
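As a concrete flavor of edge-wise FDR control, the classic Benjamini-Hochberg step-up procedure can be sketched as follows; this is a generic illustration of FDR control on a set of hypothetical edge p-values, not the PCfdr algorithm itself, which enforces the error rate inside the PC search:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up: reject the largest prefix of sorted
    p-values satisfying p_(i) <= q * i / m."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

# Hypothetical p-values for ten candidate edges:
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals).sum())  # 2 edges survive at q = 0.05
```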

  6. Model-based waveform design for optimal detection: A multi-objective approach to dealing with incomplete a priori knowledge.

    PubMed

    Hamschin, Brandon M; Loughlin, Patrick J

    2015-11-01

This work considers the design of optimal, energy-constrained transmit signals for active sensing for the case when the designer has incomplete or uncertain knowledge of the target and/or environment. The mathematical formulation is that of a multi-objective optimization problem, wherein one can incorporate a plurality of potential target, interference, or clutter models and in doing so take advantage of the wide range of results in the literature related to modeling each. It is shown, via simulation, that when the objective function of the optimization problem is chosen to maximize the minimum (i.e., maxmin) probability of detection among all possible model combinations, the optimal waveforms obtained are advantageous. The advantage results because the maxmin waveforms judiciously allocate energy to spectral regions where each of the target models responds strongly and each of the environmental models causes minimal detection-performance degradation. In particular, improved detection performance is shown compared to linear frequency modulated transmit signals and compared to signals designed with the wrong target spectrum assumed. Additionally, it is shown that the maxmin design yields performance comparable to an optimal design matched to the correct target/environmental model. Finally, it is proven that the maxmin problem formulation is convex.
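The maxmin idea can be illustrated with a toy discretization: if each candidate target/environment model m is assigned an (invented, purely illustrative) detection gain G[m, k] per unit transmit energy in spectral bin k, the energy allocation maximizing the worst-case gain can be found by a coarse search over the energy simplex. This sketches the maxmin principle only, not the paper's probability-of-detection objective or its convex solver:

```python
import numpy as np

# G[m, k]: assumed detection gain of model combination m per unit energy in bin k.
G = np.array([[1.0, 0.2, 0.1],
              [0.1, 1.0, 0.3],
              [0.2, 0.1, 1.0]])
E_total = 1.0

best_t, best_e = -np.inf, None
grid = np.linspace(0.0, E_total, 101)
for e1 in grid:                       # coarse search over the energy simplex
    for e2 in grid:
        e3 = E_total - e1 - e2
        if e3 < 0.0:
            continue
        e = np.array([e1, e2, e3])
        t = (G @ e).min()             # worst-case gain over all models
        if t > best_t:
            best_t, best_e = t, e

one_bin = (G @ np.array([E_total, 0.0, 0.0])).min()
print(best_e.round(2), bool(best_t > one_bin))  # spreading energy beats one bin
```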

  7. A retrospective cohort study on the risk of stroke in relation to a priori health knowledge level among people with type 2 diabetes mellitus in Taiwan.

    PubMed

    Lai, Yun-Ju; Hu, Hsiao-Yun; Lee, Ya-Ling; Ku, Po-Wen; Yen, Yung-Feng; Chu, Dachen

    2017-05-22

Intervention of diabetes care education with regular laboratory check-ups in outpatient visits showed long-term benefits in reducing the risk of macrovascular complications among people with type 2 diabetes. However, research on the contribution of a priori health knowledge to the prevention of diabetic complications in community settings has been scarce. We therefore aimed to investigate the association between health knowledge and stroke incidence in patients with type 2 diabetes in Taiwan. A nationally representative sample of the general Taiwanese population was selected using a multistage systematic sampling process from the Taiwan National Health Interview Survey (NHIS) in 2005. Subjects were interviewed with a standardized face-to-face questionnaire in the survey, obtaining information on demographics, socioeconomic status, family medical history, obesity, health behaviors, and a 15-item health knowledge assessment. The NHIS dataset was linked to Taiwan National Health Insurance claims data to retrieve the diagnosis of type 2 diabetes in NHIS participants at baseline and to identify the follow-up incidence of stroke from 2005 to 2013. Univariate and multivariate Cox regressions were used to estimate the effect of baseline health knowledge level on the risk of stroke incidence in this group of people with type 2 diabetes. A total of 597 diabetic patients, with a mean age of 51.28 years and nearly half of them male, were analyzed. During the 9-year follow-up period, 65 new stroke cases were identified among them. Kaplan-Meier curves comparing the three groups of low/moderate/high knowledge levels revealed a statistical significance (p-value of log-rank test <0.01). After controlling for potential confounders, and comparing to the group with a low health knowledge level, the relative risk of stroke was lower for those with a moderate level of health knowledge (adjusted hazard ratio [AHR] = 0.63; 95% CI, 0.33-1.19; p-value = 0.15) and significantly lower for those with a high level of health knowledge (AHR = 0.43; 95% CI, 0.22-0.86; p

  8. An Approach for the Long-Term 30-m Land Surface Snow-Free Albedo Retrieval from Historic Landsat Surface Reflectance and MODIS-based A Priori Anisotropy Knowledge

    NASA Technical Reports Server (NTRS)

    Shuai, Yanmin; Masek, Jeffrey G.; Gao, Feng; Schaaf, Crystal B.; He, Tao

    2014-01-01

Land surface albedo has been recognized by the Global Terrestrial Observing System (GTOS) as an essential climate variable crucial for accurate modeling and monitoring of the Earth's radiative budget. While global climate studies can leverage albedo datasets from MODIS, VIIRS, and other coarse-resolution sensors, many applications in heterogeneous environments can benefit from higher-resolution albedo products derived from Landsat. We previously developed a "MODIS-concurrent" approach for the 30-meter albedo estimation which relied on combining post-2000 Landsat data with MODIS Bidirectional Reflectance Distribution Function (BRDF) information. Here we present a "pre-MODIS era" approach to extend 30-m surface albedo generation in time back to the 1980s, through an a priori anisotropy Look-Up Table (LUT) built up from the high quality MCD43A BRDF estimates over representative homogenous regions. Each entry in the LUT reflects a unique combination of land cover, seasonality, terrain information, disturbance age and type, and Landsat optical spectral bands. An initial conceptual LUT was created for the Pacific Northwest (PNW) of the United States and provides BRDF shapes estimated from MODIS observations for undisturbed and disturbed surface types (including recovery trajectories of burned areas and non-fire disturbances). By accepting the assumption of a generally invariant BRDF shape for similar land surface structures as a priori information, spectral white-sky and black-sky albedos are derived through albedo-to-nadir reflectance ratios as a bridge between the Landsat and MODIS scale. A further narrow-to-broadband conversion based on radiative transfer simulations is adopted to produce broadband albedos at visible, near infrared, and shortwave regimes. We evaluate the accuracy of resultant Landsat albedo using available field measurements at forested AmeriFlux stations in the PNW region, and examine the consistency of the surface albedo generated by this approach

  9. A Priori Analysis of Natural Language Queries.

    ERIC Educational Resources Information Center

    Spiegler, Israel; Elata, Smadar

    1988-01-01

    Presents a model for the a priori analysis of natural language queries which uses an algorithm to transform the query into a logical pattern that is used to determine the answerability of the query. The results of testing by a prototype system implemented in PROLOG are discussed. (20 references) (CLB)

  10. Predictive a priori pressure-dependent kinetics.

    PubMed

    Jasper, Ahren W; Pelzer, Kenley M; Miller, James A; Kamarchik, Eugene; Harding, Lawrence B; Klippenstein, Stephen J

    2014-12-05

    The ability to predict the pressure dependence of chemical reaction rates would be a great boon to kinetic modeling of processes such as combustion and atmospheric chemistry. This pressure dependence is intimately related to the rate of collision-induced transitions in energy E and angular momentum J. We present a scheme for predicting this pressure dependence based on coupling trajectory-based determinations of moments of the E,J-resolved collisional transfer rates with the two-dimensional master equation. This completely a priori procedure provides a means for proceeding beyond the empiricism of prior work. The requisite microcanonical dissociation rates are obtained from ab initio transition state theory. Predictions for the CH4 = CH3 + H and C2H3 = C2H2 + H reaction systems are in excellent agreement with experiment.
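The pressure dependence itself can be illustrated with the classical Lindemann falloff form, a far cruder model than the two-dimensional E,J-resolved master equation used in the paper, but one that exhibits the same low- and high-pressure limiting behaviour; all numbers below are placeholders:

```python
def lindemann(k0_M, k_inf):
    """Lindemann falloff: k_uni -> k0*[M] at low pressure, -> k_inf at high.
    k0_M is the low-pressure-limit rate times the bath-gas concentration [M]."""
    return k_inf * (k0_M / k_inf) / (1.0 + k0_M / k_inf)

k_inf = 1.0e4                       # illustrative high-pressure-limit rate
for M in (1e-3, 1.0, 1e3):          # increasing bath-gas concentration [M]
    print(M, lindemann(1.0e2 * M, k_inf))
```

The printed rates rise linearly with [M] at low pressure and saturate toward k_inf at high pressure, which is the qualitative falloff shape the master-equation treatment computes rigorously.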

  11. Measurement, coordination, and the relativized a priori

    NASA Astrophysics Data System (ADS)

    Padovani, Flavia

    2015-11-01

    The problem of measurement is a central issue in the epistemology and methodology of the physical sciences. In recent literature on scientific representation, large emphasis has been put on the "constitutive role" played by measurement procedures as forms of representation. Despite its importance, this issue hardly finds any mention in writings on constitutive principles, viz. in Michael Friedman's account of relativized a priori principles. This issue, instead, was at the heart of Reichenbach's analysis of coordinating principles that has inspired Friedman's interpretation. This paper suggests that these procedures should have a part in an account of constitutive principles of science, and that they could be interpreted following the intuition originally present (but ultimately not fully developed) in Reichenbach's early work.

  12. Onboard star identification without a priori attitude information

    NASA Astrophysics Data System (ADS)

    Ketchum, Eleanor A.; Tolson, Robert H.

    1995-03-01

Many algorithms used today determine spacecraft attitude by identifying stars in the field of view of a star tracker. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Some algorithms have been extended to implement a computation-intensive full-sky scan. Others require large databases. Both storage and speed are concerns for autonomous onboard systems. This paper presents an algorithm that, by discretizing the sky and filtering by the visual magnitude of the brightest observed star, provides a star identification process that is computationally efficient compared to existing techniques. A savings in onboard storage of over 80% compared with a popular existing technique is documented. Results of random tests with simulated star fields are presented, with no false identifications and a dramatic increase in speed over full-sky scan methods.
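The magnitude-filtering idea can be sketched as follows; the mini-catalog, tolerances, and pair-separation confirmation step are invented for illustration and are much simpler than the paper's discretized-sky algorithm:

```python
import numpy as np

# Hypothetical mini-catalog: unit direction vectors and visual magnitudes.
rng = np.random.default_rng(1)
cat_dirs = rng.standard_normal((500, 3))
cat_dirs /= np.linalg.norm(cat_dirs, axis=1, keepdims=True)
cat_mags = rng.uniform(1.0, 6.0, 500)

def identify(obs_dirs, obs_mags, mag_tol=0.1, ang_tol=1e-7):
    """Identify the brightest observed star: filter catalog candidates by
    visual magnitude, then confirm each candidate against the angular
    separation to the second-brightest observed star. Returns -1 on failure."""
    order = np.argsort(obs_mags)               # brightest = smallest magnitude
    b, s = obs_dirs[order[0]], obs_dirs[order[1]]
    sep = np.arccos(np.clip(b @ s, -1.0, 1.0))
    candidates = np.nonzero(np.abs(cat_mags - obs_mags[order[0]]) < mag_tol)[0]
    for i in candidates:                       # short list thanks to the filter
        seps = np.arccos(np.clip(cat_dirs @ cat_dirs[i], -1.0, 1.0))
        if np.any(np.abs(seps - sep) < ang_tol):
            return i                           # catalog index of brightest star
    return -1

# Simulate a noiseless sighting of catalog stars 42 and 7:
obs = identify(cat_dirs[[42, 7]], cat_mags[[42, 7]])
print(obs)  # catalog index of the brighter of the two simulated stars
```

The magnitude filter plays the role of the search-space reduction in the abstract: only a small fraction of the catalog survives the filter before the more expensive geometric confirmation runs.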

  13. Conventional Principles in Science: On the foundations and development of the relativized a priori

    NASA Astrophysics Data System (ADS)

    Ivanova, Milena; Farr, Matt

    2015-11-01

    The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the

  14. Integrating a priori information in edge-linking algorithms

    NASA Astrophysics Data System (ADS)

    Farag, Aly A.; Cao, Yu; Yeap, Yuen-Pin

    1992-09-01

This research presents an approach to integrate a priori information into the path metric of the LINK algorithm. The zero-crossing contours of ∇²G are taken as a gross estimate of the boundaries in the image. This estimate of the boundaries is used to define the swath of important information, and to provide a distance measure for edge localization. During the linking process, a priori information plays important roles in (1) dramatically reducing the search space, because the actual path lies within ±2σ_f of the prototype contours (σ_f is the standard deviation of the Gaussian kernel used in the edge enhancement step); (2) breaking ties when the search metrics give uncertain information; and (3) selecting the set of goal nodes for the search algorithm. We show that the integration of a priori information in the LINK algorithm provides faster and more accurate edge linking.
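A minimal version of the Laplacian-of-Gaussian zero-crossing boundary estimate can be computed directly; this pure-NumPy sketch (separable Gaussian smoothing plus a 5-point Laplacian) reflects common implementation choices, not the paper's specific code:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def log_zero_crossings(img, sigma=2.0):
    """Zero crossings of the Laplacian of the Gaussian-smoothed image,
    used as a gross boundary estimate."""
    k = gaussian_kernel1d(sigma, int(3 * sigma))
    sm = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, img)
    sm = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, sm)
    lap = (np.roll(sm, 1, 0) + np.roll(sm, -1, 0) +
           np.roll(sm, 1, 1) + np.roll(sm, -1, 1) - 4.0 * sm)  # 5-point Laplacian
    sign = lap > 0
    zc = np.zeros_like(sign)
    zc[:-1, :] |= sign[:-1, :] != sign[1:, :]   # vertical sign change
    zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]   # horizontal sign change
    return zc

# A bright square on a dark background: zero crossings ring its boundary.
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0
zc = log_zero_crossings(img)
print(bool(zc.any()), bool(zc[32, 32]))  # True False: crossings exist, none in the flat interior
```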

  15. A Priori Estimation of Rate Constants for Unimolecular Decomposition Reactions

    DTIC Science & Technology

    1979-02-01

A priori theoretical predictions for the decomposition rates of the formyl and methoxy radicals have been made by application of Rice-Ramsperger-Kassel-Marcus (RRKM) theory. An Arrhenius rate coefficient expression is derived for the formyl radical decomposition, and a modified Arrhenius type rate ... (Report sections include: IV. A Predicted Rate Constant for Formyl Radical Decomposition; V. Summary.)
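The modified Arrhenius form referred to in this record can be evaluated directly; the coefficients below are placeholders chosen for illustration, not the report's fitted values:

```python
import math

def k_modified_arrhenius(T, A, n, Ea):
    """Modified Arrhenius rate: k(T) = A * T**n * exp(-Ea / (R*T)),
    with Ea in J/mol and T in kelvin."""
    R = 8.314462618  # gas constant, J/(mol K)
    return A * T**n * math.exp(-Ea / (R * T))

# Placeholder coefficients (illustrative only):
for T in (800.0, 1200.0, 1600.0):
    print(T, k_modified_arrhenius(T, A=1e13, n=0.5, Ea=70e3))
```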

  16. The Influence of "a priori" Ideas on the Experimental Approach.

    ERIC Educational Resources Information Center

    Cauzinille-Marmeche, Evelyne; And Others

    1985-01-01

    Investigated the role of "a priori" ideas in planning experiments and data processing leading to inferences. Thirty-one students (ages 11-13) observed a "combustion/candle in a closed container" experiment and were asked to interpret sets of measurements. Findings, among others, show that children preferentially experiment on…

  17. Ex Priori: Exposure-based Prioritization across Chemical Space

    EPA Science Inventory

EPA's Exposure Prioritization (Ex Priori) is a simplified, quantitative visual dashboard that makes use of data from various inputs to provide a rank-ordered internalized dose metric. This complements other high-throughput screening by viewing exposures within all chemical space si...

  18. "A Priori" Assessment of Language Learning Tasks by Practitioners

    ERIC Educational Resources Information Center

    Westhoff, Gerard J.

    2009-01-01

Teachers' competence to estimate the effectiveness of learning materials is important and often neglected in programmes for teacher education. In this lecture I will try to explore the possibilities of designing scaffolding instruments for an "a priori" assessment of language learning tasks, based on insights from SLA and cognitive psychology, more…

  2. Structural a priori information in near-infrared optical tomography

    NASA Astrophysics Data System (ADS)

    Dehghani, Hamid; Carpenter, Colin M.; Yalavarthy, Phaneendra K.; Pogue, Brian W.; Culver, Joseph P.

    2007-02-01

Recent interest in the use of dual-modality imaging in the field of optical Near Infrared (NIR) Tomography has increased, specifically with the use of structural information from, for example, MRI. Although MRI images provide high-resolution structural information about tissue, they lack the contrast and functional information needed to investigate physiology, whereas NIR data has been established as a high-contrast imaging modality, but one which suffers from low resolution. To this effect, the use of dual-modality data has been shown to increase the qualitative and quantitative accuracy of clinical information that can be obtained from tissue. Results so far have indicated that, provided accurate a priori structural information is available, such dual-modality imaging techniques can be used for the detection and characterization of breast cancer in vivo, as well as the investigation of brain function and physiology in both human and small-animal studies. Although there has been much interest and research into the most suitable and robust use of a priori structural information within the reconstruction of optical properties of tissue, little work has been done to investigate how much accuracy is needed from the structural MRI images in order to obtain the most clinically reliable information. In this paper, we present and demonstrate the two most common applications of a priori information in image reconstruction, namely soft and hard priors. The effect of inaccuracies in the a priori structural information on the reconstructed NIR images is presented, showing that provided the error of the a priori information is within 20% in terms of size and location, adequate NIR images can be reconstructed.
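The soft/hard prior distinction can be sketched on a generic linear inverse problem; this toy least-squares setup is illustrative only and is not the NIR diffusion model. A hard prior collapses the unknowns to one value per segmented region, while a soft prior merely penalizes variation within each region:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 30, 20                           # pixels, measurements (underdetermined)
labels = np.repeat([0, 1, 2], 10)       # stand-in "MRI" segmentation, 3 regions
x_true = np.array([1.0, 3.0, 2.0])[labels]
A = rng.standard_normal((m, n))
y = A @ x_true + 0.01 * rng.standard_normal(m)

# Hard prior: one unknown per segmented region (x = B z).
B = (labels[:, None] == np.arange(3)[None, :]).astype(float)
z = np.linalg.lstsq(A @ B, y, rcond=None)[0]
x_hard = B @ z

# Soft prior: penalize differences between neighboring pixels sharing a label.
L = np.zeros((0, n))
for r in range(3):
    idx = np.nonzero(labels == r)[0]
    for i, j in zip(idx[:-1], idx[1:]):
        row = np.zeros(n)
        row[i], row[j] = 1.0, -1.0
        L = np.vstack([L, row])
x_soft = np.linalg.solve(A.T @ A + 10.0 * (L.T @ L), A.T @ y)

print(bool(np.abs(x_hard - x_true).max() < 0.1))  # True: 3 unknowns well determined
```

The hard prior makes the underdetermined problem well-posed but cannot express intra-region variation; the soft prior keeps the full pixel basis and merely biases the solution toward region-wise smoothness, which is why it tolerates segmentation errors better.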

  3. First-arrival traveltime sound speed inversion with a priori information

    PubMed Central

    Hooi, Fong Ming; Carson, Paul L.

    2014-01-01

Purpose: A first-arrival travel-time sound speed algorithm presented by Tarantola [Inverse Problem Theory and Methods for Model Parameter Estimation (SIAM, Philadelphia, PA, 2005)] is adapted to the medical ultrasonics setting. Through specification of a covariance matrix for the object model, the algorithm allows for natural inclusion of physical a priori information of the object. The algorithm's ability to accurately and robustly reconstruct a complex sound speed distribution is demonstrated on simulation and experimental data using a limited aperture. Methods: The algorithm is first demonstrated generally in simulation with a numerical breast phantom imaged in different geometries. As this work is motivated by the authors' limited aperture dual sided ultrasound breast imaging system, experimental data are acquired with a Verasonics system with dual, 128 element, linear L7-4 arrays. The transducers are automatically calibrated for usage in the eikonal forward model. A priori information such as knowledge of correlated regions within the object is obtained via segmentation of B-mode images generated from synthetic aperture imaging. Results: As one illustration of the algorithm's facility for inclusion of a priori information, physically grounded regularization is demonstrated in simulation. The algorithm's practicality is then demonstrated through experimental realization in limited aperture cases. Reconstructions of sound speed distributions of various complexity are improved through inclusion of a priori information. The sound speed maps are generally reconstructed with accuracy within a few m/s. Conclusions: This paper demonstrates the ability to form sound speed images using two opposed commercial linear arrays to mimic ultrasound image acquisition in the compressed mammographic geometry. The ability to create reasonably good speed of sound images in the compressed mammographic geometry allows images to be readily coregistered to tomosynthesis image volumes for

  4. First-arrival traveltime sound speed inversion with a priori information.

    PubMed

    Hooi, Fong Ming; Carson, Paul L

    2014-08-01

A first-arrival travel-time sound speed algorithm presented by Tarantola [Inverse Problem Theory and Methods for Model Parameter Estimation (SIAM, Philadelphia, PA, 2005)] is adapted to the medical ultrasonics setting. Through specification of a covariance matrix for the object model, the algorithm allows for natural inclusion of physical a priori information of the object. The algorithm's ability to accurately and robustly reconstruct a complex sound speed distribution is demonstrated on simulation and experimental data using a limited aperture. The algorithm is first demonstrated generally in simulation with a numerical breast phantom imaged in different geometries. As this work is motivated by the authors' limited aperture dual sided ultrasound breast imaging system, experimental data are acquired with a Verasonics system with dual, 128 element, linear L7-4 arrays. The transducers are automatically calibrated for usage in the eikonal forward model. A priori information such as knowledge of correlated regions within the object is obtained via segmentation of B-mode images generated from synthetic aperture imaging. As one illustration of the algorithm's facility for inclusion of a priori information, physically grounded regularization is demonstrated in simulation. The algorithm's practicality is then demonstrated through experimental realization in limited aperture cases. Reconstructions of sound speed distributions of various complexity are improved through inclusion of a priori information. The sound speed maps are generally reconstructed with accuracy within a few m/s. This paper demonstrates the ability to form sound speed images using two opposed commercial linear arrays to mimic ultrasound image acquisition in the compressed mammographic geometry. The ability to create reasonably good speed of sound images in the compressed mammographic geometry allows images to be readily coregistered to tomosynthesis image volumes for breast cancer detection and

  5. Note: Reconstruction of fluid flows in porous media using geometric a priori information

    NASA Astrophysics Data System (ADS)

    Maisl, Michael; Scholl, Hagen; Schorr, Christian; Seemann, Ralf

    2016-12-01

X-ray tomography typically suffers from insufficient temporal resolution when imaging dynamic processes. Using the example of multiphase flow in solid porous media, we adapt an iterative algorithm to compute 3d tomograms from 2d projections, which allows for a significant reduction of scan time while maintaining a high level of reconstruction quality. To this end, a priori knowledge about the porous medium is incorporated into the reconstruction algorithm. The algorithm is applicable to monitoring dynamic changes in any static matrix and reduces imaging time at least fivefold with respect to standard reconstruction algorithms.

  6. Note: Reconstruction of fluid flows in porous media using geometric a priori information.

    PubMed

    Maisl, Michael; Scholl, Hagen; Schorr, Christian; Seemann, Ralf

    2016-12-01

X-ray tomography typically suffers from insufficient temporal resolution when imaging dynamic processes. Using the example of multiphase flow in solid porous media, we adapt an iterative algorithm to compute 3d tomograms from 2d projections, which allows for a significant reduction of scan time while maintaining a high level of reconstruction quality. To this end, a priori knowledge about the porous medium is incorporated into the reconstruction algorithm. The algorithm is applicable to monitoring dynamic changes in any static matrix and reduces imaging time at least fivefold with respect to standard reconstruction algorithms.

  7. A priori SNR estimation and noise estimation for speech enhancement

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zeng, ZeQing; Zhu, Ping

    2016-12-01

A priori signal-to-noise ratio (SNR) estimation and noise estimation are important for speech enhancement. In this paper, a novel modified decision-directed (DD) a priori SNR estimation approach based on single-frequency entropy, named DDBSE, is proposed. DDBSE replaces the fixed weighting factor in the DD approach with an adaptive one calculated according to change of single-frequency entropy. Simultaneously, a new noise power estimation approach based on unbiased minimum mean square error (MMSE) and voice activity detection (VAD), named UMVAD, is proposed. UMVAD adopts different strategies to estimate noise in order to reduce over-estimation and under-estimation of noise. UMVAD improves the classical statistical model-based VAD by utilizing an adaptive threshold to replace the original fixed one and modifies the unbiased MMSE-based noise estimation approach using an adaptive a priori speech presence probability calculated by entropy instead of the original fixed one. Experimental results show that DDBSE can provide greater noise suppression than DD and UMVAD can improve the accuracy of noise estimation. Compared to existing approaches, speech enhancement based on UMVAD and DDBSE can obtain a better segment SNR score and composite measure covl score, especially in adverse environments such as non-stationary noise and low-SNR.
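For context, the classic decision-directed estimator that DDBSE modifies can be sketched as follows; the fixed weighting factor alpha is exactly what DDBSE replaces with an entropy-adaptive one, and the Wiener-gain update used here is one common choice assumed for illustration:

```python
import numpy as np

def decision_directed_snr(Y_pow, noise_pow, alpha=0.98):
    """Classic decision-directed a priori SNR estimate with a fixed weighting
    factor alpha. Y_pow: |Y(l,k)|^2 per frame l and frequency bin k;
    noise_pow: noise power spectrum per bin."""
    n_frames, n_bins = Y_pow.shape
    xi = np.zeros_like(Y_pow)
    S_prev = np.zeros(n_bins)                      # |S_hat|^2 of previous frame
    for l in range(n_frames):
        gamma = Y_pow[l] / noise_pow               # a posteriori SNR
        xi[l] = alpha * S_prev / noise_pow + (1.0 - alpha) * np.maximum(gamma - 1.0, 0.0)
        G = xi[l] / (1.0 + xi[l])                  # Wiener gain from xi
        S_prev = (G ** 2) * Y_pow[l]               # update clean-speech estimate
    return xi

# Noise-only input: the a priori SNR estimate should stay low.
rng = np.random.default_rng(3)
noise_pow = np.ones(8)
Y_pow = noise_pow * rng.chisquare(2, size=(50, 8)) / 2.0
xi = decision_directed_snr(Y_pow, noise_pow)
print(bool(xi.mean() < 0.5))  # True: estimate stays low on noise-only frames
```

The large fixed alpha makes the estimate smooth but slow to track speech onsets, which is the trade-off DDBSE addresses by adapting alpha per frame.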

  8. A priori SNR estimation and noise estimation for speech enhancement.

    PubMed

    Yao, Rui; Zeng, ZeQing; Zhu, Ping

    2016-01-01

    A priori signal-to-noise ratio (SNR) estimation and noise estimation are important for speech enhancement. In this paper, a novel modified decision-directed (DD) a priori SNR estimation approach based on single-frequency entropy, named DDBSE, is proposed. DDBSE replaces the fixed weighting factor in the DD approach with an adaptive one calculated according to change of single-frequency entropy. Simultaneously, a new noise power estimation approach based on unbiased minimum mean square error (MMSE) and voice activity detection (VAD), named UMVAD, is proposed. UMVAD adopts different strategies to estimate noise in order to reduce over-estimation and under-estimation of noise. UMVAD improves the classical statistical model-based VAD by utilizing an adaptive threshold to replace the original fixed one and modifies the unbiased MMSE-based noise estimation approach using an adaptive a priori speech presence probability calculated by entropy instead of the original fixed one. Experimental results show that DDBSE can provide greater noise suppression than DD and UMVAD can improve the accuracy of noise estimation. Compared to existing approaches, speech enhancement based on UMVAD and DDBSE can obtain a better segment SNR score and composite measure covl score, especially in adverse environments such as non-stationary noise and low-SNR.

  9. Knowledge.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on knowledge includes annotated listings of Web sites, CD-ROMs and computer software, videos, books, and additional resources that deal with knowledge and differences between how animals and humans learn. Sidebars discuss animal intelligence, learning proper behavior, and getting news from the Internet. (LRW)

  11. Developing detailed a priori 3D models of large environments to aid in robotic navigation tasks

    NASA Astrophysics Data System (ADS)

    Grinstead, Brad; Koschan, Andreas F.; Abidi, Mongi A.

    2004-09-01

    In order to effectively navigate any environment, a robotic vehicle needs to understand the terrain and obstacles native to that environment. Knowledge of its own location and orientation, and knowledge of the region of operation, can greatly improve the robot's performance. To this end, we have developed a mobile system for the fast digitization of large-scale environments to develop the a priori information needed for prediction and optimization of the robot's performance. The system collects ground-level video and laser range information, fusing them together to develop accurate 3D models of the target environment. In addition, the system carries a differential Global Positioning System (GPS) as well as an Inertial Navigation System (INS) for determining the position and orientation of the various scanners as they acquire data. Issues involved in the fusion of these various data modalities include: integration of the position and orientation (pose) sensors' data at varying sampling rates and availability; selection of "best" geometry in overlapping data cases; efficient representation of large 3D datasets for real-time processing techniques. Once the models have been created, this data can be used to provide a priori information about negative obstacles, obstructed fields of view, navigation constraints, and focused feature detection.

  12. A priori discretization quality metrics for distributed hydrologic modeling applications

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Tolson, Bryan; Craig, James; Shafii, Mahyar; Basu, Nandita

    2016-04-01

    In distributed hydrologic modelling, a watershed is treated as a set of small homogeneous units that address the spatial heterogeneity of the watershed being simulated. The ability of models to reproduce observed spatial patterns firstly depends on the spatial discretization, which is the process of defining homogeneous units in the form of grid cells, subwatersheds, or hydrologic response units etc. It is common for hydrologic modelling studies to simply adopt a nominal or default discretization strategy without formally assessing alternative discretization levels. This approach lacks formal justifications and is thus problematic. More formalized discretization strategies are either a priori or a posteriori with respect to building and running a hydrologic simulation model. A posteriori approaches tend to be ad-hoc and compare model calibration and/or validation performance under various watershed discretizations. The construction and calibration of multiple versions of a distributed model can become a seriously limiting computational burden. Current a priori approaches are more formalized and compare overall heterogeneity statistics of dominant variables between candidate discretization schemes and input data or reference zones. While a priori approaches are efficient and do not require running a hydrologic model, they do not fully investigate the internal spatial pattern changes of variables of interest. Furthermore, the existing a priori approaches focus on landscape and soil data and do not assess impacts of discretization on stream channel definition even though its significance has been noted by numerous studies. The primary goals of this study are to (1) introduce new a priori discretization quality metrics considering the spatial pattern changes of model input data; (2) introduce a two-step discretization decision-making approach to compress extreme errors and meet user-specified discretization expectations through non-uniform discretization threshold

  13. Effects of a priori liking on the elicitation of mimicry.

    PubMed

    Stel, Mariëlle; van Baaren, Rick B; Blascovich, Jim; van Dijk, Eric; McCall, Cade; Pollmann, Monique M H; van Leeuwen, Matthijs L; Mastop, Jessanne; Vonk, Roos

    2010-01-01

    Mimicry and prosocial feelings are generally thought to be positively related. However, the conditions under which mimicry and liking are related largely remain unspecified. We advance this specification by examining the relationship between mimicry and liking more thoroughly. In two experiments, we manipulated an individual's a priori liking for another and investigated whether it influenced mimicry of that person. Our experiments demonstrate that in the presence of a reason to like a target, automatic mimicry is increased. However, mimicry did not decrease when disliking a target. These studies provide further evidence of a link between mimicry and liking and extend previous research by showing that a certain level of mimicry even occurs when mimicry behavior is inconsistent with one's goals or motivations.

  14. A priori physicalism, lonely ghosts and Cartesian doubt.

    PubMed

    Goff, Philip

    2012-06-01

    A zombie is a physical duplicate of a human being which lacks consciousness. A ghost is a phenomenal duplicate of a human being whose nature is exhausted by consciousness. Discussion of zombie arguments, that is, anti-physicalist arguments which appeal to the conceivability of zombies, is familiar in the philosophy of mind literature, whilst ghostly arguments, that is, anti-physicalist arguments which appeal to the conceivability of ghosts, are somewhat neglected. In this paper I argue that ghostly arguments have a number of dialectical advantages over zombie arguments. I go on to explain how the conceivability of ghosts is inconsistent with two kinds of a priori physicalism: analytic functionalism and the Australian physicalism of Armstrong and Lewis. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. A priori discretization error metrics for distributed hydrologic modeling applications

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Tolson, Bryan A.; Craig, James R.; Shafii, Mahyar

    2016-12-01

    Watershed spatial discretization is an important step in developing a distributed hydrologic model. A key difficulty in the spatial discretization process is maintaining a balance between the aggregation-induced information loss and the increase in computational burden caused by the inclusion of additional computational units. Objective identification of an appropriate discretization scheme still remains a challenge, in part because of the lack of quantitative measures for assessing discretization quality, particularly prior to simulation. This study proposes a priori discretization error metrics to quantify the information loss of any candidate discretization scheme without having to run and calibrate a hydrologic model. These error metrics are applicable to multi-variable and multi-site discretization evaluation and provide directly interpretable information to the hydrologic modeler about discretization quality. The first metric, a subbasin error metric, quantifies the routing information loss from discretization, and the second, a hydrological response unit (HRU) error metric, improves upon existing a priori metrics by quantifying the information loss due to changes in land cover or soil type property aggregation. The metrics are straightforward to understand and easy to recode. Informed by the error metrics, a two-step discretization decision-making approach is proposed with the advantage of reducing extreme errors and meeting the user-specified discretization error targets. The metrics and decision-making approach are applied to the discretization of the Grand River watershed in Ontario, Canada. Results show that information loss increases as discretization gets coarser. Moreover, results help to explain the modeling difficulties associated with smaller upstream subbasins since the worst discretization errors and highest error variability appear in smaller upstream areas instead of larger downstream drainage areas. Hydrologic modeling experiments under
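
    One way to make the "information loss from aggregation" idea concrete is an area-weighted mismatch fraction: the share of watershed area whose land-cover or soil class differs from the dominant class of the discretization unit it is aggregated into. This is a hypothetical illustration in the spirit of the paper's HRU error metric, not the metric itself; the function name and interface are invented.

```python
import numpy as np

def aggregation_info_loss(classes, unit_ids, areas=None):
    """Area fraction of cells whose attribute class differs from the dominant
    class of their discretization unit (0 = lossless, 1 = total loss)."""
    classes = np.asarray(classes)
    unit_ids = np.asarray(unit_ids)
    areas = np.ones(classes.size) if areas is None else np.asarray(areas, float)
    lost = 0.0
    for u in np.unique(unit_ids):
        m = unit_ids == u
        # dominant class by area within the unit
        cls, inv = np.unique(classes[m], return_inverse=True)
        w = np.bincount(inv, weights=areas[m])
        dominant = cls[np.argmax(w)]
        lost += areas[m][classes[m] != dominant].sum()
    return lost / areas.sum()
```
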

  16. Precise regional baseline estimation using a priori orbital information

    NASA Technical Reports Server (NTRS)

    Lindqwister, Ulf J.; Lichten, Stephen M.; Blewitt, Geoffrey

    1990-01-01

    A solution using GPS measurements acquired during the CASA Uno campaign has resulted in 3-4 mm horizontal daily baseline repeatability and 13 mm vertical repeatability for a 729 km baseline, located in North America. The agreement with VLBI is at the level of 10-20 mm for all components. The results were obtained with the GIPSY orbit determination and baseline estimation software and are based on five single-day data arcs spanning 20, 21, 25, 26, and 27 January 1988. The estimation strategy included resolving the carrier phase integer ambiguities, utilizing an optimal set of fixed reference stations, and constraining GPS orbit parameters by applying a priori information. A multiday GPS orbit and baseline solution has yielded similar 2-4 mm horizontal daily repeatabilities for the same baseline, consistent with the constrained single-day arc solutions. The application of weak constraints to the orbital state for single-day data arcs produces solutions which approach the precise orbits obtained with unconstrained multiday arc solutions.

  17. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2013-07-01

    The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families. Their modelling experience differed largely. The prediction exercise was organized in three steps: (1) for the 1st prediction modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than usually available to a priori predictions in ungauged catchments). They did not obtain time series of stream flow, soil moisture or groundwater response. (2) Before the 2nd improved prediction they inspected the catchment on-site and attended a workshop where the modellers presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modellers' decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3 when the modellers made use of additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd steps, the progress in prediction quality could be evaluated in relation to individual modelling experience and costs of added information.
We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing

  18. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the needed records for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their prediction in three steps based on adding information prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) to simulate discharge for an ungauged catchment and (ii) to use models that were developed for catchments which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, nor groundwater response and had therefore to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information.
For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  19. Using a priori knowledge for developing bolometric tomography in toroidal devices

    NASA Astrophysics Data System (ADS)

    Sano, Ryuichi; Peterson, Byron J.; Mukai, Kiyofumi; Teranishi, Masaru; Iwama, Naofumi; Kobayashi, Masahiro

    2016-11-01

    In tomographic imaging of magnetically confined toroidal plasmas, a countermeasure against missing observations has been studied in terms of the adoption of prior information based on modelled plasma profiles. The Tikhonov regularization for image reconstruction is extended by the use of the Euclidean distance. A procedure of model fitting is designed in order to adaptively generate the reference image. The new method is tested on a typical example of ill-conditioned tomography, that is, the three-dimensional imaging-bolometer tomography in the Large Helical Device. It has been found that the new method is useful for diminishing artifacts and thus for better recognizing the radiation structure of the plasma.
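
    The reference-image extension of Tikhonov regularization described here amounts to penalizing the Euclidean distance to a model-based reference profile: minimize ||W j - h||^2 + lambda ||j - j_ref||^2. A minimal dense-matrix sketch (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def tikhonov_with_reference(W, h, j_ref, lam):
    """Solve min_j ||W j - h||^2 + lam * ||j - j_ref||^2 via the normal
    equations; lam -> 0 recovers least squares, lam -> inf returns j_ref."""
    n = W.shape[1]
    A = W.T @ W + lam * np.eye(n)
    return j_ref + np.linalg.solve(A, W.T @ (h - W @ j_ref))
```
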

  20. Automated cleaning and uncertainty attribution of archival bathymetry based on a priori knowledge

    NASA Astrophysics Data System (ADS)

    Ladner, Rodney Wade; Elmore, Paul; Perkins, A. Louise; Bourgeois, Brian; Avera, Will

    2017-03-01

    Hydrographic offices hold large valuable historic bathymetric data sets, many of which were collected using older generation survey systems that contain little or no metadata and/or uncertainty estimates. These bathymetric data sets generally contain large outlier (errant) data points to clean, yet standard practice does not include rigorous automated procedures for systematic cleaning of these historical data sets and their subsequent conversion into reusable data formats. In this paper, we propose an automated method for this task. We utilize statistically diverse threshold tests, including a robust least trimmed squared method, to clean the data. We use LOESS weighted regression residuals together with a Student-t distribution to attribute uncertainty for each retained sounding; the resulting uncertainty values compare favorably with native estimates of uncertainty from co-located data sets which we use to estimate a point-wise goodness-of-fit measure. Storing a cleansed validated data set augmented with uncertainty in a re-usable format provides the details of this analysis for subsequent users. Our test results indicate that the method significantly improves the quality of the data set while concurrently providing confidence interval estimates and point-wise goodness-of-fit estimates as referenced to current hydrographic practices.
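
    The cleaning step can be illustrated with a deliberately simplified robust threshold test; the paper's actual pipeline uses a suite of statistically diverse tests (including least trimmed squares) and LOESS residuals with a Student-t model, which this sketch does not reproduce.

```python
import numpy as np

def clean_soundings(depths, k=3.5):
    """Flag soundings whose robust z-score (median / MAD) exceeds k; a
    minimal stand-in for the automated outlier cleaning of archival
    bathymetry described in the abstract."""
    depths = np.asarray(depths, float)
    med = np.median(depths)
    mad = np.median(np.abs(depths - med))
    scale = 1.4826 * mad if mad > 0 else np.finfo(float).eps
    keep = np.abs(depths - med) / scale <= k
    return depths[keep], keep
```
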

  1. Predicting folding-unfolding transitions in proteins without a priori knowledge of the folded state

    NASA Astrophysics Data System (ADS)

    Okan, Osman; Turgut, Deniz; Garcia, Angel; Ozisik, Rahmi

    2013-03-01

    The common computational method of studying folding transitions in proteins is to compare simulated conformations against the folded structure, but this method obviously requires the folded structure to be known beforehand. In the current study, we show that the use of the bond orientational order parameter (BOOP) Ql [Steinhardt PJ, Nelson DR, Ronchetti M, Phys. Rev. B 1983, 28, 784] is a viable alternative to the commonly adopted root mean squared distance (RMSD) measure in probing conformational transitions. Replica exchange molecular dynamics simulations of the trp-cage protein (with 20 residues) in TIP-3P water were used to compare BOOP against RMSD. The results indicate that the correspondence between BOOP and RMSD time series becomes stronger with increasing l. We finally show that robust linear models that incorporate different Ql can be parameterized from a given replica run and can be used to study other replica trajectories. This work is partially supported by NSF DUE-1003574.
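
    The order parameter itself is standard: Q_l = sqrt( (4*pi/(2l+1)) * sum_m |<Y_lm>|^2 ), with the spherical harmonics averaged over bond vectors. A compact implementation (assuming SciPy's spherical-harmonic argument convention):

```python
import numpy as np
from scipy.special import sph_harm

def boop_ql(vectors, l):
    """Global bond orientational order parameter Q_l (Steinhardt et al. 1983)
    computed over a set of bond vectors."""
    v = np.asarray(vectors, float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)
    theta = np.arctan2(v[:, 1], v[:, 0]) % (2 * np.pi)   # azimuthal angle
    phi = np.arccos(np.clip(v[:, 2], -1.0, 1.0))         # polar angle
    q = 0.0
    for m in range(-l, l + 1):
        ylm = sph_harm(m, l, theta, phi)                 # SciPy order: (m, l, az, polar)
        q += abs(ylm.mean()) ** 2
    return np.sqrt(4 * np.pi / (2 * l + 1) * q)
```

For example, the six nearest-neighbour bonds of a simple cubic lattice give the well-known value Q4 = sqrt(7/12) ≈ 0.764, while Q2 vanishes by cubic symmetry.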

  2. A no a priori knowledge estimation of the impulse response for satellite image noise reduction

    NASA Astrophysics Data System (ADS)

    Benbouzid, A. B.; Taleb, N.

    2015-04-01

    Due to launching vibrations and the harsh space environment, high-resolution remote sensing satellite imaging systems require permanent assessment and control of image quality, which may vary between ground pre-launch measurements, after launch and over satellite lifetime. In order to mitigate noise, remove artifacts and enhance image interpretability, the Point Spread Function (PSF) of the imaging system is estimated. Image deconvolution can be performed through the characterization of the actual Modulation Transfer Function (MTF) of the imaging system. In this work we focus on adapting and applying a no-reference method to characterize in-orbit high resolution satellite images in terms of geometrical performance. Moreover, we use natural details contained in images, such as edge transitions, to estimate the impulse response via the assessment of the MTF. The obtained results are encouraging and promising.
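
    A common way to estimate an MTF from a natural edge transition, in the spirit of this approach, is the edge-spread-function route: differentiate the edge profile to get the line-spread function, then take the normalized FFT magnitude. This sketch assumes a 1-D, already-extracted edge profile and omits the sub-pixel oversampling a production slanted-edge method would use.

```python
import numpy as np

def mtf_from_edge(edge_profile):
    """MTF estimate from an edge-spread function (ESF): LSF = d(ESF)/dx,
    MTF = |FFT(LSF)| normalized to unity at DC."""
    esf = np.asarray(edge_profile, float)
    lsf = np.gradient(esf)
    lsf = lsf * np.hanning(lsf.size)      # taper to limit spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]
```

A sharp edge should retain more high-frequency contrast than a blurred one, which gives a quick sanity check.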

  3. Automated cleaning and uncertainty attribution of archival bathymetry based on a priori knowledge

    NASA Astrophysics Data System (ADS)

    Ladner, Rodney Wade; Elmore, Paul; Perkins, A. Louise; Bourgeois, Brian; Avera, Will

    2017-09-01

    Hydrographic offices hold large valuable historic bathymetric data sets, many of which were collected using older generation survey systems that contain little or no metadata and/or uncertainty estimates. These bathymetric data sets generally contain large outlier (errant) data points to clean, yet standard practice does not include rigorous automated procedures for systematic cleaning of these historical data sets and their subsequent conversion into reusable data formats. In this paper, we propose an automated method for this task. We utilize statistically diverse threshold tests, including a robust least trimmed squared method, to clean the data. We use LOESS weighted regression residuals together with a Student-t distribution to attribute uncertainty for each retained sounding; the resulting uncertainty values compare favorably with native estimates of uncertainty from co-located data sets which we use to estimate a point-wise goodness-of-fit measure. Storing a cleansed validated data set augmented with uncertainty in a re-usable format provides the details of this analysis for subsequent users. Our test results indicate that the method significantly improves the quality of the data set while concurrently providing confidence interval estimates and point-wise goodness-of-fit estimates as referenced to current hydrographic practices.

  4. Validating Affordances as an Instrument for Design and a Priori Analysis of Didactical Situations in Mathematics

    ERIC Educational Resources Information Center

    Sollervall, Håkan; Stadler, Erika

    2015-01-01

    The aim of the presented case study is to investigate how coherent analytical instruments may guide the a priori and a posteriori analyses of a didactical situation. In the a priori analysis we draw on the notion of affordances, as artefact-mediated opportunities for action, to construct hypothetical trajectories of goal-oriented actions that have…

  5. Testing the plausibility of several a priori assumed error distributions for discharge measurements

    NASA Astrophysics Data System (ADS)

    Van Eerdenbrugh, Katrien; Verhoest, Niko E. C.

    2017-04-01

    Hydrologic measurements are used for a variety of research topics and operational projects. Regardless of the application, it is important to account for measurement uncertainty. In many projects, no local information is available about this uncertainty. Therefore, error distributions and accompanying parameters or uncertainty boundaries are often taken from literature without any knowledge about their applicability in the new context. In this research, an approach is proposed that uses relative differences between simultaneous discharge measurements to test the plausibility of several a priori assumed error distributions. For this test, simultaneous discharge measurements (measured with one type of device) from nine different Belgian rivers were available. This implies the assumption that their error distribution does not depend upon river, measurement location and measurement team. Moreover, it is assumed that the errors of two simultaneous measurements are not mutually dependent. This data set does not allow for a direct assessment of measurement errors. However, independently of the value of the real discharge, the relative difference between two simultaneous measurements can be expressed by their relative measurement errors. If a distribution is assumed for these errors, it is thus possible to test equality between the distributions of both the relative differences of the simultaneously measured discharge pairs and a created set of relative differences based on two equally sized samples of measurement errors from the assumed distribution. If the assumed error distribution is correct, these two data sets will have the same distribution. In this research, equality is tested with a two-sample nonparametric Kolmogorov-Smirnov test. The resulting p-value and the corresponding value of the Kolmogorov-Smirnov statistic (KS statistic) are used for this evaluation. The occurrence of a high p-value (and corresponding small value of the KS statistic) provides no
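
    The testing procedure can be sketched end to end: form relative differences of the measured pairs, synthesize relative differences from two equally sized error samples drawn from the assumed distribution, and compare the two sets with a two-sample Kolmogorov-Smirnov test. Function and argument names are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import ks_2samp

def plausibility_pvalue(q1, q2, error_sampler, seed=0):
    """p-value of a two-sample KS test between observed relative differences
    of simultaneous discharge pairs and relative differences synthesized from
    the assumed relative-error distribution. error_sampler(rng, n) must
    return n relative errors drawn from the assumed distribution."""
    rng = np.random.default_rng(seed)
    q1, q2 = np.asarray(q1, float), np.asarray(q2, float)
    observed = (q1 - q2) / ((q1 + q2) / 2.0)
    e1, e2 = error_sampler(rng, q1.size), error_sampler(rng, q1.size)
    # with q_i = Q (1 + e_i): (q1 - q2) / mean = (e1 - e2) / (1 + (e1 + e2)/2)
    synthetic = (e1 - e2) / (1.0 + (e1 + e2) / 2.0)
    return ks_2samp(observed, synthetic).pvalue
```
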

  6. Reconstruction of Consistent 3d CAD Models from Point Cloud Data Using a Priori CAD Models

    NASA Astrophysics Data System (ADS)

    Bey, A.; Chaine, R.; Marc, R.; Thibault, G.; Akkouche, S.

    2011-09-01

    We address the reconstruction of 3D CAD models from point cloud data acquired in industrial environments, using a pre-existing 3D model as an initial estimate of the scene to be processed. Indeed, this prior knowledge can be used to drive the reconstruction so as to generate an accurate 3D model matching the point cloud. We focus in particular on the cylindrical parts of the 3D models. We propose to state the problem in a probabilistic framework: we have to search for the 3D model which maximizes some probability taking several constraints into account, such as the relevancy with respect to the point cloud and the a priori 3D model, and the consistency of the reconstructed model. The resulting optimization problem can then be handled using a stochastic exploration of the solution space, based on the random insertion of elements in the configuration under construction, coupled with a greedy management of the conflicts which efficiently improves the configuration at each step. We show that this approach provides reliable reconstructed 3D models by presenting some results on industrial data sets.

  7. Acoustic attenuation imaging of tissue bulk properties with a priori information

    PubMed Central

    Hooi, Fong Ming; Kripfgans, Oliver; Carson, Paul L.

    2016-01-01

    Attenuation of ultrasound waves traversing a medium is not only a result of absorption and scattering within a given tissue, but also of coherent scattering, including diffraction, refraction, and reflection of the acoustic wave at tissue boundaries. This leads to edge enhancement and other artifacts in most reconstruction algorithms, other than 3D wave migration with currently impractical implementations. The presented approach accounts for energy loss at tissue boundaries by normalizing data based on the variable sound speed, and potentially the density, of the medium using a k-space wave solver. Coupled with a priori knowledge of major sound speed distributions, physical attenuation values within broad ranges, and the assumption of homogeneity within segmented regions, an attenuation image representative of region bulk properties is constructed by solving a penalized weighted least squares optimization problem. This is in contradistinction to absorption or to a conventional attenuation coefficient based on overall insertion loss with strong dependence on sound speed and impedance mismatches at tissue boundaries. This imaged property will be referred to as the bulk attenuation coefficient. The algorithm is demonstrated on an opposed array setup, with mean-squared-error improvements from 0.6269 to 0.0424 (dB/cm/MHz)^2 for a cylindrical phantom, and 0.1622 to 0.0256 (dB/cm/MHz)^2 for a windowed phantom. PMID:27914403
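
    With homogeneity assumed inside segmented regions, the unknowns reduce to one bulk attenuation coefficient per region, and the optimization becomes a small penalized weighted least squares solve in which each ray contributes its path length through each region. The ridge penalty and all names below are illustrative simplifications of the paper's formulation, not its actual model.

```python
import numpy as np

def bulk_attenuation(path_lengths, losses, weights, beta=1e-3):
    """Penalized weighted least squares for region bulk attenuation:
    losses ~ path_lengths @ a, with per-ray weights and a small ridge
    penalty beta for stability.
    path_lengths: (n_rays, n_regions); losses, weights: (n_rays,)."""
    M = np.asarray(path_lengths, float)
    Wd = np.diag(np.asarray(weights, float))
    A = M.T @ Wd @ M + beta * np.eye(M.shape[1])
    return np.linalg.solve(A, M.T @ Wd @ np.asarray(losses, float))
```
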

  8. Use of a priori statistics to minimize acquisition time for RFI immune spread spectrum systems

    NASA Technical Reports Server (NTRS)

    Holmes, J. K.; Woo, K. T.

    1978-01-01

    The optimum acquisition sweep strategy was determined for a PN code despreader when the a priori probability density function was not uniform. A pseudo-noise spread-spectrum system was considered which could be utilized in the DSN to combat radio frequency interference. In a sample case, when the a priori probability density function was Gaussian, the acquisition time was reduced by about 41% compared to a uniform sweep approach.
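
    The benefit of a non-uniform prior can be illustrated by comparing expected acquisition time, in cell dwells, for a sweep ordered by descending prior probability versus a plain end-to-end sweep. This toy model assumes certain detection on each dwell, which the real analysis does not; all names and numbers here are illustrative.

```python
import numpy as np

def expected_sweep_time(prior, order):
    """Expected number of cell dwells before reaching the true code phase,
    when cells are examined in the given order and the true phase follows
    the given a priori probabilities (detection assumed certain per dwell)."""
    prior = np.asarray(prior, float)
    ranks = np.empty(prior.size)
    ranks[np.asarray(order)] = np.arange(1, prior.size + 1)
    return float(np.sum(prior * ranks))

# Gaussian-shaped prior over code-phase cells: sweeping highest-probability
# cells first beats a plain left-to-right sweep.
n = 201
cells = np.arange(n)
p = np.exp(-0.5 * ((cells - n // 2) / 15.0) ** 2)
p /= p.sum()
t_prior = expected_sweep_time(p, np.argsort(-p))   # descending-probability sweep
t_uniform = expected_sweep_time(p, cells)          # plain sweep from one end
```
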

  9. An algorithm for finding biologically significant features in microarray data based on a priori manifold learning.

    PubMed

    Hira, Zena M; Trigeorgis, George; Gillies, Duncan F

    2014-01-01

    Microarray databases are a large source of genetic data, which, upon proper analysis, could enhance our understanding of biology and medicine. Many microarray experiments have been designed to investigate the genetic mechanisms of cancer, and analytical approaches have been applied in order to classify different types of cancer or distinguish between cancerous and non-cancerous tissue. However, microarrays are high-dimensional datasets with high levels of noise and this causes problems when using machine learning methods. A popular approach to this problem is to search for a set of features that will simplify the structure and to some degree remove the noise from the data. The most widely used approach to feature extraction is principal component analysis (PCA), which assumes a multivariate Gaussian model of the data. More recently, non-linear methods have been investigated. Among these, manifold learning algorithms, for example Isomap, aim to project the data from a higher dimensional space onto a lower-dimensional one. We have proposed a priori manifold learning for finding a manifold in which a representative set of microarray data is fused with relevant data taken from the KEGG pathway database. Once the manifold has been constructed, the raw microarray data is projected onto it and clustering and classification can take place. In contrast to earlier fusion-based methods, the prior knowledge from the KEGG databases is not used in, and does not bias, the classification process; it merely acts as an aid to find the best space in which to search the data. In our experiments we have found that using our new manifold method gives better classification results than using either PCA or conventional Isomap.
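
    A minimal version of the idea, fusing prior per-sample features into the metric before running a classical Isomap (k-NN graph, geodesic distances, then MDS), might look like this. `prior_weight` is a made-up knob, and the paper's actual KEGG-based construction is not reproduced here.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import pdist, squareform

def isomap_with_prior(X, prior, n_neighbors=5, n_components=2, prior_weight=1.0):
    """Classical Isomap on features augmented with prior knowledge: build a
    k-NN graph on the fused features, take geodesic (shortest-path)
    distances, then embed them with classical MDS."""
    Z = np.hstack([X, prior_weight * prior])
    D = squareform(pdist(Z))
    n = D.shape[0]
    G = np.full_like(D, np.inf)              # inf marks "no edge"
    for i in range(n):
        nn = np.argsort(D[i])[1:n_neighbors + 1]
        G[i, nn] = D[i, nn]
    G = np.minimum(G, G.T)                   # symmetrize the graph
    geo = shortest_path(G, directed=False)   # geodesic distance matrix
    # classical MDS on the geodesic distances
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (geo ** 2) @ J
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:n_components]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```
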

  10. Lost in space: Onboard star identification using CCD star tracker data without an a priori attitude

    NASA Technical Reports Server (NTRS)

    Ketchum, Eleanor A.; Tolson, Robert H.

    1993-01-01

    There are many algorithms in use today which determine spacecraft attitude by identifying stars in the field of view of a star tracker. Some methods, which date from the early 1960's, compare the angular separation between observed stars with a small catalog. In the last 10 years, several methods have been developed which speed up the process and reduce the amount of memory needed, a key element of onboard attitude determination. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Although the Sun and magnetic field generally provide the necessary coarse attitude information, there are occasions when a spacecraft could get lost when it is not prudent to wait for sunlight. Also, the possibility of efficient attitude determination using only the highly accurate CCD star tracker could lead to fully autonomous spacecraft attitude determination. The need for redundant coarse sensors could thus be eliminated at substantial cost reduction. Some groups have extended their algorithms to implement a computation-intensive full sky scan. Some require large databases. Both storage and speed are concerns for autonomous onboard systems. Neural network technology is even being explored by some as a possible solution, but because of the limited number of patterns that can be stored and large overhead, nothing concrete has resulted from these efforts. This paper presents an algorithm which, by discretizing the sky and filtering by the visual magnitude of the brightest observed star, speeds up the lost-in-space star identification process while reducing the amount of necessary onboard computer storage compared to existing techniques.
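
    The discretize-and-filter idea can be sketched as a hash of the catalog keyed by coarse sky cell and magnitude band, so a lost-in-space query only touches stars of similar brightness within one cell instead of the whole catalog. Bin sizes and function names here are illustrative, not those of the paper.

```python
import numpy as np

def build_catalog_index(ra, dec, mag, n_ra=24, n_dec=12, mag_step=1.0):
    """Bucket catalog stars by (RA cell, Dec cell, magnitude band)."""
    index = {}
    for i in range(len(ra)):
        key = (int(ra[i] // (360 / n_ra)),
               int((dec[i] + 90) // (180 / n_dec)),
               int(mag[i] // mag_step))
        index.setdefault(key, []).append(i)
    return index

def candidates(index, ra, dec, mag, n_ra=24, n_dec=12, mag_step=1.0, mag_tol=1):
    """Candidate catalog indices for one observed star: same sky cell,
    within +/- mag_tol magnitude bands of the observed brightness."""
    cell = (int(ra // (360 / n_ra)), int((dec + 90) // (180 / n_dec)))
    out = []
    band0 = int(mag // mag_step)
    for band in range(band0 - mag_tol, band0 + mag_tol + 1):
        out += index.get((cell[0], cell[1], band), [])
    return out
```
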

  12. The constitutive a priori and the distinction between mathematical and physical possibility

    NASA Astrophysics Data System (ADS)

    Everett, Jonathan

    2015-11-01

    This paper is concerned with Friedman's recent revival of the notion of the relativized a priori. It is particularly concerned with addressing the question as to how Friedman's understanding of the constitutive function of the a priori has changed since his defence of the idea in his Dynamics of Reason. Friedman's understanding of the a priori remains influenced by Reichenbach's initial defence of the idea; I argue that this notion of the a priori does not naturally lend itself to describing the historical development of space-time physics. Friedman's analysis of the role of the rotating frame thought experiment in the development of general relativity - which he suggests made the mathematical possibility of four-dimensional space-time a genuine physical possibility - has a central role in his argument. I analyse this thought experiment and argue that it is better understood by following Cassirer and placing emphasis on regulative principles. Furthermore, I argue that Cassirer's Kantian framework enables us to capture Friedman's key insights into the nature of the constitutive a priori.

  13. Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes

    PubMed Central

    D’Alessandro, Annunziata; De Pergola, Giovanni

    2015-01-01

    The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but they show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently. PMID:26389950

  14. Bayesian classification of polarimetric SAR images using adaptive a priori probabilities

    NASA Technical Reports Server (NTRS)

    Van Zyl, J. J.; Burnette, C. F.

    1992-01-01

    The problem of classifying earth terrain by observed polarimetric scattering properties is tackled with an iterative Bayesian scheme that uses a priori probabilities adaptively. The first classification is based on fixed, not necessarily equal, a priori probabilities, and successive iterations change the a priori probabilities adaptively. The approach is applied to an SAR image in which a single water body covers 10 percent of the image area. The classification accuracies for ocean, urban, and vegetated areas and for the total area increase, and the percentage of reclassified pixels decreases greatly as the iteration number increases. The iterative scheme is found to improve the a posteriori classification accuracy of maximum likelihood classifiers by iteratively exploiting the local homogeneity in polarimetric SAR images. A few iterations can improve the classification accuracy significantly without sacrificing key high-frequency detail or edges in the image.
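The iterative scheme described above, classify with maximum posterior, then replace the priors with local class frequencies and repeat, can be illustrated on a toy two-class image. The likelihoods, window size, and iteration count below are invented values, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)
H = W = 32
truth = (np.arange(W) < W // 2)[None, :] * np.ones((H, 1), bool)  # left half = class 1

# Per-pixel likelihoods p(data | class) for two classes, with noise
lik = np.empty((2, H, W))
lik[1] = np.where(truth, 0.7, 0.3) + 0.25 * rng.random((H, W)) - 0.125
lik[0] = 1.0 - lik[1]

prior = np.full((2, H, W), 0.5)        # first pass: fixed equal priors
for _ in range(5):
    post = lik * prior
    post /= post.sum(axis=0, keepdims=True)
    labels = post.argmax(axis=0)
    # Adapt priors: fraction of class-1 labels in a 5x5 neighborhood
    pad = np.pad(labels.astype(float), 2, mode="edge")
    frac = np.zeros((H, W))
    for dy in range(5):
        for dx in range(5):
            frac += pad[dy:dy + H, dx:dx + W]
    frac /= 25.0
    prior = np.stack([1.0 - frac, frac]).clip(0.05, 0.95)

accuracy = (labels == truth).mean()
print(f"accuracy after adaptation: {accuracy:.2f}")
```

Because the adapted priors reflect local homogeneity, isolated misclassifications are reclassified while genuine class boundaries (here, the vertical edge) are preserved.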

  15. Incorporation of a priori gravity field information in satellite orbit determination using bin parameters

    NASA Technical Reports Server (NTRS)

    Wu, Jiun-Tsong; Wu, Sien-Chong

    1992-01-01

    A method to determine satellite orbits using tracking data and an a priori gravitational field is described. The a priori constraint on the orbit dynamics is determined by the covariance matrix of the spherical harmonic coefficients of the gravity model, so that the optimal combination of the measurements and the gravitational field is achieved. A set of bin parameters is introduced to represent the perturbation of the gravitational field on the position of the satellite orbit. The covariance matrix of a conventional gravity model is transformed into that of the bin parameters by the variational partial derivatives. The covariance matrices of the bin parameters and the epoch state are combined to form the covariance matrix of the satellite positions at the measurement times. The combined matrix is used as the a priori information to estimate the satellite positions from the measurements.
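The covariance bookkeeping in the abstract is a chain of linear propagations. The sketch below shows the generic pattern, transform a coefficient covariance through a matrix of partials and combine it with the epoch-state contribution; all matrices are random toy stand-ins for the real variational partial derivatives:

```python
import numpy as np

rng = np.random.default_rng(3)
n_coeff, n_bins, n_state = 8, 5, 6

# A priori covariance of spherical-harmonic coefficients (SPD by construction)
A = rng.normal(size=(n_coeff, n_coeff))
P_coeff = A @ A.T + n_coeff * np.eye(n_coeff)

J = rng.normal(size=(n_bins, n_coeff))   # variational partials d(bin)/d(coeff)
P_bins = J @ P_coeff @ J.T               # covariance of the bin parameters

S = rng.normal(size=(n_bins, n_state))   # partials d(position)/d(epoch state)
P_state = np.eye(n_state)                # epoch-state covariance (toy)

# Combined a priori covariance of satellite positions at measurement times
P_pos = P_bins + S @ P_state @ S.T
assert np.allclose(P_pos, P_pos.T)       # symmetric, as a covariance must be
print("min eigenvalue:", np.linalg.eigvalsh(P_pos).min())
```

`P_pos` would then weight the a priori positions against the tracking measurements in the orbit estimator.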

  18. Learning to improve medical decision making from imbalanced data without a priori cost.

    PubMed

    Wan, Xiang; Liu, Jiming; Cheung, William K; Tong, Tiejun

    2014-12-05

    In a medical data set, data are commonly composed of a minority (positive or abnormal) group and a majority (negative or normal) group, and the cost of misclassifying a minority sample as a majority sample is very high. This is the so-called imbalanced classification problem. Traditional classification functions can be seriously affected by the skewed class distribution in the data. To deal with this problem, an a priori cost is often used to adjust the learning process in pursuit of an optimal classification function. However, this a priori cost is often unknown and hard to estimate in medical decision making. In this paper, we propose a new learning method, named RankCost, to classify imbalanced medical data without using an a priori cost. Instead of focusing on improving the class-prediction accuracy, RankCost aims to maximize the difference between the minority class and the majority class by using a scoring function, which translates the imbalanced classification problem into a partial ranking problem. The scoring function is learned via a non-parametric boosting algorithm. We compare RankCost to several representative approaches on four medical data sets varying in size, imbalance ratio, and dimension. The experimental results demonstrate that unlike the currently available methods, which often perform unevenly under different a priori costs, RankCost shows comparable performance in a consistent manner. It is a challenging task to learn an effective classification model from imbalanced data in medical data analysis. Traditional approaches often use an a priori cost to adjust the learning of the classification function. This work presents a novel approach, namely RankCost, for learning from imbalanced medical data sets without using an a priori cost. The experimental results indicate that RankCost performs very well in imbalanced data classification and can be a useful method in real-world applications of medical decision making.
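The "partial ranking" reformulation, score minority samples above majority samples instead of weighting misclassification costs, can be shown with a deliberately simple surrogate. RankCost itself learns the scoring function by non-parametric boosting; the linear scorer and pairwise hinge loss below are only an assumed stand-in to make the formulation concrete:

```python
import numpy as np

rng = np.random.default_rng(7)
n_maj, n_min, d = 200, 20, 5                 # imbalanced toy data (10:1)
X_maj = rng.normal(0.0, 1.0, (n_maj, d))     # majority (normal) samples
X_min = rng.normal(0.8, 1.0, (n_min, d))     # minority (abnormal), shifted mean

w = np.zeros(d)                              # linear scoring function s(x) = w.x
lr = 0.01
for _ in range(200):
    s_min, s_maj = X_min @ w, X_maj @ w
    # Pairwise hinge: penalize minority scores not above majority by margin 1
    viol = (s_min[:, None] - s_maj[None, :]) < 1.0          # (n_min, n_maj)
    grad = (X_maj[None, :, :] - X_min[:, None, :])[viol].sum(axis=0)
    w -= lr * grad / (n_min * n_maj)

# Fraction of correctly ordered (minority, majority) pairs -- an AUC-like score
auc = ((X_min @ w)[:, None] > (X_maj @ w)[None, :]).mean()
print(f"pairwise ordering accuracy: {auc:.2f}")
```

No misclassification cost appears anywhere: the objective only cares about the relative ordering of the two classes, which is the point the abstract makes.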

  19. Allocating Sample Material to Increase the Precision of a Priori Contrasts.

    ERIC Educational Resources Information Center

    Clark, Sheldon B.; Huck, Schuyler W.

    In true experiments in which sample material can be randomly assigned to treatment conditions, most researchers presume that the condition of equal sample sizes is statistically desirable. When one or more a priori contrasts can be identified which represent a few overriding experimental concerns, however, allocating sample material unequally will…

  20. Simplified multi-track detection schemes using a priori information for bit patterned media recording

    NASA Astrophysics Data System (ADS)

    Kong, Gyuyeol; Choi, Sooyong

    2012-04-01

    Simplified multi-track detection schemes using a priori information for bit patterned magnetic recording (BPMR) are proposed in this paper. The proposed detection schemes adopt a simplified trellis diagram, use a priori information, and detect the main-track data in the along- and cross-track directions. The simplified trellis diagram, which has 4 states and 8 branches, can be obtained by setting the corner entries of the generalized partial response (GPR) target to zero and replacing the four parallel branches with a single branch. However, these simplified techniques suffer seriously from performance degradation in high-density BPMR channels. To overcome the performance degradation, a priori information is used to give higher reliability to the branch metric. In addition, to fully use the characteristics of channel detection with a two-dimensional (2D) GPR target, the proposed schemes estimate a priori information and detect the main-track data in the along- and cross-track directions by using a 2D equalizer with a 2D GPR target. The bit error rate performances of the proposed schemes are compared with previous detection schemes at an areal density of 3 Tb/in². Simulation results show that the proposed schemes, despite their simpler structures, achieve gains of more than 2 dB over the other detection schemes.

  1. Background and Attitude Questionnaire Items and A Priori Weights. Table 4.

    ERIC Educational Resources Information Center

    Michigan State Dept. of Education, Lansing. Research, Evaluation, and Assessment Services.

    Background and attitude questionnaire items used in the Michigan Educational Assessment battery to measure socioeconomic status and attitudes toward self, school, and the importance of school achievement are presented. A priori weights for item responses are provided. (For related document, see TM 002 329.) (KM)

  2. A priori L∞ estimates for solutions of a class of reaction-diffusion systems.

    PubMed

    Du, Zengji; Peng, Rui

    2016-05-01

    In this short paper, we establish a priori L∞-norm estimates for solutions of a class of reaction-diffusion systems which can be used to model the spread of infectious disease. The developed technique may find applications in other reaction-diffusion systems.

  3. Unequal a priori probability multiple hypothesis testing in space domain awareness with the space surveillance telescope.

    PubMed

    Hardy, Tyler; Cain, Stephen; Blake, Travis

    2016-05-20

    This paper investigates the ability to improve Space Domain Awareness (SDA) by increasing the number of detectable Resident Space Objects (RSOs) from space surveillance sensors. With matched-filter based techniques, the expected impulse response, or Point Spread Function (PSF), is compared against the received data. When the images are spatially undersampled, the modeled PSF may not match the received data if the RSO does not fall in the center of a pixel. This aliasing can be accounted for with a Multiple Hypothesis Test (MHT). Previously proposed MHTs have implemented the test under an equal a priori probability assumption. This paper investigates an unequal a priori probability MHT. To determine accurate a priori probabilities, three metrics are computed: correlation, physical distance, and an empirical metric. Using the calculated a priori probabilities, a new algorithm is developed, and images from the Space Surveillance Telescope (SST) are analyzed. The numbers of objects detected under equal and unequal prior probabilities are compared while keeping the false alarm rate constant. Any additional detected objects will help improve SDA capabilities.
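The core decision rule, weight each hypothesis' likelihood by an unequal a priori probability before taking the maximum posterior, can be sketched on a toy undersampled PSF. The sub-pixel shift hypotheses, priors, and noise level below are illustrative assumptions, not the SST values or the paper's metrics:

```python
import numpy as np

rng = np.random.default_rng(4)
SIGMA_N = 0.05   # assumed sensor noise level

def gaussian_psf(shift, size=5, sigma=1.0):
    """Sampled Gaussian PSF for a point source offset by `shift` within a pixel."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    return np.exp(-((x - shift[0]) ** 2 + (y - shift[1]) ** 2) / (2 * sigma**2))

shifts = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]   # hypotheses
priors = np.array([0.4, 0.25, 0.25, 0.1])                   # unequal a priori

# Observed frame: object shifted (0.5, 0.0) plus Gaussian noise
data = gaussian_psf((0.5, 0.0)) + SIGMA_N * rng.normal(size=(5, 5))

# Log posterior per hypothesis: log prior + Gaussian-noise log likelihood
scores = [np.log(p) - np.sum((data - gaussian_psf(s)) ** 2) / (2 * SIGMA_N**2)
          for s, p in zip(shifts, priors)]
best = int(np.argmax(scores))
print("selected hypothesis:", shifts[best])
```

With an equal-prior MHT the `np.log(p)` term would be constant and drop out; the unequal priors shift the decision boundary while leaving the false-alarm bookkeeping to the threshold choice.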

  4. Realism, functions, and the a priori: Ernst Cassirer's philosophy of science.

    PubMed

    Heis, Jeremy

    2014-12-01

    This paper presents the main ideas of Cassirer's general philosophy of science, focusing on the two aspects of his thought that, in addition to being the most central ideas in his philosophy of science, have received the most attention from contemporary philosophers of science: his theory of the a priori aspects of physical theory, and his relation to scientific realism.

  5. Quantitation of the a priori dosimetric capabilities of spatial points in inverse planning and its significant implication in defining IMRT solution space

    NASA Astrophysics Data System (ADS)

    Shou, Z.; Yang, Y.; Cotrutz, C.; Levy, D.; Xing, Lei

    2005-04-01

    In inverse planning, the likelihood for the points in a target or sensitive structure to meet their dosimetric goals is generally heterogeneous and represents the a priori knowledge of the system once the patient and beam configuration are chosen. Because of this intrinsic heterogeneity, in some extreme cases a region in a target may never meet the prescribed dose without seriously deteriorating the doses in other areas. Conversely, the prescription in a region may be easily met without violating the tolerance of any sensitive structure. In this work, we introduce the concept of dosimetric capability to quantify this a priori information and develop a strategy to integrate the data into the inverse planning process. An iterative algorithm is implemented to numerically compute the capability distribution on a case-specific basis. A method of incorporating the capability data into inverse planning is developed by heuristically modulating the importance of the individual voxels according to the a priori capability distribution. The formalism is applied to a few specific examples to illustrate the technical details of the new inverse planning technique. Our study indicates that the dosimetric capability is a useful concept for better understanding the complex inverse planning problem, and an effective use of the information allows us to construct a clinically more meaningful objective function to improve IMRT dose optimization techniques. Part of this work was presented at the 14th International Conference on the Use of Computers in Radiation Therapy, Seoul, Korea, 2004.

  6. Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information.

    PubMed

    Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Yan, Bin; Li, Jianxin

    2015-01-01

    Electroencephalogram (EEG) data are susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activity. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information acquired in advance. Subsequently, signal reconstruction without the artifact components was performed to obtain artifact-free signals. The results showed that this automatic online artifact removal method yielded statistically significant improvements in classification accuracy in both experiments, namely motor imagery and emotion recognition.
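The identify-and-remove step can be stripped down to plain ICA: unmix multi-channel signals, flag the component most correlated with an a priori artifact template, and reconstruct without it. The paper's wavelet decomposition stage (wavelet-ICA) is omitted here, and the signals, mixing matrix, and template are synthetic assumptions:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 2, 500)
neural = np.sin(2 * np.pi * 10 * t)            # stand-in 10 Hz brain rhythm
artifact = np.exp(-((t - 1.0) ** 2) / 0.01)    # blink-like transient
template = artifact.copy()                     # a priori artifact information

# Three channels, each a different mixture of the two sources plus noise
A = np.array([[1.0, 0.8], [0.6, 1.2], [1.1, 0.4]])
X = np.stack([neural, artifact]).T @ A.T + 0.01 * rng.normal(size=(500, 3))

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(X)                       # (500, 2) independent components

# Flag the component best matching the a priori template ...
corr = [abs(np.corrcoef(S[:, k], template)[0, 1]) for k in range(2)]
bad = int(np.argmax(corr))
# ... zero it, and reconstruct the artifact-free channels
S_clean = S.copy()
S_clean[:, bad] = 0.0
X_clean = ica.inverse_transform(S_clean)
print("removed component", bad)
```

The a priori template is what makes the identification automatic; without it, a human would have to inspect and label the components by eye.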

  7. A priori estimates, existence and non-existence for quasilinear cooperative elliptic systems

    SciTech Connect

    Zou, H

    2008-04-30

    Let m > 1 be a real number and let Ω ⊂ Rⁿ, n ≥ 2, be a connected smooth domain. Consider the system of quasi-linear elliptic differential equations div(|∇u|^(m-2)∇u) + f(u,v) = 0 in Ω, div(|∇v|^(m-2)∇v) + g(u,v) = 0 in Ω, where u ≥ 0, v ≥ 0, and f and g are real functions. Relations between Liouville non-existence and a priori estimates and existence on bounded domains are studied. Under appropriate conditions, a variety of results on a priori estimates, existence and non-existence of positive solutions have been established. Bibliography: 11 titles.

  8. Acquisition of a priori tissue optical structure based on non-rigid image registration

    NASA Astrophysics Data System (ADS)

    Wan, Wenbo; Li, Jiao; Liu, Lingling; Wang, Yihan; Zhang, Yan; Gao, Feng

    2015-03-01

    Shape-parameterized diffuse optical tomography (DOT), which is based on the a priori assumption that the optical properties are uniformly distributed within each region, has been shown to be effective for reconstructing the optical heterogeneities of complex biological tissue. The a priori tissue optical structure can be acquired with the assistance of anatomical imaging methods such as X-ray computed tomography (XCT), which, however, suffers from low contrast for soft tissues comprising regions of different optical characteristics. For the mouse model, a feasible strategy for a priori tissue optical structure acquisition is proposed based on a non-rigid image registration algorithm. During registration, a mapping matrix is calculated to elastically align the XCT image of the reference mouse to the XCT image of the target mouse. Applying the matrix to the reference atlas, a detailed mesh of the organs/tissues of the reference mouse, yields a registered atlas representing the anatomical structure of the target mouse. By assigning published optical parameters of each organ to the corresponding anatomical structure, the optical structure of the target organism can be obtained as a priori information for the DOT reconstruction algorithm. Applying the non-rigid image registration algorithm to a target mouse that was transformed from the reference mouse, the minimum correlation coefficient improved from 0.2781 (before registration) to 0.9032 (after fine registration), and the maximum average Euclidean distance decreased from 12.80 mm (before registration) to 1.02 mm (after fine registration), which verifies the effectiveness of the algorithm.

  9. A priori Estimates and Existence for Elliptic Systems via Bootstrap in Weighted Lebesgue Spaces

    NASA Astrophysics Data System (ADS)

    Quittner, P.; Souplet, Ph.

    2004-10-01

    We present a new general method to obtain regularity and a priori estimates for solutions of semilinear elliptic systems in bounded domains. This method is based on a bootstrap procedure, used alternately on each component, in the scale of weighted Lebesgue spaces L^p_δ(Ω) = L^p(Ω; δ(x) dx), where δ(x) is the distance to the boundary. Using this method, we significantly improve the known existence results for various classes of elliptic systems.

  10. Implications of genome wide association studies for addiction: Are our a priori assumptions all wrong?

    PubMed Central

    Hall, F. Scott; Drgonova, Jana; Jain, Siddharth; Uhl, George R.

    2013-01-01

    Substantial genetic contributions to addiction vulnerability are supported by data from twin studies, linkage studies, candidate gene association studies and, more recently, Genome Wide Association Studies (GWAS). Parallel to this work, animal studies have attempted to identify the genes that may contribute to responses to addictive drugs and addiction liability, initially focusing upon genes for the targets of the major drugs of abuse. These studies identified genes/proteins that affect responses to drugs of abuse; however, this does not necessarily mean that variation in these genes contributes to the genetic component of addiction liability. One of the major problems with initial linkage and candidate gene studies was an a priori focus on the genes thought to be involved in addiction based upon the known contributions of those proteins to drug actions, making the identification of novel genes unlikely. The GWAS approach is systematic and agnostic to such a priori assumptions. From the numerous GWAS now completed, several conclusions may be drawn: (1) addiction is highly polygenic, with each allelic variant contributing in a small, additive fashion to addiction vulnerability; (2) classes of genes that are unexpected, compared to our a priori assumptions, are most important in explaining addiction vulnerability; (3) although substantial genetic heterogeneity exists, there is substantial convergence of GWAS signals on particular genes. This review traces the history of this research: from initial transgenic mouse models based upon candidate gene and linkage studies, through the progression of GWAS for addiction and nicotine cessation, to the current human and transgenic mouse studies post-GWAS. PMID:23872493

  11. A priori data-driven multi-clustered reservoir generation algorithm for echo state network.

    PubMed

    Li, Xiumin; Zhong, Ling; Xue, Fangzheng; Zhang, Anguo

    2015-01-01

    Echo state networks (ESNs) with a multi-clustered reservoir topology perform better in reservoir computing and robustness than those with a random reservoir topology. However, these ESNs have a complex reservoir topology, which leads to difficulties in reservoir generation. This study focuses on the reservoir generation problem when an ESN is used in environments with sufficient a priori data available. Accordingly, an a priori data-driven multi-clustered reservoir generation algorithm is proposed. The a priori data in the proposed algorithm are used to evaluate reservoirs by calculating the precision and standard deviation of the ESNs. The reservoirs are produced using a clustering method; a reservoir replaces its predecessor only when it achieves a better evaluation. The final reservoir is obtained when its evaluation score reaches the preset requirement. Prediction experiments on the Mackey-Glass chaotic time series show that the proposed reservoir generation algorithm provides ESNs with extra prediction precision and increases the structural complexity of the network. Further experiments also reveal the appropriate values of the number of clusters and the time window size for optimal performance. The information entropy of the reservoir reaches its maximum when the ESN achieves its greatest precision.
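For readers unfamiliar with the architecture the abstract builds on, here is a minimal ESN: a fixed random reservoir driven by the input, with only a linear (ridge) readout trained. The multi-clustered topology and the data-driven reservoir search of the paper are not reproduced; all sizes and constants are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
n_res, T = 100, 600
u = np.sin(np.arange(T + 1) * 0.2)           # toy input time series

W_in = rng.uniform(-0.5, 0.5, n_res)         # input weights (fixed)
W = rng.normal(size=(n_res, n_res))          # random reservoir (fixed)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.9

# Drive the reservoir and collect its states
x = np.zeros(n_res)
states = np.empty((T, n_res))
for k in range(T):
    x = np.tanh(W @ x + W_in * u[k])
    states[k] = x

# Train only the readout: ridge regression predicting u[k+1] (skip washout)
washout = 100
Xs, y = states[washout:], u[washout + 1:T + 1]
ridge = 1e-6
W_out = np.linalg.solve(Xs.T @ Xs + ridge * np.eye(n_res), Xs.T @ y)
pred = Xs @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"one-step prediction RMSE: {rmse:.4f}")
```

Because only `W_out` is learned, the quality of the fixed reservoir matters greatly, which is why the paper's a priori data-driven reservoir generation pays off.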

  12. Effects of daily, high spatial resolution a priori profiles of satellite-derived NOx emissions

    NASA Astrophysics Data System (ADS)

    Laughner, J.; Zare, A.; Cohen, R. C.

    2016-12-01

    The current generation of space-borne NO2 column observations provides a powerful method of constraining NOx emissions, owing to the spatial resolution and global coverage afforded by the Ozone Monitoring Instrument (OMI). The greater resolution available from next-generation instruments such as TROPOMI, and the capabilities of the geosynchronous platforms TEMPO, Sentinel-4, and GEMS, will extend these capabilities further, but we must apply lessons learned from the current generation of retrieval algorithms to make the best use of these instruments. Here, we focus on the effect of the resolution of the a priori NO2 profiles used in the retrieval algorithms. We show that for an OMI retrieval, using daily high-resolution a priori profiles changes the retrieved VCDs by up to 40% compared to a retrieval using monthly average profiles at the same resolution. Further, comparing a retrieval with daily high-spatial-resolution a priori profiles to a more standard one, we show that the derived emissions increase by 100% when using the optimized retrieval.

  14. A priori mesh grading for the numerical calculation of the head-related transfer functions.

    PubMed

    Ziegelwanger, Harald; Kreuzer, Wolfgang; Majdak, Piotr

    2016-12-15

    Head-related transfer functions (HRTFs) describe the directional filtering of the incoming sound caused by the morphology of a listener's head and pinnae. When an accurate model of a listener's morphology exists, HRTFs can be calculated numerically with the boundary element method (BEM). However, the general recommendation to model the head and pinnae with at least six elements per wavelength renders the BEM a time-consuming procedure when calculating HRTFs for the full audible frequency range. In this study, a mesh preprocessing algorithm is proposed, viz., a priori mesh grading, which significantly reduces the computational costs of the HRTF calculation process. The mesh grading algorithm deliberately violates the recommendation of at least six elements per wavelength in certain regions of the head and pinnae and varies the size of elements gradually according to an a priori defined grading function. The evaluation of the algorithm involved HRTFs calculated for various geometric objects, including meshes of three human listeners, and various grading functions. The numerical accuracy and the predicted sound-localization performance of the calculated HRTFs were analyzed. A priori mesh grading appeared to be suitable for the numerical calculation of HRTFs in the full audible frequency range and outperformed uniform meshes in terms of numerical errors, perception-based predictions of sound-localization performance, and computational costs.
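The grading idea can be sketched as a target element size that honors six elements per wavelength near the acoustically critical region and grows gradually with distance from it. The linear growth law and its constant below are illustrative assumptions, not the grading functions evaluated in the paper:

```python
import numpy as np

C_SOUND = 343.0                    # speed of sound in air, m/s
F_MAX = 20_000.0                   # top of the audible range, Hz
H_FINE = C_SOUND / F_MAX / 6.0     # six elements per wavelength at 20 kHz

def target_edge_length(dist_from_pinna, growth=8.0):
    """A priori grading function: fine near the pinna, coarser with distance.

    dist_from_pinna -- distance in metres from the region kept fine
    growth          -- how fast elements may grow with distance (assumed value)
    """
    return H_FINE * (1.0 + growth * np.asarray(dist_from_pinna))

d = np.array([0.0, 0.05, 0.15])    # at the pinna, 5 cm away, 15 cm away
print(np.round(target_edge_length(d) * 1000, 2), "mm")
```

A mesh generator would then remesh the head geometry so each element's edge length tracks this target, deliberately exceeding the six-elements-per-wavelength size away from the pinna.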

  16. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    NASA Astrophysics Data System (ADS)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and the subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical-flow matching techniques from computer vision; an a priori estimate of sensor attitude is then computed from the GPS sensor positions contained in the video metadata using a photogrammetric, search-based structure-from-motion algorithm; and finally a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g., surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
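The Weighted Least Squares adjustment step mentioned above can be illustrated with a minimal sketch; the matrix sizes and accuracy values are invented for the example, and the real FMV-GTB adjustment over sensor metadata is far richer.

```python
import numpy as np

def weighted_least_squares(A, b, sigma):
    """One WLS adjustment step: observations b = A x + noise, with
    per-observation standard deviations sigma (a priori metadata accuracy).
    Returns the estimate and its covariance from rigorous error propagation."""
    W = np.diag(1.0 / sigma**2)                 # weight = inverse variance
    N = A.T @ W @ A                             # normal matrix
    x_hat = np.linalg.solve(N, A.T @ W @ b)
    cov = np.linalg.inv(N)                      # predicted accuracy of x_hat
    return x_hat, cov

# toy example: fit a line to 4 observations, the last one less accurate
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = np.array([0.1, 1.0, 2.1, 2.9])
x_hat, cov = weighted_least_squares(A, b, sigma=np.array([0.1, 0.1, 0.1, 0.5]))
```

The covariance matrix `cov` is what "predicted accuracy" refers to: it propagates the a priori observation accuracies into the estimated parameters.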

  17. Examples of use of a-priori information to the inversion of AEM data

    NASA Astrophysics Data System (ADS)

    Viezzoli, A.; Munday, T. J.; Sapia, V.

    2012-12-01

There is a growing focus in the international near-surface geophysical community on merging information (loosely termed "data") from different sources when modelling the subsurface. The use of a priori data as extra input to the inversion of Airborne Electromagnetic (AEM) data is one illustrative example of this trend. It provides more robust results, for a number of reasons. The first is the capability to cross-check the geophysically derived model against ancillary information in a more quantitative and objective way than can be done a posteriori. The second is that it mitigates the inherent non-uniqueness of the results of inversion of geophysical data, which arises because the problem is usually ill posed. The third is the ever higher level of accuracy of the derived output sought by end users who, rightly so, demand results (either direct or derived) they can use directly for management. Last, but not least, is the drive to incorporate different physical parameters originating from different sources into one inversion problem, in order to derive directly, e.g., geological or hydrogeological models that fit all data sets at once. In this paper we present examples obtained by adding information from geophysics (i.e., seismic, surface and borehole geoelectric) and from geology (e.g., lithology) to the inversion of airborne EM data from different systems (e.g., VTEM, AeroTEM, SkyTEM, Resolve). Case studies are from several areas of the world, with varied geological settings. In our formulation, the a priori information is treated as nothing but an extra data set, carrying location, values, uncertainty, and expected lateral variability. The information it contains is spread to the locations of the neighbouring AEM soundings using the Spatially Constrained Inversion approach. Constraints and uncertainties are usually different depending on data types and geology.
Case studies show the effect on the inversion results of the a-priori

  18. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    SciTech Connect

    Bielecki, J.; Scholz, M.; Drozdowicz, K.; Giacomelli, L.; Kiptily, V.; Kempenaars, M.; Conroy, S.; Craciunescu, T.; Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB

    2015-09-15

A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, 19 lines of sight only) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. The aim of this work is to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high-resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction on JET.
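A minimal sketch of Tikhonov-style regularization with an a priori profile may help here: it shows the generic closed-form solution of min ||Ax - b||^2 + lam^2 ||x - x_prior||^2, with a toy two-view geometry standing in for the KN3 camera. All sizes and values are illustrative, not the JET implementation.

```python
import numpy as np

def tikhonov_with_prior(A, b, x_prior, lam):
    """Generalized Tikhonov solution of min ||A x - b||^2 + lam^2 ||x - x_prior||^2.
    The a priori profile x_prior (e.g., a normalized electron-density shape)
    pulls the ill-posed reconstruction toward a physically plausible solution."""
    n = A.shape[1]
    lhs = A.T @ A + lam**2 * np.eye(n)
    rhs = A.T @ b + lam**2 * x_prior
    return np.linalg.solve(lhs, rhs)

# toy ill-posed problem: 2 "lines of sight" observing 4 emissivity pixels
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
x_true = np.array([0.2, 0.8, 0.7, 0.3])
b = A @ x_true
x_prior = np.array([0.25, 0.75, 0.75, 0.25])   # assumed shape information
x_hat = tikhonov_with_prior(A, b, x_prior, lam=0.1)
```

Because two measurements cannot determine four pixels, the prior selects one solution among the infinitely many that fit the data; here the prior is itself data-consistent, so the reconstruction coincides with it.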

  19. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus.

    PubMed

    Bielecki, J; Giacomelli, L; Kiptily, V; Scholz, M; Drozdowicz, K; Conroy, S; Craciunescu, T; Kempenaars, M

    2015-09-01

A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, 19 lines of sight only) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. The aim of this work is to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high-resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction on JET.

  20. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    NASA Astrophysics Data System (ADS)

    Bielecki, J.; Giacomelli, L.; Kiptily, V.; Scholz, M.; Drozdowicz, K.; Conroy, S.; Craciunescu, T.; Kempenaars, M.

    2015-09-01

A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, 19 lines of sight only) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. The aim of this work is to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high-resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction on JET.

  1. Evaluating A Priori Ozone Profile Information Used in TEMPO Tropospheric Ozone Retrievals

    NASA Astrophysics Data System (ADS)

    Johnson, M. S.; Sullivan, J. T.; Liu, X.; Newchurch, M.; Kuang, S.; McGee, T. J.; Langford, A. O.; Senff, C. J.; Leblanc, T.; Berkoff, T.; Gronoff, G.; Chen, G.; Strawbridge, K. B.

    2016-12-01

Ozone (O3) is a greenhouse gas and toxic pollutant which plays a major role in air quality. Monitoring of surface air quality and O3 mixing ratios is primarily conducted using in situ measurement networks. This is partly because high-quality air quality information from space-borne platforms is limited by coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower-tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address these limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data on total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm uses a priori O3 profile information from a climatological database developed from long-term ozonesonde measurements (tropopause-based (TB) O3 climatology). It has been shown that satellite O3 retrievals are sensitive to a priori O3 profiles and covariance matrices. In this work we investigate the climatological data to be used in TEMPO algorithms (TB O3) and simulated data from the NASA GMAO Goddard Earth Observing System (GEOS-5) Forward Processing (FP) near-real-time (NRT) model products. These two data products are evaluated with ground-based lidar data from the Tropospheric Ozone Lidar Network (TOLNet) at various locations in the US. This study evaluates the TB climatology, GEOS-5 climatology, and 3-hourly GEOS-5 data against lower-tropospheric observations to demonstrate the accuracy of a priori information potentially to be used in TEMPO O3 algorithms. Here we present our initial analysis and the theoretical impact on TEMPO retrievals in the lower troposphere.

  2. A priori model independent inverse potential mapping: the impact of electrode positioning.

    PubMed

    van der Graaf, A W Maurits; Bhagirath, Pranav; de Hooge, Jacques; de Groot, Natasja M S; Götte, Marco J W

    2016-01-01

    In inverse potential mapping, local epicardial potentials are computed from recorded body surface potentials (BSP). When BSP are recorded with only a limited number of electrodes, in general biophysical a priori models are applied to facilitate the inverse computation. This study investigated the possibility of deriving epicardial potential information using only 62 torso electrodes in the absence of an a priori model. Computer simulations were used to determine the optimal in vivo positioning of 62 torso electrodes. Subsequently, three different electrode configurations, i.e., surrounding the thorax, concentrated precordial (30 mm inter-electrode distance) and super-concentrated precordial (20 mm inter-electrode distance) were used to record BSP from three healthy volunteers. Magnetic resonance imaging (MRI) was performed to register the electrode positions with respect to the anatomy of the patient. Epicardial potentials were inversely computed from the recorded BSP. In order to determine the reconstruction quality, the super-concentrated electrode configuration was applied in four patients with an implanted MRI-conditional pacemaker system. The distance between the position of the ventricular lead tip on MRI and the inversely reconstructed pacing site was determined. The epicardial potential distribution reconstructed using the super-concentrated electrode configuration demonstrated the highest correlation (R = 0.98; p < 0.01) with the original epicardial source model. A mean localization error of 5.3 mm was found in the pacemaker patients. This study demonstrated the feasibility of deriving detailed anterior epicardial potential information using only 62 torso electrodes without the use of an a priori model.
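As a sketch of a model-independent inverse computation, one can regularize the transfer-matrix inversion by truncating its singular value decomposition; this is a generic technique for such ill-posed problems, not necessarily the exact scheme used in the study, and all matrix sizes here are toy values.

```python
import numpy as np

def tsvd_inverse(T, body_surface, k):
    """Reconstruct epicardial potentials from body-surface potentials (BSP)
    without a biophysical a priori model: truncate the SVD of the transfer
    matrix T (BSP = T @ epicardial) to its k largest singular values."""
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    s_inv = np.where(np.arange(len(s)) < k, 1.0 / s, 0.0)  # keep k modes only
    return Vt.T @ (s_inv * (U.T @ body_surface))

# toy transfer matrix: 6 torso electrodes, 4 epicardial nodes
rng = np.random.default_rng(0)
T = rng.standard_normal((6, 4))
epi_true = np.array([1.0, -0.5, 0.25, 0.0])
bsp = T @ epi_true
epi_hat = tsvd_inverse(T, bsp, k=4)   # full rank here, so recovery is exact
```

In practice `k` (or an equivalent regularization level) is chosen to balance noise amplification against spatial detail; the electrode configuration determines how well-conditioned `T` is, which is why electrode positioning matters.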

  3. Evaluating A Priori Ozone Profile Information Used in TEMPO Tropospheric Ozone Retrievals

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew S.; Sullivan, John T.; Liu, Xiong; Newchurch, Mike; Kuang, Shi; McGee, Thomas J.; Langford, Andrew O'Neil; Senff, Christoph J.; Leblanc, Thierry; Berkoff, Timothy; Gronoff, Guillaume; Chen, Gao; Strawbridge, Kevin B.

    2016-01-01

Ozone (O3) is a greenhouse gas and toxic pollutant which plays a major role in air quality. Monitoring of surface air quality and O3 mixing ratios is primarily conducted using in situ measurement networks. This is partly because high-quality air quality information from space-borne platforms is limited by coarse spatial resolution, limited temporal frequency, and minimal sensitivity to lower-tropospheric and surface-level O3. The Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite is designed to address these limitations of current space-based platforms and to improve our ability to monitor North American air quality. TEMPO will provide hourly data on total column and vertical profiles of O3 with high spatial resolution to be used as a near-real-time air quality product. TEMPO O3 retrievals will apply the Smithsonian Astrophysical Observatory profile algorithm developed based on work from GOME, GOME-2, and OMI. This algorithm uses a priori O3 profile information from a climatological database developed from long-term ozonesonde measurements (tropopause-based (TB) O3 climatology). It has been shown that satellite O3 retrievals are sensitive to a priori O3 profiles and covariance matrices. In this work we investigate the climatological data to be used in TEMPO algorithms (TB O3) and simulated data from the NASA GMAO Goddard Earth Observing System (GEOS-5) Forward Processing (FP) near-real-time (NRT) model products. These two data products are evaluated with ground-based lidar data from the Tropospheric Ozone Lidar Network (TOLNet) at various locations in the US. This study evaluates the TB climatology, GEOS-5 climatology, and 3-hourly GEOS-5 data against lower-tropospheric observations to demonstrate the accuracy of a priori information potentially to be used in TEMPO O3 algorithms. Here we present our initial analysis and the theoretical impact on TEMPO retrievals in the lower troposphere.

  4. 3SMAC: an a priori tomographic model of the upper mantle based on geophysical modeling

    NASA Astrophysics Data System (ADS)

    Nataf, Henri-Claude; Ricard, Yanick

    1996-05-01

We present an a priori three-dimensional 'tomographic' model of the upper mantle. We construct this model (called 3SMAC: three-dimensional seismological model a priori constrained) in four steps: we compile information on the thickness of 'chemical' layers in the Earth (water, sediments, upper and lower crust, etc.); we get a 3D temperature distribution from thermal plate models applied to the oceans and continents; we deduce the mineralogy in the mantle from pressure and temperature; and we finally get a three-dimensional model of density, seismic velocities, and attenuation by introducing laboratory measurements of these quantities as a function of pressure and temperature. The model is thus consistent with various geophysical data, such as ocean bathymetry and surface heat flux. We use this model to compute synthetic travel-times of body waves, and we compare them with observations. A similar exercise is performed for surface waves and normal modes in a companion paper (Ricard et al., 1996, J. Geophys. Res., in press). We find that our model predicts the bulk of the observed travel-time variations. Both the amplitude and general pattern are well recovered. The discrepancies suggest that tomography can provide useful regional information on the thermal state of the continents. In the oceans, the flattening of the sea-floor beyond 70 Ma seems difficult to reconcile with the seismic observations. Overall, our 3SMAC model is both a realistic model, which can be used to test various tomographic methods, and a model of the minimum heterogeneities to be expected from geodynamical modeling. Therefore, it should be a useful a priori model to be used in tomographic inversions, in order to retrieve reliable images of heterogeneities in the transition zone, which should, in turn, greatly improve our understanding of geodynamical processes in the deep Earth. 3SMAC and accompanying software can be retrieved by anonymous ftp at geoscope.ipgp.jussieu.fr.

  5. An a priori solar radiation pressure model for the QZSS Michibiki satellite

    NASA Astrophysics Data System (ADS)

    Zhao, Qile; Chen, Guo; Guo, Jing; Liu, Jingnan; Liu, Xianglin

    2017-07-01

It has been noted that the satellite laser ranging (SLR) residuals of the Quasi-Zenith Satellite System (QZSS) Michibiki satellite orbits show very marked dependence on the elevation angle of the Sun above the orbital plane (i.e., the β angle). It is well recognized that the systematic error is caused by mismodeling of the solar radiation pressure (SRP). Although the error can be reduced by the updated ECOM SRP model, the orbit error is still very large when the satellite switches to orbit-normal (ON) orientation. In this study, an a priori SRP model was established for the QZSS Michibiki satellite to enhance the ECOM model. This model is expressed in ECOM's D, Y, and B axes (DYB) using seven parameters for the yaw-steering (YS) mode, and three additional parameters are used to compensate for the remaining modeling deficiencies, particularly the perturbations in the Y axis, based on a redefined DYB for the ON mode. With the proposed a priori model, QZSS Michibiki's precise orbits over 21 months were determined. SLR validation indicated that the systematic β-angle-dependent error was reduced when the satellite was in the YS mode, and a root mean square (RMS) better than 8 cm was achieved. More importantly, the orbit quality was also improved significantly when the satellite was in the ON mode. Relative to the ECOM and adjustable box-wing models, the proposed SRP model showed the best performance in the ON mode, and the RMS of the SLR residuals was better than 15 cm, a twofold improvement over ECOM without the a priori model, though still twice as large as in the YS mode.
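The ECOM-style D/Y/B parameterization mentioned above can be sketched as a simple function of the argument of latitude; the parameter set and values below are illustrative placeholders, not the fitted QZSS model.

```python
import numpy as np

def ecom_srp_accel(u, params):
    """Schematic ECOM-style SRP acceleration in the Sun-oriented D/Y/B frame:
    constant terms plus once-per-revolution harmonics in the argument of
    latitude u (radians). params = (D0, Y0, B0, Bc, Bs); values are
    illustrative, not a fitted QZSS model."""
    D0, Y0, B0, Bc, Bs = params
    a_D = D0                                  # dominant direct-SRP term
    a_Y = Y0                                  # small along-panel-axis term
    a_B = B0 + Bc * np.cos(u) + Bs * np.sin(u)
    return np.array([a_D, a_Y, a_B])          # in m/s^2

# evaluate over one orbital revolution
us = np.linspace(0.0, 2.0 * np.pi, 5)
acc = np.array([ecom_srp_accel(u, (-1e-7, 2e-9, 1e-9, 5e-10, -3e-10)) for u in us])
```

An a priori model of this form can be held fixed while a reduced ECOM set absorbs the residual signal, which is the general strategy behind enhancing ECOM with a priori terms.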

  6. Proportionality of Components, Liouville Theorems and a Priori Estimates for Noncooperative Elliptic Systems

    NASA Astrophysics Data System (ADS)

    Montaru, Alexandre; Sirakov, Boyan; Souplet, Philippe

    2014-07-01

    We study qualitative properties of positive solutions of noncooperative, possibly nonvariational, elliptic systems. We obtain new classification and Liouville type theorems in the whole Euclidean space, as well as in half-spaces, and deduce a priori estimates and the existence of positive solutions for related Dirichlet problems. We significantly improve the known results for a large class of systems involving a balance between repulsive and attractive terms. This class contains systems arising in biological models of Lotka-Volterra type, in physical models of Bose-Einstein condensates and in models of chemical reactions.

  7. The regularized CQ algorithm without a priori knowledge of operator norm for solving the split feasibility problem.

    PubMed

    Tian, Ming; Zhang, Hui-Fang

    2017-01-01

The split feasibility problem (SFP) is to find a point x in C such that Ax is in Q, where C and Q are nonempty closed convex subsets of Hilbert spaces H1 and H2, and A : H1 -> H2 is a bounded linear operator. Byrne's CQ algorithm is an effective algorithm to solve the SFP, but it needs to compute the operator norm ||A||, which is sometimes difficult to work out. López introduced a choice of stepsize of the form lambda_n = rho_n f(x_n)/||grad f(x_n)||^2, with f(x) = (1/2)||(I - P_Q)Ax||^2, which avoids ||A||. However, he only obtained weak convergence theorems. In order to overcome these drawbacks, in this paper we first provide a regularized CQ algorithm that does not compute ||A||, to find the minimum-norm solution of the SFP, and then obtain a strong convergence theorem.
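A sketch of the basic CQ iteration with a López-type stepsize (which avoids computing ||A||) is shown below; the projections, matrix, and stopping rule are toy choices, and the paper's regularized variant with strong convergence is not reproduced.

```python
import numpy as np

def cq_lopez(A, proj_C, proj_Q, x0, rho=0.5, iters=200):
    """CQ iteration for the split feasibility problem (find x in C with
    A x in Q) using a Lopez-type stepsize that never needs ||A||:
    lam = rho * ||(I - P_Q) A x||^2 / ||A^T (I - P_Q) A x||^2."""
    x = x0.astype(float)
    for _ in range(iters):
        r = A @ x - proj_Q(A @ x)          # residual outside Q
        g = A.T @ r                        # gradient of 0.5 * ||(I-P_Q)Ax||^2
        gn = np.dot(g, g)
        if gn < 1e-24:                     # A x already (nearly) in Q
            break
        lam = rho * np.dot(r, r) / gn      # stepsize without operator norm
        x = proj_C(x - lam * g)
    return x

# toy SFP: C = box [0,1]^2, Q = box [2,3]^2, A = 2*I
A = 2.0 * np.eye(2)
proj_C = lambda z: np.clip(z, 0.0, 1.0)
proj_Q = lambda z: np.clip(z, 2.0, 3.0)
x = cq_lopez(A, proj_C, proj_Q, np.array([0.1, 0.9]))
# the only point of [0,1]^2 mapped into [2,3]^2 by A is (1, 1)
```

The key point is that `lam` is built entirely from quantities already computed in the iteration, so no spectral-norm estimate of `A` is ever required.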

  8. The a priori SDR Estimation Techniques with Reduced Speech Distortion for Acoustic Echo and Noise Suppression

    NASA Astrophysics Data System (ADS)

    Thoonsaengngam, Rattapol; Tangsangiumvisai, Nisachon

This paper proposes an enhanced method for estimating the a priori Signal-to-Disturbance Ratio (SDR) to be employed in the Acoustic Echo and Noise Suppression (AENS) system for full-duplex hands-free communications. The proposed a priori SDR estimation technique is a modification of the Two-Step Noise Reduction (TSNR) algorithm that suppresses the background noise while preserving speech spectral components. In addition, a practical approach to determining the Echo Spectrum Variance (ESV) accurately is presented, based upon an assumed linear relationship between the power spectra of the far-end speech and acoustic echo signals. The ESV estimation technique is then employed to alleviate the acoustic echo problem. The performance of the AENS system that employs these two proposed estimation techniques is evaluated through the Echo Attenuation (EA), Noise Attenuation (NA), and two speech distortion measures. Simulation results based upon real speech signals confirm that our improved AENS system is able to mitigate efficiently the problem of acoustic echo and background noise, while preserving speech quality and speech intelligibility.
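For context, the classic decision-directed estimator of the a priori SNR, which two-step schemes such as TSNR refine, can be sketched as follows; this is the textbook form, not the paper's SDR estimator, and all numbers are illustrative.

```python
import numpy as np

def decision_directed_snr(noisy_power, noise_power, prev_clean_power, alpha=0.98):
    """Classic decision-directed estimate of the a priori SNR per frequency
    bin: a weighted mix of the previous frame's clean-speech estimate and
    the current (half-wave rectified) a posteriori SNR minus one."""
    post_snr = noisy_power / np.maximum(noise_power, 1e-12)
    ml_term = np.maximum(post_snr - 1.0, 0.0)          # instantaneous estimate
    prio = alpha * prev_clean_power / np.maximum(noise_power, 1e-12) \
         + (1.0 - alpha) * ml_term
    return prio

# one frame, 4 frequency bins (power spectra, illustrative values)
noisy = np.array([4.0, 1.0, 9.0, 0.5])
noise = np.array([1.0, 1.0, 1.0, 1.0])
prev_clean = np.array([2.0, 0.0, 6.0, 0.0])
xi = decision_directed_snr(noisy, noise, prev_clean)
```

Two-step methods recompute this estimate a second time using the first-pass gain, reducing the one-frame lag that causes musical noise and speech distortion.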

  9. Normalization of T2W-MRI prostate images using Rician a priori

    NASA Astrophysics Data System (ADS)

    Lemaître, Guillaume; Rastgoo, Mojdeh; Massich, Joan; Vilanova, Joan C.; Walker, Paul M.; Freixenet, Jordi; Meyer-Baese, Anke; Mériaudeau, Fabrice; Martí, Robert

    2016-03-01

Prostate cancer is reported to be the second most frequently diagnosed cancer of men in the world. In practice, diagnosis can be affected by multiple factors which reduce the chance of detecting potential lesions. In recent decades, new imaging techniques, mainly based on MRI, have been developed in conjunction with Computer-Aided Diagnosis (CAD) systems to help radiologists with such diagnosis. CAD systems are usually designed as a sequential process consisting of four stages: pre-processing, segmentation, registration and classification. As a pre-processing step, image normalization is a critical and important part of the chain in order to design a robust classifier and overcome inter-patient intensity variations. However, little attention has been dedicated to the normalization of T2W-Magnetic Resonance Imaging (MRI) prostate images. In this paper, we propose two methods to normalize T2W-MRI prostate images: (i) based on a Rician a priori and (ii) based on a Square-Root Slope Function (SRSF) representation which does not make any assumption regarding the Probability Density Function (PDF) of the data. A comparison with state-of-the-art methods is also provided. The normalization of the data is assessed by comparing the alignment of the patient PDFs in both qualitative and quantitative manners. In both evaluations, the normalization using the Rician a priori outperforms the other state-of-the-art methods.
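A schematic of distribution-based normalization can be sketched with quantile matching against a Rician-distributed reference; this illustrates the general idea of aligning patient PDFs only, and is not the paper's Rician-fit method.

```python
import numpy as np

def rician_sample(nu, sigma, size, rng):
    """Draw Rician-distributed magnitudes: |(nu + n1) + i*n2| with
    n1, n2 ~ N(0, sigma^2)."""
    n1 = rng.normal(0.0, sigma, size)
    n2 = rng.normal(0.0, sigma, size)
    return np.hypot(nu + n1, n2)

def normalize_to_reference(img, ref, n_q=101):
    """Schematic intensity normalization: map the image's intensity
    quantiles onto those of a reference (e.g., Rician-model) distribution,
    aligning the patient PDFs as a pre-processing step."""
    q = np.linspace(0.0, 1.0, n_q)
    return np.interp(img, np.quantile(img, q), np.quantile(ref, q))

rng = np.random.default_rng(42)
ref = rician_sample(nu=3.0, sigma=1.0, size=20000, rng=rng)      # a priori model
img = 2.5 * rician_sample(nu=3.0, sigma=1.0, size=20000, rng=rng) + 1.0
img_n = normalize_to_reference(img, ref)
```

After the mapping, the image intensities follow (approximately) the reference distribution, so classifiers trained on one patient's intensity range generalize better to another's.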

  10. Solution of underdetermined systems of equations with gridded a priori constraints.

    PubMed

    Stiros, Stathis C; Saltogianni, Vasso

    2014-01-01

The TOPINV, Topological Inversion algorithm (or TGS, Topological Grid Search), initially developed for the inversion of highly non-linear redundant systems of equations, can solve a wide range of underdetermined systems of non-linear equations. This approach is a generalization of a previous conclusion that the algorithm can be used for the solution of certain integer ambiguity problems in geodesy. The overall approach is based on additional (a priori) information for the unknown variables. In the past, such information was used either to linearize equations around approximate solutions, or to expand systems of observation equations solved on the basis of generalized inverses. In the proposed algorithm, the a priori additional information is used in a third way, as topological constraints on the n unknown variables, leading to a grid in R^n containing an approximation of the real solution. The TOPINV algorithm does not focus on point solutions, but exploits the structural and topological constraints in each system of underdetermined equations in order to identify an optimal closed subspace of R^n containing the real solution. The centre of gravity of the grid points defining this space corresponds to global, minimum-norm solutions. The rationale and validity of the overall approach are demonstrated on the basis of examples and case studies, including fault modelling, in comparison with SVD solutions and true (reference) values, in an accuracy-oriented approach.
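The grid-search idea can be sketched compactly; the function below is a simplified stand-in for TOPINV (no iterative grid refinement), and the toy system is invented.

```python
import numpy as np

def topological_grid_search(equations, bounds, tol, n=50):
    """Sketch of a TOPINV-style search: grid the a priori bounds for the
    unknowns, keep every grid point whose misfit to each (possibly
    nonlinear) observation equation is within its uncertainty tol, and
    return the centre of gravity of the accepted points."""
    axes = [np.linspace(lo, hi, n) for lo, hi in bounds]
    grids = np.meshgrid(*axes, indexing="ij")
    pts = np.stack([g.ravel() for g in grids], axis=1)
    ok = np.ones(len(pts), dtype=bool)
    for f, t in zip(equations, tol):
        ok &= np.abs(f(pts)) <= t                 # topological constraint
    return pts[ok].mean(axis=0)                   # centroid of accepted set

# underdetermined toy system: one circle equation, two unknowns,
# with a priori bounds restricting the search to one arc
eqs = [lambda p: p[:, 0]**2 + p[:, 1]**2 - 1.0]
sol = topological_grid_search(eqs, bounds=[(0.5, 1.5), (-0.5, 0.5)], tol=[0.05])
```

The accepted points form a thin band around the constraint surface inside the a priori box; the centroid is the minimum-norm representative of that closed set, mirroring the algorithm's centre-of-gravity step.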

  11. Determining the depth of certain gravity sources without a priori specification of their structural index

    NASA Astrophysics Data System (ADS)

    Zhou, Shuai; Huang, Danian

    2015-11-01

We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori. We develop the new method by generalizing the Tilt-depth method for depth estimation for different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward-continuation heights, which can effectively reduce the influence of noise. Theoretical simulations of the gravity source model with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method using the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to the previous drilling and seismic interpretation results.

  12. Template based rotation: A method for functional connectivity analysis with a priori templates☆

    PubMed Central

    Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.

    2014-01-01

Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based correlation analysis.

  13. Template based rotation: a method for functional connectivity analysis with a priori templates.

    PubMed

    Schultz, Aaron P; Chhatwal, Jasmeer P; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R A; McLaren, Donald G; Ward, Andrew M; Wigman, Sarah; Sperling, Reisa A

    2014-11-15

Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based correlation analysis.

  14. A priori analysis: an application to the estimate of the uncertainty in course grades

    NASA Astrophysics Data System (ADS)

    Lippi, G. L.

    2014-07-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual but striking application is presented when teaching the data-treatment section of a laboratory course, to illustrate the characteristics of the APA and its potential for widespread use beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out, and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper.

  15. GNSS Precise Kinematic Positioning for Multiple Kinematic Stations Based on A Priori Distance Constraints.

    PubMed

    He, Kaifei; Xu, Tianhe; Förste, Christoph; Petrovic, Svetozar; Barthelmes, Franz; Jiang, Nan; Flechtner, Frank

    2016-04-01

    When applying the Global Navigation Satellite System (GNSS) for precise kinematic positioning in airborne and shipborne gravimetry, multiple GNSS receivers are often rigidly mounted on the kinematic platform carrying the gravimetry instrumentation. Thus, the distances among these GNSS antennas are known and invariant. This information can be used to improve the accuracy and reliability of the state estimates. For this purpose, the known distances between the antennas are applied as a priori constraints within the state parameter adjustment. These constraints are introduced in such a way that their accuracy is taken into account. To test this approach, GNSS data from a Baltic Sea shipborne gravimetric campaign have been used. The results of our study show that the application of distance constraints improves the accuracy of GNSS kinematic positioning, for example, by about 4 mm for the radial component.
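The idea of treating a known inter-antenna distance as a weighted pseudo-observation can be sketched with a small Gauss-Newton adjustment. The geometry, noise levels and weighting below are illustrative assumptions, not the campaign's actual processing:

```python
import numpy as np

rng = np.random.default_rng(1)

# True antenna positions on the platform; their separation is known and invariant
p1_true, p2_true = np.array([0.0, 0.0]), np.array([2.0, 0.0])
d0 = np.linalg.norm(p1_true - p2_true)

# Noisy per-antenna position fixes (stand-ins for independent GNSS solutions)
sigma = 0.05
y1 = p1_true + sigma * rng.standard_normal(2)
y2 = p2_true + sigma * rng.standard_normal(2)

def adjust(weight, n_iter=20):
    """Gauss-Newton adjustment with the known distance as a weighted pseudo-observation."""
    x = np.concatenate([y1, y2])              # state vector [p1, p2]
    for _ in range(n_iter):
        p1, p2 = x[:2], x[2:]
        d = np.linalg.norm(p1 - p2)
        u = (p1 - p2) / d                     # gradient direction of the distance
        # Residuals: two position fixes (weighted 1/sigma) + distance constraint
        r = np.concatenate([(y1 - p1) / sigma,
                            (y2 - p2) / sigma,
                            [weight * (d0 - d)]])
        # Jacobian of the residual vector w.r.t. the state
        J = np.zeros((5, 4))
        J[:2, :2] = -np.eye(2) / sigma
        J[2:4, 2:] = -np.eye(2) / sigma
        J[4, :2], J[4, 2:] = -weight * u, weight * u
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
    return x

x_hat = adjust(weight=1e4)                    # strong constraint
d_est = np.linalg.norm(x_hat[:2] - x_hat[2:])
```

Setting the pseudo-observation weight from the machining accuracy of the antenna mounting, rather than to an arbitrary large value, is what the abstract means by introducing the constraints "in such a way that their accuracy is taken into account".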

  16. Microwave Radar Imaging of Heterogeneous Breast Tissue Integrating A Priori Information

    PubMed Central

    Kelly, Thomas N.; Sarafianou, Mantalena; Craddock, Ian J.

    2014-01-01

    Conventional radar-based image reconstruction techniques fail when they are applied to heterogeneous breast tissue, since the underlying in-breast relative permittivity is unknown or assumed to be constant. This results in a systematic error during the process of image formation. A recent trend in microwave biomedical imaging is to extract the relative permittivity from the object under test to improve the image reconstruction quality and thereby to enhance the diagnostic assessment. In this paper, we present a novel radar-based methodology for microwave breast cancer detection in heterogeneous breast tissue integrating a 3D map of relative permittivity as a priori information. This leads to a novel image reconstruction formulation where the delay-and-sum focusing takes place in time rather than range domain. Results are shown for a heterogeneous dense (class-4) and a scattered fibroglandular (class-2) numerical breast phantom using Bristol's 31-element array configuration. PMID:25435861
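The shift from range-domain to time-domain delay-and-sum focusing amounts to converting each pixel-antenna distance into a two-way travel time through the a priori permittivity map. A toy monostatic sketch with a uniform permittivity and a single point scatterer (all values illustrative; not Bristol's array processing):

```python
import numpy as np

c0 = 3e8                      # free-space speed of light (m/s)
fs = 20e9                     # sampling rate (Hz)
eps_r = 9.0                   # a priori average relative permittivity (uniform here)
v = c0 / np.sqrt(eps_r)       # in-tissue propagation speed

antennas = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1]])
scatterer = np.array([0.06, 0.04])

# Synthetic monostatic signals: a unit pulse at each antenna's two-way delay
n_samp = 400
signals = np.zeros((len(antennas), n_samp))
for k, a in enumerate(antennas):
    t = 2 * np.linalg.norm(scatterer - a) / v
    signals[k, int(round(t * fs))] = 1.0

# Delay-and-sum focusing in the time domain: each pixel's delay comes from the
# permittivity-derived propagation speed, not from an assumed range mapping
xs = np.linspace(0, 0.1, 21)
ys = np.linspace(0, 0.1, 21)
image = np.zeros((len(ys), len(xs)))
for j, y in enumerate(ys):
    for i, x in enumerate(xs):
        p = np.array([x, y])
        acc = 0.0
        for k, a in enumerate(antennas):
            n = int(round(2 * np.linalg.norm(p - a) / v * fs))
            if n < n_samp:
                acc += signals[k, n]
        image[j, i] = acc ** 2

peak = np.unravel_index(np.argmax(image), image.shape)
```

In the paper's heterogeneous setting the scalar `eps_r` would be replaced by an average over the 3D permittivity map along each antenna-pixel path, which is exactly why the focusing must happen in time rather than range.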

  18. A priori mesh quality metrics for three-dimensional hybrid grids

    SciTech Connect

    Kallinderis, Y. Fotia, S.

    2015-01-01

    Use of general hybrid grids to attain complex-geometry field simulations poses a challenge for the estimation of their quality. Apart from the typical problems of non-uniformity and non-orthogonality, the change in element topology is an extra issue to address. The present work derives and evaluates an a priori mesh quality indicator for structured, unstructured, as well as hybrid grids consisting of hexahedra, prisms, tetrahedra, and pyramids. Emphasis is placed on deriving a direct relation between the quality measure and mesh distortion. The work is based on use of the finite volume discretization for evaluation of first-order spatial derivatives. The analytic form of the truncation error is derived and applied to elementary types of mesh distortion, including typical hybrid grid interfaces. The corresponding analytic expressions provide relations between the truncation error and the degree of stretching, skewness, shearing, torsion, and expansion, as well as the type of grid interface.

  19. A priori analysis of a LES subfilter model for soot-turbulence-chemistry interactions

    NASA Astrophysics Data System (ADS)

    Lew, Jeffry K.; Mueller, Michael E.

    2016-11-01

    In a turbulent flame, soot interacts with turbulence and combustion chemistry at the smallest scales. An existing LES subfilter model proposes that soot-turbulence interactions are independent of chemistry due to the time scale separation between slow soot formation and rapid heat-releasing reactions. However, interactions between soot, turbulence, and chemistry occur even after the nucleation of soot from polycyclic aromatic hydrocarbon (PAH) dimers. In fact, the interplay of soot and gas-phase chemistry may be intensified during oxidation and surface growth. To capture these effects, a dependence on the local mixture fraction has been introduced into the subfilter model. This modified model is evaluated a priori using a direct numerical simulation (DNS) database of soot evolution in a turbulent non-premixed n-heptane/air jet flame.

  20. A method to unmix multiple fluorophores in microscopy images with minimal a priori information.

    PubMed

    Schlachter, S; Schwedler, S; Esposito, A; Kaminski Schierle, G S; Moggridge, G D; Kaminski, C F

    2009-12-07

    The ability to quantify the fluorescence signals from multiply labeled biological samples is highly desirable in the life sciences but often difficult, because of spectral overlap between fluorescent species and the presence of autofluorescence. Several so-called unmixing algorithms have been developed to address this problem. Here, we present a novel algorithm that combines measurements of lifetime and spectrum to achieve unmixing without a priori information on the spectral properties of the fluorophore labels. The only assumption made is that the lifetimes of the fluorophores differ. Our method combines global analysis for a measurement of lifetime distributions with singular value decomposition to recover individual fluorescence spectra. We demonstrate the technique on simulated datasets and subsequently by an experiment on a biological sample. The method is computationally efficient and straightforward to implement. Applications range from histopathology of complex and multiply labeled samples to functional imaging in live cells.
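The spirit of the method (lifetimes fixed by a global-analysis step, spectra then recovered linearly) can be illustrated in a few lines. The sketch below assumes the lifetimes are already known and skips the SVD step, so it is a simplified stand-in for the published algorithm, with simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)

t = np.linspace(0, 10, 200)               # time gates (ns)
tau = np.array([1.5, 4.0])                # the only assumption: distinct lifetimes
lam = np.arange(64)                       # spectral channels

# Ground-truth emission spectra (Gaussian bands) and mixed time-spectral data
s_true = np.stack([np.exp(-(lam - 20) ** 2 / 50),
                   np.exp(-(lam - 40) ** 2 / 80)])
decays = np.exp(-np.outer(t, 1 / tau))    # 200 x 2 basis of exponential decays
data = decays @ s_true + 0.01 * rng.standard_normal((len(t), len(lam)))

# With the lifetimes fixed (the global-analysis step), each fluorophore's
# spectrum follows from linear least squares against the decay basis
s_hat, *_ = np.linalg.lstsq(decays, data, rcond=None)
```

Because the decay basis is shared across all wavelengths, a single least-squares solve recovers every channel of both spectra at once; no spectral prior on the fluorophores is needed, matching the abstract's claim.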

  1. A priori postulated and real power in cluster randomized trials: mind the gap.

    PubMed

    Guittet, Lydia; Giraudeau, Bruno; Ravaud, Philippe

    2005-08-18

    The cluster randomization design is increasingly used for the evaluation of health-care, screening or educational interventions. The intraclass correlation coefficient (ICC) defines the clustering effect and must be specified during planning. The aim of this work is to study the influence of the ICC on power in cluster randomized trials. Power contour graphs were drawn to illustrate the loss in power induced by an underestimation of the ICC when planning trials. We also derived the maximum achievable power given a specified ICC. The magnitude of the ICC can have a major impact on power, and with low numbers of clusters, 80% power may not be achievable. Underestimating the ICC when planning cluster randomized trials can lead to a seriously underpowered trial. Publication of a priori postulated and a posteriori estimated ICCs is necessary for a more objective reading: negative trial results may be the consequence of a loss of power due to a mis-specification of the ICC.
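The power loss described here follows directly from the design effect DE = 1 + (m - 1) * ICC, which deflates the effective sample size. A small sketch using normal-approximation power for a difference in means (the numbers are illustrative, not the paper's scenarios):

```python
from math import sqrt
from statistics import NormalDist

def crt_power(delta, sigma, n_clusters, m, icc, alpha=0.05):
    """Normal-approximation power of a two-arm cluster randomized trial
    (n_clusters clusters of size m per arm) for a mean difference delta."""
    design_effect = 1 + (m - 1) * icc
    n_eff = n_clusters * m / design_effect       # effective sample size per arm
    se = sigma * sqrt(2 / n_eff)                 # SE of the difference in means
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(abs(delta) / se - z)

# Underestimating the ICC at the planning stage overstates the power:
planned = crt_power(delta=0.3, sigma=1.0, n_clusters=10, m=30, icc=0.01)
actual = crt_power(delta=0.3, sigma=1.0, n_clusters=10, m=30, icc=0.05)
```

With these inputs the trial planned at about 90% power delivers roughly 65%, which is the "gap" the title warns about.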

  2. A Priori Estimates for Free Boundary Problem of Incompressible Inviscid Magnetohydrodynamic Flows

    NASA Astrophysics Data System (ADS)

    Hao, Chengchun; Luo, Tao

    2014-06-01

    In the present paper, we prove the a priori estimates of Sobolev norms for a free boundary problem of the incompressible inviscid magnetohydrodynamics equations in all physical spatial dimensions n = 2 and 3 by adopting a geometrical point of view used in Christodoulou and Lindblad (Commun Pure Appl Math 53:1536-1602, 2000), and estimating quantities such as the second fundamental form and the velocity of the free surface. We identify the well-posedness condition that the outer normal derivative of the total pressure including the fluid and magnetic pressures is negative on the free boundary, which is similar to the physical condition (Taylor sign condition) for the incompressible Euler equations of fluids.

  3. A Priori Estimates for Fractional Nonlinear Degenerate Diffusion Equations on Bounded Domains

    NASA Astrophysics Data System (ADS)

    Bonforte, Matteo; Vázquez, Juan Luis

    2015-10-01

    We investigate quantitative properties of the nonnegative solutions to the nonlinear fractional diffusion equation, ∂_t u + (-Δ)^s u^m = 0, posed in a bounded domain, Ω ⊂ R^N, with m > 1 for t > 0. As (-Δ)^s we use one of the most common definitions of the fractional Laplacian, 0 < s < 1, in a bounded domain with zero Dirichlet boundary conditions. We consider a general class of very weak solutions of the equation, and obtain a priori estimates in the form of smoothing effects, absolute upper bounds, lower bounds, and Harnack inequalities. We also investigate the boundary behaviour and we obtain sharp estimates from above and below. In addition, we obtain similar estimates for fractional semilinear elliptic equations. Either the standard Laplacian case s = 1 or the linear case m = 1 are recovered as limits. The method is quite general, suitable to be applied to a number of similar problems.

  4. Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma

    NASA Astrophysics Data System (ADS)

    Dhar, Purbarun; Sirisha Maganti, Lakshmi

    2017-08-01

    This article proposes a simple and realistic method by which a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximated analytical expression for the thermal history within the tumour is derived from the lumped capacitance approach and considers all therapy protocols and parameters. The method provides an easy framework for estimating hyperthermia protocol parameters promptly. The model has been validated against several experimental reports on animal models such as mice/rabbit/hamster and human clinical trials. It has been observed that the model is able to accurately estimate the thermal history within the carcinoma during the hyperthermia therapy. The present approach may find implications in a-priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to the ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.
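A lumped-capacitance energy balance of this kind has the closed form T(t) = T_body + (P/hA)(1 - exp(-t/tau)) with tau = rho*c*V/(hA). The sketch below uses illustrative parameter values, not the paper's fitted constants:

```python
import numpy as np

# Illustrative (not the paper's) parameters for a small spherical tumour
r = 0.005                         # radius (m)
V = 4 / 3 * np.pi * r ** 3        # volume (m^3)
A = 4 * np.pi * r ** 2            # surface area (m^2)
rho, c = 1050.0, 3600.0           # tissue density (kg/m^3), specific heat (J/kg/K)
h = 50.0                          # effective surface transfer coeff. (W/m^2/K)
P = 0.15                          # magnetic heating power deposited (W)
T_body = 37.0                     # surrounding tissue temperature (deg C)

tau = rho * c * V / (h * A)       # lumped-capacitance time constant (s)
T_ss = T_body + P / (h * A)       # steady-state temperature (deg C)

def thermal_history(t):
    """Closed-form lumped-capacitance temperature rise T(t)."""
    return T_body + (T_ss - T_body) * (1 - np.exp(-t / tau))
```

The value of such an expression a priori is that one can invert it for any protocol parameter, e.g. solve for the heating power P needed to reach a target ablation temperature within a prescribed time.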

  5. Rapid multi-wavelength optical assessment of circulating blood volume without a priori data

    NASA Astrophysics Data System (ADS)

    Loginova, Ekaterina V.; Zhidkova, Tatyana V.; Proskurnin, Mikhail A.; Zharov, Vladimir P.

    2016-03-01

    The measurement of circulating blood volume (CBV) is crucial in various medical conditions including surgery, iatrogenic problems, rapid fluid administration, transfusion of red blood cells, or trauma with extensive blood loss including battlefield injuries and other emergencies. Currently available commercial techniques are invasive and time-consuming for trauma situations. Recently, we have proposed high-speed multi-wavelength photoacoustic/photothermal (PA/PT) flow cytometry for in vivo CBV assessment with multiple dyes as PA contrast agents (labels). As the first step, we have characterized the capability of this technique to monitor the clearance of three dyes (indocyanine green, methylene blue, and trypan blue) in an animal model. However, there are strong demands on improvements in PA/PT flow cytometry. As additional verification of our proof-of-concept of this technique, we performed optical photometric CBV measurements in vitro. Three label dyes—methylene blue, crystal violet and, partially, brilliant green—were selected for simultaneous photometric determination of the components of their two-dye mixtures in the circulating blood in vitro without any extra data (like hemoglobin absorption) known a priori. The tests of single dyes and their mixtures in a flow system simulating a blood transfusion system showed a negligible difference between the sensitivities of the determination of these dyes under batch and flow conditions. For individual dyes, limits of detection of 3×10^-6 M in blood were achieved, which provided their continuous determination at a level of 10^-5 M for the CBV assessment without a priori data on the matrix. CBV assessments with errors no higher than 4% were obtained, and the possibility of applying the developed procedure to optical photometric measurements (flow cytometry) with laser sources was shown.
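Determining the components of a two-dye mixture from multi-wavelength absorbance is a small linear (Beer-Lambert) inversion. The molar absorptivities and concentrations below are illustrative placeholders, not the calibrated values used in the study:

```python
import numpy as np

# Molar absorptivities (L/mol/cm) of two dyes at three wavelengths.
# All values are illustrative placeholders, not calibrated literature data.
E = np.array([[7.0e4, 1.0e3],     # channel where dye 1 dominates
              [5.0e3, 8.7e4],     # channel where dye 2 dominates
              [2.0e4, 3.0e4]])    # mixed-sensitivity channel
path = 1.0                        # optical path length (cm)

c_true = np.array([1.2e-5, 0.8e-5])          # concentrations (mol/L)
absorbance = (E * path) @ c_true             # Beer-Lambert, additive mixture

# Overdetermined linear unmixing: least squares on A = E * l * c
c_hat, *_ = np.linalg.lstsq(E * path, absorbance, rcond=None)
```

Using more wavelengths than dyes makes the system overdetermined, which is one way to avoid needing a priori knowledge of the background (matrix) absorption: its contribution can be carried as an extra column instead of a fixed correction.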

  6. Methods for improving limited field-of-view radiotherapy reconstructions using imperfect a priori images.

    PubMed

    Ruchala, Kenneth J; Olivera, Gustavo H; Kapatoes, Jeffrey M; Reckwerdt, Paul J; Mackie, Thomas R

    2002-11-01

    There are many benefits to having an online CT imaging system for radiotherapy, as it helps identify changes in the patient's position and anatomy between the time of planning and treatment. However, many current online CT systems suffer from a limited field-of-view (LFOV) in that collected data do not encompass the patient's complete cross section. Reconstruction of these data sets can quantitatively distort the image values and introduce artifacts. This work explores the use of planning CT data as a priori information for improving these reconstructions. Methods are presented to incorporate this data by aligning the LFOV with the planning images and then merging the data sets in sinogram space. One alignment option is explicit fusion, producing fusion-aligned reprojection (FAR) images. For cases where explicit fusion is not viable, FAR can be implemented using the implicit fusion of normal setup error, referred to as normal-error-aligned reprojection (NEAR). These methods are evaluated for multiday patient images showing both internal and skin-surface anatomical variation. The iterative use of NEAR and FAR is also investigated, as are applications of NEAR and FAR to dose calculations and the compensation of LFOV online MVCT images with kVCT planning images. Results indicate that NEAR and FAR can utilize planning CT data as imperfect a priori information to reduce artifacts and quantitatively improve images. These benefits can also increase the accuracy of dose calculations and be used for augmenting CT images (e.g., MVCT) acquired at different energies than the planning CT.

  7. A-Priori Rupture Models for Northern California Type-A Faults

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J.; Field, Edward H.

    2008-01-01

    This appendix describes how a-priori rupture models were developed for the northern California Type-A faults. As described in the main body of this report, and in Appendix G, 'a-priori' models represent an initial estimate of the rate of single and multi-segment surface ruptures on each fault. Whether or not a given model is moment balanced (i.e., satisfies section slip-rate data) depends on assumptions made regarding the average slip on each segment in each rupture (which in turn depends on the chosen magnitude-area relationship). Therefore, for a given set of assumptions, or branch on the logic tree, the methodology of the present Working Group (WGCEP-2007) is to find a final model that is as close as possible to the a-priori model, in the least-squares sense, but that also satisfies slip-rate and perhaps other data. This is analogous to the WGCEP-2002 approach of effectively voting on the relative rate of each possible rupture, and then finding the closest moment-balanced model (under a more limiting set of assumptions than adopted by the present WGCEP, as described in detail in Appendix G). The 2002 Working Group Report (WGCEP, 2003, referred to here as WGCEP-2002) created segmented earthquake rupture forecast models for all faults in the region, including some that had been designated as Type B faults in the NSHMP, 1996, and one that had not previously been considered. The 2002 National Seismic Hazard Maps used the values from WGCEP-2002 for all the faults in the region, essentially treating all the listed faults as Type A faults. As discussed in Appendix A, the current WGCEP found that there are a number of faults with little or no data on slip-per-event or dates of previous earthquakes. As a result, the WGCEP recommends that faults with minimal available earthquake recurrence data (the Greenville, Mount Diablo, San Gregorio, Monte Vista-Shannon and Concord-Green Valley) be modeled as Type B faults, to be consistent with similarly poorly-known faults statewide.

  8. The importance of using dynamical a-priori profiles for infrared O3 retrievals : the case of IASI.

    NASA Astrophysics Data System (ADS)

    Peiro, H.; Emili, E.; Le Flochmoen, E.; Barret, B.; Cariolle, D.

    2016-12-01

    Tropospheric ozone (O3) is a trace gas involved in the global greenhouse effect. To quantify its contribution to global warming, an accurate determination of O3 profiles is necessary. The IASI (Infrared Atmospheric Sounding Interferometer) instrument, on board the MetOp-A satellite, is the most sensitive sensor to tropospheric O3 with a high spatio-temporal coverage. Satellite retrievals are often based on the inversion of the measured radiance data with a variational approach. This requires an a priori profile and the corresponding error covariance matrix (COV) as ancillary input. Previous studies have shown some biases (~20%) in IASI retrievals of the tropospheric column in the Southern Hemisphere (SH). A possible source of errors is the a priori profile. This study aims to i) build a dynamical a priori O3 profile with a Chemistry Transport Model (CTM), and ii) integrate this a priori profile into IASI retrievals and demonstrate its interest. Global O3 profiles are retrieved from IASI radiances with the SOFRID (Software for a Fast Retrieval of IASI Data) algorithm. It is based on the RTTOV (Radiative Transfer for TOVS) code and a 1D-Var retrieval scheme. Until now, a constant a priori profile was used, based on a combination of MOZAIC, WOUDC-SHADOZ and Aura/MLS data, named here CLIM PR. The global CTM MOCAGE (Modèle de Chimie Atmosphérique à Grande Echelle) has been used with a linear O3 chemistry scheme to assimilate Microwave Limb Sounder (MLS) data. The model resolution of 2°x2°, with 60 sigma-hybrid vertical levels covering the stratosphere, has been used. MLS level 2 products have been assimilated with a 4D-Var variational algorithm to constrain stratospheric O3 and obtain high-quality a priori O3 profiles above the tropopause. From this reanalysis, we built these profiles at a 6 h frequency on a coarser resolution grid (10°x20°), named MOCAGE+MLS PR. Statistical comparisons between retrievals and ozonesondes have shown better correlations and smaller biases for
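The benefit of a better a priori profile in a 1D-Var scheme can be seen in the linear optimal-estimation update x_hat = x_a + B H^T (H B H^T + R)^-1 (y - H x_a). A toy sketch with a random linearized forward model and illustrative covariances (this is not SOFRID itself):

```python
import numpy as np

rng = np.random.default_rng(3)

n_lev, n_chan = 20, 8
x_true = 1.0 + 0.3 * np.sin(np.linspace(0, 3, n_lev))   # "true" profile (arb. units)

H = 0.2 * rng.standard_normal((n_chan, n_lev))          # linearized forward model
R = 0.01 * np.eye(n_chan)                               # observation error covariance
y = H @ x_true + 0.1 * rng.standard_normal(n_chan)      # simulated radiance anomalies

B = 0.04 * np.eye(n_lev)                                # a priori error covariance

def retrieve(x_a):
    """Linear 1D-Var / optimal-estimation update of the a priori profile x_a."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)        # gain matrix
    return x_a + K @ (y - H @ x_a)

x_clim = np.full(n_lev, 1.0)                            # fixed climatological prior
x_dyn = x_true + 0.05 * rng.standard_normal(n_lev)      # dynamical (model-based) prior

err_clim = np.linalg.norm(retrieve(x_clim) - x_true)
err_dyn = np.linalg.norm(retrieve(x_dyn) - x_true)
```

Because there are far fewer channels than levels, part of the profile lives in the null space of H and is inherited directly from the prior; this is why replacing a fixed climatology with a model-based dynamical prior reduces the retrieval error.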

  9. Development of a combined multifrequency MRI-DOT system for human breast imaging using a priori information

    NASA Astrophysics Data System (ADS)

    Thayer, David; Liu, Ning; Unlu, Burcin; Chen, Jeon-Hor; Su, Min-Ying; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-02-01

    Breast cancer is a significant cause of mortality and morbidity among women, with early diagnosis being vital to successful treatment. Diffuse Optical Tomography (DOT) is an emerging medical imaging modality that provides information that is complementary to current screening modalities such as MRI and mammography, and may improve the specificity in determining cancer malignancy. Using high-resolution anatomic images as a priori information improves the accuracy of DOT. Measurements are presented characterizing the performance of our system. Preliminary data are also shown illustrating the use of a priori MRI data in phantom studies.

  10. On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori

    SciTech Connect

    Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer

    2006-01-30

    Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in bias in predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for the Nevada and Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling, by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of the DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties.
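The MLBMA weighting step can be sketched as information-criterion-based posterior model probabilities combined with elicited priors, then used to average predictions. All numbers below are illustrative, not the study's calibration results:

```python
import numpy as np

# Per-model calibration results: an information criterion (e.g. KIC), an expert-
# elicited prior probability, and each model's head prediction (mean, variance)
kic = np.array([110.2, 108.5, 115.0, 109.1, 112.3])
prior = np.array([0.3, 0.2, 0.1, 0.25, 0.15])
mu = np.array([10.0, 10.6, 9.5, 10.2, 10.9])
var = np.array([0.20, 0.15, 0.30, 0.18, 0.25])

# Posterior model probabilities: exp(-dKIC/2) * prior, normalized
w = np.exp(-0.5 * (kic - kic.min())) * prior
w /= w.sum()

# Bayesian model averaging: posterior mean, plus within-model and
# between-model contributions to the posterior variance
mu_bma = np.sum(w * mu)
var_bma = np.sum(w * (var + (mu - mu_bma) ** 2))
```

The between-model term (mu - mu_bma)^2 is what makes the averaged variance larger than any single model's variance alone would suggest, capturing the conceptual model uncertainty the abstract emphasizes.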

  11. A priori mesh quality metric error analysis applied to a high-order finite element method

    NASA Astrophysics Data System (ADS)

    Lowrie, W.; Lukin, V. S.; Shumlak, U.

    2011-06-01

    Characterization of a computational mesh's quality prior to performing a numerical simulation is an important step in ensuring that the result is valid. A highly distorted mesh can result in significant errors. It is therefore desirable to predict solution accuracy on a given mesh. The HiFi/SEL high-order finite element code is used to study the effects of various mesh distortions on the solution quality of known analytic problems for spatial discretizations with different orders of finite elements. The measured global error norms are compared to several mesh quality metrics by independently varying both the degree of the distortions and the order of the finite elements. It is found that the spatial spectral convergence rates are preserved for all considered distortion types, while the total error increases with the degree of distortion. For each distortion type, correlations between the measured solution error and the different mesh metrics are quantified, identifying the most appropriate overall mesh metric. The results show promise for future a priori computational mesh quality determination and improvement.

  12. Optimal quantum cloning based on the maximin principle by using a priori information

    NASA Astrophysics Data System (ADS)

    Kang, Peng; Dai, Hong-Yi; Wei, Jia-Hua; Zhang, Ming

    2016-10-01

    We propose an optimal 1→2 quantum cloning method based on the maximin principle by making full use of a priori information of amplitude and phase about the general cloned qubit input set, which is a simply connected region enclosed by a "longitude-latitude grid" on the Bloch sphere. Theoretically, the fidelity of the optimal quantum cloning machine derived from this method is the largest in terms of the maximin principle compared with that of any other machine. The problem solving is an optimization process that involves six unknown complex variables, six vectors in an uncertain-dimensional complex vector space, and four equality constraints. Moreover, by restricting the structure of the quantum cloning machine, the optimization problem is simplified as a three-real-parameter suboptimization problem with only one equality constraint. We obtain the explicit formula for a suboptimal quantum cloning machine. Additionally, the fidelity of our suboptimal quantum cloning machine is higher than or at least equal to that of universal quantum cloning machines and phase-covariant quantum cloning machines. It is also underlined that the suboptimal cloning machine outperforms the "belt quantum cloning machine" for some cases.

  13. Control-relevant models for glucose control using a priori patient characteristics.

    PubMed

    van Heusden, Klaske; Dassau, Eyal; Zisser, Howard C; Seborg, Dale E; Doyle, Francis J

    2012-07-01

    One of the difficulties in the development of a reliable artificial pancreas for people with type 1 diabetes mellitus (T1DM) is the lack of accurate models of an individual's response to insulin. Most control algorithms proposed to control the glucose level in subjects with T1DM are model-based. Avoiding postprandial hypoglycemia (<60 mg/dl) while minimizing prandial hyperglycemia (>180 mg/dl) has proven difficult in a closed-loop setting due to the patient-model mismatch. In this paper, control-relevant models are developed for T1DM, as opposed to models that minimize a prediction error. The parameters of these models are chosen conservatively to minimize the likelihood of hypoglycemic events. To limit the conservatism due to large intersubject variability, the models are personalized using a priori patient characteristics. The models are implemented in a zone model predictive control algorithm. The robustness of these controllers is evaluated in silico, where hypoglycemia is completely avoided even after large meal disturbances. The proposed control approach is simple and the controller can be set up by a physician without the need for control expertise.

  14. A Second Order Expansion of the Separatrix Map for Trigonometric Perturbations of a Priori Unstable Systems

    NASA Astrophysics Data System (ADS)

    Guardia, M.; Kaloshin, V.; Zhang, J.

    2016-11-01

    In this paper we study a so-called separatrix map introduced by Zaslavskii-Filonenko (Sov Phys JETP 27:851-857, 1968) and studied by Treschev (Physica D 116(1-2):21-43, 1998; J Nonlinear Sci 12(1):27-58, 2002), Piftankin (Nonlinearity (19):2617-2644, 2006), and Piftankin and Treshchëv (Uspekhi Mat Nauk 62(2(374)):3-108, 2007). We derive a second order expansion of this map for trigonometric perturbations. In Castejon et al. (Random iteration of maps of a cylinder and diffusive behavior. Preprint available at arXiv:1501.03319, 2015), Guardia and Kaloshin (Stochastic diffusive behavior through big gaps in a priori unstable systems (in preparation), 2015), and Kaloshin et al. (Normally Hyperbolic Invariant Laminations and diffusive behavior for the generalized Arnold example away from resonances. Preprint available at http://www.terpconnect.umd.edu/vkaloshi/, 2015), applying the results of the present paper, we describe a class of nearly integrable deterministic systems with stochastic diffusive behavior.

  15. A Priori Analysis of Flamelet-Based Modeling for a Dual-Mode Scramjet Combustor

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; McDaniel, James C.; Drozda, Tomasz G.; Lacaze, Guilhem; Oefelein, Joseph

    2014-01-01

    An a priori investigation of the applicability of flamelet-based combustion models to dual-mode scramjet combustion was performed utilizing Reynolds-averaged simulations (RAS). For this purpose, the HIFiRE Direct Connect Rig (HDCR) flowpath, fueled with a JP-7 fuel surrogate and operating in dual- and scram-mode, was considered. The chemistry of the JP-7 fuel surrogate was modeled using a 22-species, 18-step chemical reaction mechanism. Simulation results were compared to experimentally obtained, time-averaged wall pressure measurements to validate the RAS solutions. The analysis of the dual-mode operation of this flowpath showed regions of predominantly non-premixed, high-Damkohler-number combustion. Regions of premixed combustion were also present but associated with only a small fraction of the total heat release in the flow. This is in contrast to the scram-mode operation, where a comparable amount of heat is released from the non-premixed and premixed combustion modes. Representative flamelet boundary conditions were estimated by analyzing probability density functions of temperature and pressure for pure fuel and oxidizer conditions. The results of the present study reveal the potential for a flamelet model to accurately model the combustion processes in the HDCR and likely other high-speed flowpaths of engineering interest.

  16. Improving image reconstruction of bioluminescence imaging using a priori information from ultrasound imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jayet, Baptiste; Ahmad, Junaid; Taylor, Shelley L.; Hill, Philip J.; Dehghani, Hamid; Morgan, Stephen P.

    2017-03-01

    Bioluminescence imaging (BLI) is a commonly used imaging modality in biology to study cancer in vivo in small animals. Images are generated using a camera to map the optical fluence emerging from the studied animal, then a numerical reconstruction algorithm is used to locate the sources and estimate their sizes. However, due to the strong light scattering properties of biological tissues, the resolution is very limited (around a few millimetres). Therefore obtaining accurate information about the pathology is complicated. We propose a combined ultrasound/optics approach to improve accuracy of these techniques. In addition to the BLI data, an ultrasound probe driven by a scanner is used for two main objectives. First, to obtain a pure acoustic image, which provides structural information of the sample. And second, to alter the light emission by the bioluminescent sources embedded inside the sample, which is monitored using a high speed optical detector (e.g. photomultiplier tube). We will show that this last measurement, used in conjunction with the ultrasound data, can provide accurate localisation of the bioluminescent sources. This can be used as a priori information by the numerical reconstruction algorithm, greatly increasing the accuracy of the BLI image reconstruction as compared to the image generated using only BLI data.

  17. Double-difference traveltime tomography with edge-preserving regularization and a priori interfaces

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Syracuse, Ellen M.; Maceira, Monica; Zhang, Haijiang; Larmat, Carene

    2015-05-01

    Conventional traveltime seismic tomography methods with Tikhonov regularization (L2 norm) typically produce smooth models, but such models may be inappropriate when the subsurface contains discontinuous features, such as faults or fractures, indicating that tomographic models should contain sharp boundaries. For this reason, we develop a double-difference (DD) traveltime tomography method that uses a modified total-variation regularization scheme incorporating a priori information on interfaces to preserve sharp property contrasts and obtain accurate inversion results. In order to solve the inversion problem, we employ an alternating minimization method to decouple the original DD tomography problem into two separate subproblems: a conventional DD tomography with Tikhonov regularization and an L2 total-variation inversion. We use the LSQR linear solver to solve the Tikhonov inversion and the split-Bregman iterative method to solve the total-variation inversion. Through our numerical examples, we show that our new DD tomography method yields more accurate results than the conventional DD tomography method at almost the same computational cost.
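    The total-variation subproblem in the alternating scheme above is typically handled by the split-Bregman iteration, which reduces the nonsmooth TV term to a closed-form shrinkage step. A minimal 1D sketch (illustrative only, not the authors' code; `lam` is an assumed splitting weight and `mu` an assumed TV weight):

```python
import numpy as np

def split_bregman_tv_1d(f, lam=1.0, mu=0.1, n_iter=100):
    """1D total-variation denoising, min_u 0.5*||u - f||^2 + mu*TV(u),
    via split Bregman with auxiliary variable d ~ D u."""
    n = len(f)
    u = f.copy()
    d = np.zeros(n - 1)
    b = np.zeros(n - 1)
    # Forward-difference operator D and the fixed system matrix
    # (I + lam * D^T D) for the smooth u-subproblem.
    D = np.diff(np.eye(n), axis=0)
    A = np.eye(n) + lam * D.T @ D
    for _ in range(n_iter):
        # u-subproblem: linear solve.
        u = np.linalg.solve(A, f + lam * D.T @ (d - b))
        Du = D @ u
        # d-subproblem: soft thresholding (shrinkage).
        d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - mu / lam, 0.0)
        # Bregman update.
        b = b + Du - d
    return u
```

Each pass alternates a linear solve for u with a soft-threshold update of the auxiliary gradient variable, which is what makes the method efficient for edge-preserving regularization.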

  18. A priori complete active space self consistent field localized orbitals: an application on linear polyenes

    NASA Astrophysics Data System (ADS)

    Angeli, Celestino; Sparta, Manuel; Cimiraglia, Renzo

    2006-03-01

    A recently proposed a priori localization technique is used to explore the possibility of reducing the number of active orbitals in a Complete Active Space Self Consistent Field calculation. The work relies on the fact that the new approach allows strict control over the nature of the active orbitals and therefore makes it possible to include in the active space only the relevant orbitals. The idea is tested on the calculation of the energy barrier for rigid rotation of linear polyenes. In order to obtain a relevant set of data, a number of possible rotations around double bonds have been considered in the ethylene, butadiene, hexatriene, octatetraene, decapentaene, and dodecahexaene molecules. The possibility of reducing the dimension of the active space has been investigated, considering for each possible rotation different active spaces ranging from the minimal dimension of 2 electrons in 2 π orbitals to the π-complete space. The results show that the rigid isomerization in the polyene molecules can be described with a negligible loss in accuracy with active spaces no larger than ten orbitals and ten electrons. In the special case of rotation around the terminal double bond, the space can be further reduced to six orbitals and six electrons with a large decrease in computational cost. An interesting summation rule has been found and verified for the stabilization of the energy barriers as a function of the dimension of the conjugated lateral chains and of the dimension of the active space.

  19. SPARSE-A subgrid particle averaged Reynolds stress equivalent model: testing with a priori closure.

    PubMed

    Davis, Sean L; Jacobs, Gustaaf B; Sen, Oishik; Udaykumar, H S

    2017-03-01

    A Lagrangian particle cloud model is proposed that accounts for the effects of Reynolds-averaged particle and turbulent stresses and the averaged carrier-phase velocity of the subparticle cloud scale on the averaged motion and velocity of the cloud. The SPARSE (subgrid particle averaged Reynolds stress equivalent) model is based on a combination of a truncated Taylor expansion of a drag correction function and Reynolds averaging. It reduces the required number of computational parcels to trace a cloud of particles in Eulerian-Lagrangian methods for the simulation of particle-laden flow. Closure is performed in an a priori manner using a reference simulation where all particles in the cloud are traced individually with a point-particle model. Comparison of a first-order model and SPARSE with the reference simulation in one dimension shows that both the stress and the averaging of the carrier-phase velocity on the cloud subscale affect the averaged motion of the particle. A three-dimensional isotropic turbulence computation shows that only one computational parcel is sufficient to accurately trace a cloud of tens of thousands of particles.

  20. A Priori Attitudes Predict Amniocentesis Uptake in Women of Advanced Maternal Age: A Pilot Study.

    PubMed

    Grinshpun-Cohen, Julia; Miron-Shatz, Talya; Rhee-Morris, Laila; Briscoe, Barbara; Pras, Elon; Towner, Dena

    2015-01-01

    Amniocentesis is an invasive procedure performed during pregnancy to determine, among other things, whether the fetus has Down syndrome. It is often preceded by screening, which gives a probabilistic risk assessment. Thus, ample information is conveyed to women with the goal of informing their decisions. This study examined the factors that predict amniocentesis uptake among pregnant women of advanced maternal age (older than 35 years at the time of childbirth). Participants filled out a questionnaire regarding risk estimates, demographics, and attitudes on screening and pregnancy termination before their first genetic counseling appointment and were followed up to 24 weeks of gestation. Findings show that women's decisions are not always informed by screening results or by having a medical indication. Psychological factors measured at the beginning of pregnancy (amniocentesis risk tolerance, pregnancy termination tolerance, and age risk perception) affected amniocentesis uptake. Although most women thought that screening for Down syndrome risk would inform their decision, they later stated other reasons for screening, such as preparing for the possibility of a child with special needs. Findings suggest that women's decisions regarding amniocentesis are driven not only by medical factors, but also by a priori attitudes. The authors believe that these should be addressed in the dialogue on women's informed use of prenatal tests.

  1. The application of a priori structural information based regularization in image reconstruction in magnetic induction tomography

    NASA Astrophysics Data System (ADS)

    Dekdouk, B.; Ktistis, C.; Yin, W.; Armitage, D. W.; Peyton, A. J.

    2010-04-01

    Magnetic induction tomography (MIT) is a non-invasive, contactless modality that could be capable of imaging the conductivity distribution of biological tissues. In this paper we consider the possibility of using absolute MIT voltage measurements for monitoring the progress of a peripheral hemorrhagic stroke in a human brain. The pathology is modelled as a local blood accumulation in the white matter. The solution of the MIT inverse problem is nonlinear and ill-posed and hence requires the use of a regularization method. In this paper, we describe the construction and present the performance of a regularization matrix based on a priori structural information of the head tissues obtained from a very recent MRI scan. The method takes the MRI scan as an initial state of the stroke and constructs a learning set containing the possible conductivity distributions of the current state of the stroke. These data are used to calculate an approximation of the covariance matrix, and a subspace is then constructed using principal component analysis (PCA). Simulations show that the method is capable of producing a representative reconstruction of a stroke, compared to smoothing Tikhonov regularization, in a simplified model of the head.
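    The covariance/PCA step described here can be sketched as follows (a generic illustration, not the authors' implementation; the learning-set matrix and component count are hypothetical). Taking the SVD of the centred learning set yields the same principal subspace as eigendecomposing the covariance matrix, without forming the covariance explicitly:

```python
import numpy as np

def pca_subspace(training_set, n_components):
    """Principal components of a learning set of conductivity maps.
    Rows of `training_set` are flattened conductivity distributions."""
    mean = training_set.mean(axis=0)
    X = training_set - mean
    # The right singular vectors of the centred data are the
    # eigenvectors of the (approximate) covariance matrix.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return mean, Vt[:n_components]

def project_onto_subspace(sigma, mean, basis):
    """Constrain a conductivity estimate to the learned subspace."""
    coeffs = basis @ (sigma - mean)
    return mean + basis.T @ coeffs
```

A reconstruction restricted to this subspace can only take shapes represented in the learning set, which is what injects the structural a priori information.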

  2. KIR Genes and Patterns Given by the A Priori Algorithm: Immunity for Haematological Malignancies.

    PubMed

    Rodríguez-Escobedo, J Gilberto; García-Sepúlveda, Christian A; Cuevas-Tello, Juan C

    2015-01-01

    Killer-cell immunoglobulin-like receptors (KIRs) are membrane proteins expressed by cells of innate and adaptive immunity. The KIR system consists of 17 genes and 614 alleles arranged into different haplotypes. KIR genes modulate susceptibility to haematological malignancies, viral infections, and autoimmune diseases. Molecular epidemiology studies rely on traditional statistical methods to identify associations between KIR genes and disease. We have previously described our results by applying support vector machines to identify associations between KIR genes and disease. However, rules specifying which haplotypes are associated with greater susceptibility to malignancies are lacking. Here we present the results of our investigation into the rules governing haematological malignancy susceptibility. We have studied the different haplotypic combinations of 17 KIR genes in 300 healthy individuals and 43 patients with haematological malignancies (25 with leukaemia and 18 with lymphomas). We compare two machine learning algorithms against traditional statistical analysis and show that the "a priori" algorithm is capable of discovering patterns unrevealed by previous algorithms and statistical approaches.
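    For reference, the "a priori" (Apriori) algorithm used above is the classic frequent-itemset miner: a k-itemset can be frequent only if every one of its (k-1)-subsets is frequent. A compact sketch (the gene labels in the example are illustrative, not the study's haplotype data):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent-itemset mining via the Apriori property: grow candidate
    itemsets level by level, pruning any whose subsets are infrequent."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    items = sorted({i for t in transactions for i in t})
    frequent = {}
    current = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    k = 1
    while current:
        frequent.update({c: support(c) for c in current})
        # Join step: candidates of size k+1 from frequent k-itemsets.
        candidates = {a | b for a in current for b in current if len(a | b) == k + 1}
        # Prune step: every k-subset must already be frequent.
        current = [c for c in candidates
                   if all(frozenset(s) in frequent for s in combinations(c, k))
                   and support(c) >= min_support]
        k += 1
    return frequent
```

Applied to gene-presence data, each "transaction" is one individual's KIR gene profile, and the mined itemsets are co-occurring gene patterns.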

  3. Source apportionment of urban air pollutants using constrained receptor models with a priori profile information.

    PubMed

    Liao, Ho-Tang; Yau, Yu-Chen; Huang, Chun-Sheng; Chen, Nathan; Chow, Judith C; Watson, John G; Tsai, Shih-Wei; Chou, Charles C-K; Wu, Chang-Fu

    2017-08-01

    Exposure to air pollutants such as volatile organic compounds (VOCs) and fine particulate matter (PM2.5) is associated with adverse health effects. This study applied multiple-time-resolution data of hourly VOCs and 24-h PM2.5 to a constrained Positive Matrix Factorization (PMF) model for source apportionment in Taipei, Taiwan. Ninety-two daily PM2.5 samples and 2208 hourly VOC measurements were collected during four seasons in 2014 and 2015. With some a priori information, we used different procedures to constrain retrieved factors toward realistic sources. A total of nine source factors were identified: natural gas/liquefied petroleum gas (LPG) leakage, solvent use/industrial process, contaminated marine aerosol, secondary aerosol/long-range transport, oil combustion, traffic related, evaporative gasoline emission, gasoline exhaust, and soil dust. Results showed that solvent use/industrial process was the largest contributor (19%) to VOCs, while the largest contributor to PM2.5 mass was secondary aerosol/long-range transport (57%). A robust regression analysis showed that secondary aerosol was mostly contributed by the regional-transport-related factor (25%). Copyright © 2017 Elsevier Ltd. All rights reserved.
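    Positive Matrix Factorization resolves the data matrix X (samples × species) into nonnegative source contributions G and source profiles F. As a rough sketch of the idea, the plain Lee-Seung multiplicative updates below compute an unconstrained nonnegative factorization; actual PMF tools additionally weight residuals by measurement uncertainty and apply profile constraints of the kind described above:

```python
import numpy as np

def nmf(X, k, n_iter=500, seed=0):
    """Factor X ~= G @ F with G, F >= 0 (Lee-Seung multiplicative
    updates). Rows of F play the role of source profiles; columns of
    G are per-sample source contributions."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + 0.1
    F = rng.random((k, m)) + 0.1
    eps = 1e-12  # guards against division by zero
    for _ in range(n_iter):
        G *= (X @ F.T) / (G @ F @ F.T + eps)
        F *= (G.T @ X) / (G.T @ G @ F + eps)
    return G, F
```

The multiplicative form keeps every entry nonnegative at every step, which is the defining constraint of PMF-style source apportionment.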

  4. Improving diffuse optical tomography with structural a priori from fluorescence diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Ma, Wenjuan; Gao, Feng; Duan, Linjing; Zhu, Qingzhen; Wang, Xin; Zhang, Wei; Wu, Linhui; Yi, Xi; Zhao, Huijuan

    2012-03-01

    We obtain absorption and scattering reconstructed images by incorporating a priori information on target location obtained from fluorescence diffuse optical tomography (FDOT) into diffuse optical tomography (DOT). The main disadvantage of DOT lies in the low spatial resolution resulting from the highly scattering nature of tissue in the near-infrared (NIR), but one can use it to monitor hemoglobin concentration and oxygen saturation simultaneously, as well as several other chromophores such as water, lipids, and cytochrome-c-oxidase. To date, extensive effort has been made to integrate DOT with other imaging modalities, such as MRI and CT, to obtain accurate optical property maps of the tissue. However, the experimental apparatus is intricate. In this study, a DOT image reconstruction algorithm that incorporates a priori structural information provided by FDOT is investigated in an attempt to optimize recovery of a simulated optical property distribution. By use of a specifically designed multi-channel time-correlated single photon counting system, the proposed scheme in a transmission mode is experimentally validated to achieve simultaneous reconstruction of the fluorescent yield, lifetime, absorption, and scattering coefficient. The experimental results demonstrate that the quantitative recovery of the tumor optical properties is doubled and the spatial resolution is improved as well by applying the new method.

  5. A Priori Analyses of Three Subgrid-Scale Models for One-Parameter Families of Filters

    NASA Technical Reports Server (NTRS)

    Pruett, C. David; Adams, Nikolaus A.

    1998-01-01

    The decay of isotropic turbulence in a compressible flow is examined by direct numerical simulation (DNS). A priori analyses of the DNS data are then performed to evaluate three subgrid-scale (SGS) models for large-eddy simulation (LES): a generalized Smagorinsky model (M1), a stress-similarity model (M2), and a gradient model (M3). The models exploit one-parameter second- or fourth-order filters of Pade type, which permit the cutoff wavenumber k_c to be tuned independently of the grid increment Δx. The modeled (M) and exact (E) SGS stresses are compared component-wise by correlation coefficients of the form C(E,M) computed over the entire three-dimensional fields. In general, M1 correlates poorly against the exact stresses (C < 0.2), M3 correlates moderately well (C ≈ 0.6), and M2 correlates remarkably well (0.8 < C < 1.0). Specifically, correlations C(E, M2) are high provided the grid and test filters are of the same order. Moreover, the highest correlations (C ≈ 1.0) result whenever the grid and test filters are identical (in both order and cutoff). Finally, the present results reveal the exact SGS stresses obtained by grid filters of differing orders to be only moderately well correlated. Thus, in LES the model should not be specified independently of the filter.
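    The component-wise correlation C(E, M) used to score each model is the Pearson coefficient computed over the whole three-dimensional field; a small sketch (array shapes and names are illustrative):

```python
import numpy as np

def sgs_correlation(exact, modeled):
    """Correlation coefficient C(E, M) between the exact and modeled
    SGS-stress components, computed over the entire field."""
    e = exact.ravel() - exact.mean()
    m = modeled.ravel() - modeled.mean()
    return float(e @ m / np.sqrt((e @ e) * (m @ m)))
```

A value near 1 means the model reproduces the spatial structure of the exact stress field, which is the sense in which the similarity model (M2) outperforms the others here.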

  6. A quantum question order model supported by empirical tests of an a priori and precise prediction.

    PubMed

    Wang, Zheng; Busemeyer, Jerome R

    2013-10-01

    Question order effects are commonly observed in self-report measures of judgment and attitude. This article develops a quantum question order model (the QQ model) to account for four types of question order effects observed in literature. First, the postulates of the QQ model are presented. Second, an a priori, parameter-free, and precise prediction, called the QQ equality, is derived from these mathematical principles, and six empirical data sets are used to test the prediction. Third, a new index is derived from the model to measure similarity between questions. Fourth, we show that in contrast to the QQ model, Bayesian and Markov models do not generally satisfy the QQ equality and thus cannot account for the reported empirical data that support this equality. Finally, we describe the conditions under which order effects are predicted to occur, and we review a broader range of findings that are encompassed by these very same quantum principles. We conclude that quantum probability theory, initially invented to explain order effects on measurements in physics, appears to be a powerful natural explanation for order effects of self-report measures in social and behavioral sciences, too.
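    Under the quantum model, sequential answers are Lüders-rule projections, and the QQ equality says that the summed probabilities of the two "disagreeing" answer sequences are the same in either question order. The sketch below (dimensions, projector ranks, and the real-valued state are arbitrary choices, not the article's setup) verifies the equality numerically:

```python
import numpy as np

def projector(dim, k, rng):
    """Random rank-k orthogonal projector in R^dim."""
    Q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
    B = Q[:, :k]
    return B @ B.T

def seq_prob(psi, P_first, P_second):
    """Probability of the answer picked by P_first, then the answer
    picked by P_second, for state psi (Lueders sequential rule)."""
    v = P_second @ (P_first @ psi)
    return float(v @ v)

def qq_residual(psi, Pa, Pb):
    """[p(AyBn) + p(AnBy)] - [p(ByAn) + p(BnAy)]: zero under the QQ model,
    whatever the size of the order effect."""
    I = np.eye(len(psi))
    ab = seq_prob(psi, Pa, I - Pb) + seq_prob(psi, I - Pa, Pb)
    ba = seq_prob(psi, Pb, I - Pa) + seq_prob(psi, I - Pb, Pa)
    return ab - ba
```

The residual vanishes identically even when the projectors do not commute, i.e. even when individual sequential probabilities do show an order effect, which is what makes the QQ equality a parameter-free test of the model.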

  7. Musical Probabilities, Abductive Reasoning, and Brain Mechanisms: Extended Perspective of "A Priori" Listening to Music within the Creative Cognition Approach

    ERIC Educational Resources Information Center

    Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis

    2013-01-01

    A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…

  9. An a priori DNS study of the shadow-position mixing model

    DOE PAGES

    Zhao, Xin -Yu; Bhagatwala, Ankit; Chen, Jacqueline H.; ...

    2016-01-15

    In this study, the modeling of mixing by molecular diffusion is a central aspect for transported probability density function (tPDF) methods. In this paper, the newly proposed shadow position mixing model (SPMM) is examined, using a DNS database for a temporally evolving dimethyl ether slot jet flame. Two methods that invoke different levels of approximation are proposed to extract the shadow displacement (equivalent to shadow position) from the DNS database. An approach for a priori analysis of the mixing-model performance is developed. The shadow displacement is highly correlated with both mixture fraction and velocity, and the peak correlation coefficient of the shadow displacement and mixture fraction is higher than that of the shadow displacement and velocity. This suggests that the composition-space localness is reasonably well enforced by the model, with appropriate choices of model constants. The conditional diffusion of mixture fraction and major species from DNS and from SPMM are then compared, using mixing rates that are derived by matching the mixture fraction scalar dissipation rates. Good qualitative agreement is found for the prediction of the locations of zero and maximum/minimum conditional diffusion for mixture fraction and individual species. Similar comparisons are performed for DNS and the IECM (interaction by exchange with the conditional mean) model. The agreement between SPMM and DNS is better than that between IECM and DNS, in terms of conditional diffusion iso-contour similarities and global normalized residual levels. It is found that a suitable value for the model constant c that controls the mixing frequency can be derived using the local normalized scalar variance, and that the model constant a controls the localness of the model. A higher-Reynolds-number test case is anticipated to be more appropriate to evaluate the mixing models, and stand-alone transported PDF simulations are required to more fully enforce localness.

  10. An a priori DNS study of the shadow-position mixing model

    SciTech Connect

    Zhao, Xin -Yu; Bhagatwala, Ankit; Chen, Jacqueline H.; Haworth, Daniel C.; Pope, Stephen B.

    2016-01-15

    In this study, the modeling of mixing by molecular diffusion is a central aspect for transported probability density function (tPDF) methods. In this paper, the newly proposed shadow position mixing model (SPMM) is examined, using a DNS database for a temporally evolving dimethyl ether slot jet flame. Two methods that invoke different levels of approximation are proposed to extract the shadow displacement (equivalent to shadow position) from the DNS database. An approach for a priori analysis of the mixing-model performance is developed. The shadow displacement is highly correlated with both mixture fraction and velocity, and the peak correlation coefficient of the shadow displacement and mixture fraction is higher than that of the shadow displacement and velocity. This suggests that the composition-space localness is reasonably well enforced by the model, with appropriate choices of model constants. The conditional diffusion of mixture fraction and major species from DNS and from SPMM are then compared, using mixing rates that are derived by matching the mixture fraction scalar dissipation rates. Good qualitative agreement is found for the prediction of the locations of zero and maximum/minimum conditional diffusion for mixture fraction and individual species. Similar comparisons are performed for DNS and the IECM (interaction by exchange with the conditional mean) model. The agreement between SPMM and DNS is better than that between IECM and DNS, in terms of conditional diffusion iso-contour similarities and global normalized residual levels. It is found that a suitable value for the model constant c that controls the mixing frequency can be derived using the local normalized scalar variance, and that the model constant a controls the localness of the model. A higher-Reynolds-number test case is anticipated to be more appropriate to evaluate the mixing models, and stand-alone transported PDF simulations are required to more fully enforce localness.

  11. A review of a priori regression models for warfarin maintenance dose prediction.

    PubMed

    Francis, Ben; Lane, Steven; Pirmohamed, Munir; Jorgensen, Andrea

    2014-01-01

    A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised: one formed by patients from a prospective trial and the second by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion-fulfilling regression models were then compared to each patient's stable warfarin dose. Predictive ability was assessed with reference to several statistics, including R-squared and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort, from the prospective trial, produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort than in the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing only non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population-specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.
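    The external-validation statistics reported here are straightforward to compute; a generic sketch (not the study's code):

```python
import numpy as np

def validation_stats(observed, predicted):
    """External-validation summary for a dosing algorithm:
    R-squared and mean absolute error of predicted vs. observed
    stable maintenance doses."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    mae = np.mean(np.abs(observed - predicted))
    return r2, mae
```

Note that R-squared computed this way can go negative on an external cohort, which is one reason validation results can look worse than the derivation-cohort fit.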

  12. A priori typology-based prediction of benthic macroinvertebrate fauna for ecological classification of rivers.

    PubMed

    Aroviita, Jukka; Koskenniemi, Esa; Kotanen, Juho; Hämäläinen, Heikki

    2008-11-01

    We evaluated a simple bioassessment method based on an a priori river typology to predict the benthic macroinvertebrate fauna of riffle sites in rivers in the absence of human influence. Our approach predicted taxon lists specific to four river types differing in catchment area, with a method analogous to site-specific RIVPACS-type models. The reference sites grouped in accordance with their type in NMS ordination, indicating that the typology efficiently accounted for natural variation in macroinvertebrate assemblages. Compared with a null model, the typology greatly increased the precision of prediction and the sensitivity to detect human impairment, and strengthened the correlation of the ratio of observed-to-expected number of predicted taxa (O/E) with the measured stressor variables. The performance of the typology-based approach was equal to that of a RIVPACS-type predictive model that we developed. Exclusion of the rarest taxa, those with low occurrence probabilities, improved the performance of both approaches by all criteria. As the occurrence-probability threshold for inclusion increased, the sensitivity of the predictive model in particular first increased and then decreased. Many common taxa with intermediate type-specific occurrence probabilities were consistently missing from impacted sites, a result suggesting that these taxa may be especially important in detecting human disturbance. We conclude that if a typology-based approach such as that suggested by the European Union's Water Framework Directive is required, the O/E ratio of type-specific taxa can be a useful metric for assessing the status of riffle macroinvertebrate communities. Successful application of the approach, however, requires biologically meaningful river types with a sufficient pool of reference sites for each type.
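    The O/E metric with a rare-taxa inclusion threshold can be sketched as follows (a generic illustration; the taxon names in the example are made up):

```python
def o_over_e(observed_taxa, expected_probs, p_min=0.0):
    """Ratio of observed-to-expected number of predicted taxa.
    `expected_probs` maps each type-specific taxon to its occurrence
    probability at reference sites; taxa with probability below
    `p_min` are excluded (the rare-taxa inclusion threshold)."""
    retained = {t: p for t, p in expected_probs.items() if p >= p_min}
    expected = sum(retained.values())  # E = sum of occurrence probabilities
    observed = sum(1 for t in observed_taxa if t in retained)  # O = taxa found
    return observed / expected
```

An O/E well below 1 indicates that taxa expected under reference conditions are missing, the signal used to flag human impairment.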

  13. Limited angle breast ultrasound tomography with a priori information and artifact removal

    NASA Astrophysics Data System (ADS)

    Jintamethasawat, Rungroj; Zhu, Yunhao; Kripfgans, Oliver D.; Yuan, Jie; Goodsitt, Mitchell M.; Carson, Paul L.

    2017-03-01

    In B-mode images from dual-sided ultrasound, it has been shown that by delineating structures suspected of being relatively homogeneous, one can enhance limited angle tomography to produce speed of sound (SOS) images in the same view as X-ray Digital Breast Tomography (DBT). This could allow better breast cancer detection and discrimination, as well as improved registration of the ultrasound and X-ray images, because of the similarity of SOS and X-ray contrast in the breast. However, this speed of sound reconstruction method relies strongly on B-mode or other reflection-mode segmentation; if that information is limited or incorrect, artifacts will appear in the reconstructed images. Therefore, the iterative speed of sound reconstruction algorithm has been modified to simultaneously utilize the image segmentations and remove most artifacts. The first step, incorporating the a priori information, can be solved by any nonlinear, nonconvex optimization method, while artifact removal is accomplished by employing the fast split Bregman method to perform total-variation (TV) regularization for image denoising. The proposed method was demonstrated in simplified simulations of our dual-sided ultrasound scanner. To speed these computations, two opposed 40-element ultrasound linear arrays with 0.5 MHz center frequency were simulated for imaging objects in a uniform background. The proposed speed of sound reconstruction method worked well with both bent-ray and full-wave inversion methods. This is also the first demonstration of successful full-wave medical ultrasound tomography in the limited angle geometry. The presented results lend credibility to a possible translation of this method to clinical breast imaging.

  14. AN A PRIORI INVESTIGATION OF ASTROPHYSICAL FALSE POSITIVES IN GROUND-BASED TRANSITING PLANET SURVEYS

    SciTech Connect

    Evans, Tom M.; Sackett, Penny D.

    2010-03-20

    Astrophysical false positives due to stellar eclipsing binaries pose one of the greatest challenges to ground-based surveys for transiting hot Jupiters. We have used known properties of multiple star systems and hot Jupiter systems to predict, a priori, the number of such false detections and the number of genuine planet detections recovered in two hypothetical but realistic ground-based transit surveys targeting fields close to the galactic plane (b ≈ 10°): a shallow survey covering a magnitude range 10 < V < 13 and a deep survey covering a magnitude range 15 < V < 19. Our results are consistent with the commonly reported experience of false detections outnumbering planet detections by a factor of ≈10 in shallow surveys, while in our synthetic deep survey we find ≈1-2 false detections for every planet detection. We characterize the eclipsing binary configurations that are most likely to cause false detections and find that they can be divided into three main types: (1) two dwarfs undergoing grazing transits, (2) two dwarfs undergoing low-latitude transits in which one component has a substantially smaller radius than the other, and (3) two eclipsing dwarfs blended with one or more physically unassociated foreground stars. We also predict that a significant fraction of hot Jupiter detections are blended with the light from other stars, showing that care must be taken to identify the presence of any unresolved neighbors in order to obtain accurate estimates of planetary radii. This issue is likely to extend to terrestrial planet candidates in the CoRoT and Kepler transit surveys, for which neighbors of much fainter relative brightness will be important.

  15. A priori patient-specific collision avoidance in radiotherapy using consumer grade depth cameras.

    PubMed

    Cardan, Rex A; Popple, Richard A; Fiveash, John

    2017-07-01

    In this study, we demonstrate and evaluate a low-cost, fast, and accurate collision avoidance framework for radiotherapy treatments. Furthermore, we provide an implementation which is patient-specific and can be carried out during the normal simulation process. Four patients and a treatment unit were scanned with a set of consumer depth cameras to create a polygon mesh of each object. Using a fast polygon interference algorithm, the models were virtually collided to map out feasible treatment positions of the couch and gantry. The actual physical collision space was then mapped in the treatment room by moving the gantry and couch until a collision occurred with either the patient or hardware. The physical and virtual collision spaces were then compared to determine the accuracy of the system. To improve the collision predictions, a buffer geometry was added to the scanned gantry mesh and performance was assessed as a function of buffer thickness. Each patient was optically scanned during simulation in less than 1 min. The average time to virtually map the collision space for 64,800 gantry/couch states was 5.40 ± 2.88 s. The system had an average raw accuracy and negative prediction rate (NPR) across all patients of 97.3% ± 2.4% and 96.9% ± 2.2%, respectively. Using a polygon buffer of 6 cm over the gantry geometry, the NPR was raised to unity for all patients, signifying the detection of all collision events. However, the average accuracy fell from 95.3% ± 3.1% to 91.5% ± 3.6% between the 3 and 6 cm buffers as more false positives were detected. We successfully demonstrated a fast and low-cost framework which can map an entire collision space a priori for a given patient at the time of simulation. All collisions can be avoided using polygon interference, but a polygon buffer may be required to account for geometric uncertainties of scanned objects. © 2017 American Association of Physicists in Medicine.
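    The effect of the buffer on collision detection can be illustrated with a deliberately crude stand-in for polygon interference: a minimum vertex-to-vertex distance test between two scanned meshes, treated here as point clouds (all geometry and thresholds are illustrative, not the paper's algorithm):

```python
import numpy as np

def collides(mesh_a, mesh_b, buffer=0.0):
    """Conservative collision test between two vertex sets: report a
    collision whenever the minimum vertex-to-vertex distance falls
    below `buffer`. A crude stand-in for true polygon interference;
    growing the buffer absorbs geometric uncertainty in the scans,
    mirroring the paper's buffer-thickness tradeoff."""
    a = np.asarray(mesh_a)[:, None, :]   # (Na, 1, 3)
    b = np.asarray(mesh_b)[None, :, :]   # (1, Nb, 3)
    dmin = np.sqrt(((a - b) ** 2).sum(axis=2)).min()
    return dmin <= buffer
```

As in the study, a thicker buffer trades false positives (states flagged as colliding that are actually clear) for a negative prediction rate of unity.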

  16. Optical tomographic imaging of activation of the infant auditory cortex using perturbation Monte Carlo with anatomical a priori information

    NASA Astrophysics Data System (ADS)

    Heiskala, Juha; Kotilahti, Kalle; Lipiäinen, Lauri; Hiltunen, Petri; Grant, P. Ellen; Nissilä, Ilkka

    2007-07-01

    We have developed a perturbation Monte Carlo method for calculating forward and inverse solutions to the optical tomography imaging problem in the presence of anatomical a priori information. The method uses frequency domain data. In the present work, we consider the problem of imaging hemodynamic changes due to brain activation in the infant brain. We test finite element method and Monte Carlo based implementations using a homogeneous model with the exterior of the domain warped to match digitized points on the skin. With the perturbation Monte Carlo model, we also test a heterogeneous model based on anatomical a priori information derived from a previously recorded infant T1 magnetic resonance (MR) image. Our simulations show that the anatomical information improves the accuracy of reconstructions quite significantly even if the anatomical MR images are based on another infant. This suggests that significant benefits can be obtained by the use of generic infant brain atlas information in near-infrared spectroscopy and optical tomography studies.

  17. A priori error estimates for an hp-version of the discontinuous Galerkin method for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Bey, Kim S.; Oden, J. Tinsley

    1993-01-01

    A priori error estimates are derived for hp-versions of the finite element method for discontinuous Galerkin approximations of a model class of linear, scalar, first-order hyperbolic conservation laws. These estimates are derived in a mesh-dependent norm in which the coefficients depend upon both the local mesh size h_K and a number p_K which can be identified with the spectral order of the local approximations over each element.
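
    In symbols, estimates of this kind take the following schematic form; the exponents below are indicative of typical hp-version DG rates for smooth solutions, not quoted from the paper:

```latex
\| u - u_{hp} \|_{h}
  \;\le\;
  C \left( \sum_{K} \frac{h_K^{\,2\mu_K - 1}}{p_K^{\,2s - 1}}
  \, \| u \|_{H^{s}(K)}^{2} \right)^{1/2},
  \qquad \mu_K = \min(p_K + 1,\; s),
```

    so that shrinking the local mesh size h_K or raising the local spectral order p_K each reduce the bound, and the two refinement strategies can be mixed element by element.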

  18. Tomographic inversion of time-domain resistivity and chargeability data for the investigation of landfills using a priori information.

    PubMed

    De Donno, Giorgio; Cardarelli, Ettore

    2017-01-01

    In this paper, we present a new code for the modelling and inversion of resistivity and chargeability data that uses a priori information to improve the accuracy of the reconstructed model of a landfill. When a priori information is available in the study area, it can be incorporated by means of inequality constraints on the whole model or on a single layer, or by assigning weighting factors that enhance anomalies elongated in the horizontal or vertical direction. However, when we have to face a multilayered scenario with numerous resistive-to-conductive transitions (the case of controlled landfills), the effective thickness of the layers can be biased. The presented code includes a model-tuning scheme, which is applied after the inversion of field data: the inversion of synthetic data is performed based on an initial guess, and the absolute difference between the field and synthetic inverted models is minimized. The reliability of the proposed approach is supported by two real-world examples; we were able to identify an unauthorized landfill and to reconstruct the geometrical and physical layout of an old waste dump. The combined analysis of the resistivity and (normalised) chargeability models helps us to remove ambiguity due to the presence of the waste mass. Nevertheless, the presence of certain layers can remain hidden without a priori information, as demonstrated by a comparison of the constrained inversion with a standard inversion. The robustness of the above-cited method (a priori information in combination with model tuning) has been validated against the cross-section from the construction plans, where the reconstructed model is in agreement with the original design. Copyright © 2016 Elsevier Ltd. All rights reserved.
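
    Inequality-constrained, damped least-squares inversion of the kind described can be sketched with SciPy's bound-constrained solver. This is a toy 1-D problem with an assumed forward operator and assumed bounds, not the authors' code:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy problem: recover a blocky resistivity-like profile m from smoothed data
# d = G m. A priori information enters as inequality (bound) constraints,
# e.g. "layers 3-4 are known to be conductive".
rng = np.random.default_rng(0)
n = 8
G = np.tril(np.ones((n, n))) / n                  # toy smoothing forward operator
m_true = np.array([10.0, 10.0, 0.5, 0.5, 10.0, 10.0, 0.5, 0.5])
d = G @ m_true + 0.01 * rng.standard_normal(n)

lam = 1e-3                                        # Tikhonov damping weight
A = np.vstack([G, np.sqrt(lam) * np.eye(n)])      # stack damping into the system
b = np.concatenate([d, np.zeros(n)])

lb = np.zeros(n)                                  # resistivity is non-negative
ub = np.full(n, 50.0)
ub[2:4] = 1.0                                     # a priori: layers 3-4 conductive
m_est = lsq_linear(A, b, bounds=(lb, ub)).x
```

    Tightening the bounds on individual layers is the 1-D analogue of the per-layer inequality constraints above; without them, the smoothing operator tends to smear the resistive-to-conductive transitions.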

  19. A priori parameter estimates for global hydrological modeling using geographically based information: Application of the CREST hydrologic model

    NASA Astrophysics Data System (ADS)

    Gao, Z.; Zhang, K.; Xue, X.; Huang, J.; Hong, Y.

    2016-12-01

    Floods are among the most common natural disasters, with worldwide impacts that cause significant humanitarian and economic losses. The increasing availability of satellite-based precipitation estimates and geospatial datasets with global coverage and improved temporal resolution has enhanced our capability to forecast floods and monitor water resources across the world. This study presents an approach combining physically based and empirical methods for a priori parameter estimation, and a parameter dataset for the Coupled Routing and Excess Storage (CREST) hydrological model at the global scale. This approach takes advantage of geographic information such as topography, land cover, and soil properties to derive distributed parameter values across the world. The main objective of this study is to evaluate the utility of a priori parameter estimates for improving the performance of the CREST distributed hydrologic model and enabling its prediction at poorly gauged or ungauged catchments. Using the CREST hydrologic model, several typical river basins on different continents were selected to serve as test areas. The results show that daily stream flows simulated using the parameters derived from geographically based information outperform those obtained with lumped parameters. Overall, this early study highlights that a priori parameter estimation for hydrologic models improves predictive capability in ungauged basins at regional to global scales.
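
    The idea of deriving distributed parameters from geographic layers can be illustrated with a pedotransfer-style mapping. Both the functional form and the coefficients below are hypothetical, chosen only to show the pattern of turning gridded soil texture into a gridded model parameter; they are not the CREST relations:

```python
import numpy as np

def ksat_from_texture(sand_frac, clay_frac, k0=25.0, a=1.5, b=2.0):
    """Hypothetical pedotransfer relation: saturated hydraulic conductivity
    (mm/h) increases with sand fraction and decreases with clay fraction.
    Coefficients k0, a, b are illustrative placeholders."""
    return k0 * (1.0 + sand_frac) ** a * (1.0 - clay_frac) ** b

sand = np.array([[0.7, 0.2], [0.4, 0.1]])         # gridded sand fraction
clay = np.array([[0.1, 0.5], [0.3, 0.6]])         # gridded clay fraction
ksat = ksat_from_texture(sand, clay)              # distributed a priori parameter grid
```

    Because the mapping is applied cell by cell to globally available layers, the same recipe yields parameter grids for gauged and ungauged basins alike, which is what enables prediction in ungauged catchments.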

  20. Evaluating a Priori Ozone Profile Information Used in TEMPO (Tropospheric Emissions: Monitoring of Pollution) Tropospheric Ozone Retrievals

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew Stephen

    2017-01-01

    A primary objective for TOLNet is the evaluation and validation of space-based tropospheric O3 retrievals from future systems such as the Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite. This study is designed to evaluate the tropopause-based O3 climatology (TB-Clim) dataset which will be used as the a priori profile information in TEMPO O3 retrievals. This study also evaluates model-simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time (NRT) data assimilation model products (NASA Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS-5) Forward Processing (FP) and Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA2)) and full chemical transport model (CTM), GEOS-Chem, simulations. The TB-Clim dataset and model products are evaluated with surface (0-2 km) and tropospheric (0-10 km) TOLNet observations to demonstrate the accuracy of the suggested a priori dataset and information which could potentially be used in TEMPO O3 algorithms. This study also presents the impact of individual a priori profile sources on the accuracy of theoretical TEMPO O3 retrievals in the troposphere and at the surface. Preliminary results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles observed by TOLNet, model-simulated profiles from a full CTM (GEOS-Chem is used as a proxy for CTM O3 predictions) resulted in more accurate tropospheric and surface-level O3 retrievals from TEMPO when compared to hourly (diurnal cycle evaluation) and daily-averaged (daily variability evaluation) TOLNet observations. Furthermore, it was determined that when large daily-averaged surface O3 mixing ratios are observed (>65 ppb), which are important for air quality purposes, TEMPO retrieval values at the surface display higher correlations and less bias when applying CTM a priori profile information.

  1. A Whole-Brain Voxel Based Measure of Intrinsic Connectivity Contrast Reveals Local Changes in Tissue Connectivity with Anesthetic without A Priori Assumptions on Thresholds or Regions of Interest

    PubMed Central

    Martuzzi, Roberto; Ramani, Ramachandran; Qiu, Maolin; Shen, Xilin; Papademetris, Xenophon; Constable, R. Todd

    2011-01-01

    The analysis of spontaneous fluctuations of functional magnetic resonance imaging (fMRI) signals has recently gained attention as a powerful tool for investigating brain circuits in a non-invasive manner. Correlation-based connectivity analysis investigates the correlations of spontaneous fluctuations of the fMRI signal either between a single seed region of interest (ROI) and the rest of the brain, or between multiple ROIs. To do this, a priori knowledge is required to define the ROI(s); without such knowledge, functional connectivity fMRI cannot be used as an exploratory tool for investigating the functional organization of the brain and its modulation under different conditions. In this work we examine two indices that provide voxel-based maps reflecting the intrinsic connectivity contrast (ICC) of individual tissue elements without the need for defining ROIs, and hence require no a priori information or assumptions. These voxel-based ICC measures can also be used to delineate regions of interest for further functional or network analyses. The indices were applied to the study of sevoflurane anesthesia-induced alterations in intrinsic connectivity. In concordance with previous studies, the results show that sevoflurane affects different brain circuits in a heterogeneous manner. In addition, ICC analyses revealed changes in regions not previously identified using conventional ROI connectivity analyses, probably because of an inappropriate choice of the ROI in the earlier studies. This work highlights the importance of such voxel-based connectivity methodology. PMID:21763437
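
    A voxelwise connectivity-contrast measure of this family can be sketched as each voxel's mean squared correlation with every other voxel, computed without choosing any seed. This is a minimal sketch of that idea on a toy data matrix, not the authors' exact ICC definition:

```python
import numpy as np

def icc_map(X):
    """Voxelwise connectivity contrast from an fMRI data matrix X of shape
    (n_voxels, n_timepoints): for each voxel, the mean squared correlation
    with all other voxels, with no ROI required."""
    n_vox, n_t = X.shape
    Z = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
    R = (Z @ Z.T) / n_t                     # full voxel-by-voxel correlation matrix
    R2 = R ** 2
    np.fill_diagonal(R2, 0.0)               # exclude each voxel's self-correlation
    return R2.sum(axis=1) / (n_vox - 1)

rng = np.random.default_rng(1)
data = rng.standard_normal((50, 200))       # 50 voxels, 200 timepoints
icc = icc_map(data)
```

    For whole-brain data the full correlation matrix is too large to hold in memory; the same quantity can be accumulated over voxel chunks, since only row sums of R squared are needed.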

  2. Using models to guide field experiments: a priori predictions for the CO 2 response of a nutrient- and water-limited native Eucalypt woodland

    DOE PAGES

    Medlyn, Belinda E.; De Kauwe, Martin G.; Zaehle, Sönke; ...

    2016-05-09

    One major uncertainty in Earth System models is the response of terrestrial ecosystems to rising atmospheric CO2 concentration (Ca), particularly under nutrient-limited conditions. The Eucalyptus Free-Air CO2 Enrichment (EucFACE) experiment, recently established in a nutrient- and water-limited woodland, presents a unique opportunity to address this uncertainty, but can best do so if key model uncertainties have been identified in advance. We applied seven vegetation models, which have previously been comprehensively assessed against earlier forest FACE experiments, to simulate a priori possible outcomes from EucFACE. Our goals were to provide quantitative projections against which to evaluate data as they are collected, and to identify key measurements that should be made in the experiment to allow discrimination among alternative model assumptions in a postexperiment model intercomparison. Simulated responses of annual net primary productivity (NPP) to elevated Ca ranged from 0.5 to 25% across models. The simulated reduction of NPP during a low-rainfall year also varied widely, from 24 to 70%. Key processes where assumptions caused disagreement among models included nutrient limitations to growth; feedbacks to nutrient uptake; autotrophic respiration; and the impact of low soil moisture availability on plant processes. Knowledge of the causes of variation among models is now guiding data collection in the experiment, with the expectation that the experimental data can optimally inform future model improvements.

  3. Using models to guide field experiments: a priori predictions for the CO2 response of a nutrient- and water-limited native Eucalypt woodland.

    PubMed

    Medlyn, Belinda E; De Kauwe, Martin G; Zaehle, Sönke; Walker, Anthony P; Duursma, Remko A; Luus, Kristina; Mishurov, Mikhail; Pak, Bernard; Smith, Benjamin; Wang, Ying-Ping; Yang, Xiaojuan; Crous, Kristine Y; Drake, John E; Gimeno, Teresa E; Macdonald, Catriona A; Norby, Richard J; Power, Sally A; Tjoelker, Mark G; Ellsworth, David S

    2016-08-01

    The response of terrestrial ecosystems to rising atmospheric CO2 concentration (Ca), particularly under nutrient-limited conditions, is a major uncertainty in Earth System models. The Eucalyptus Free-Air CO2 Enrichment (EucFACE) experiment, recently established in a nutrient- and water-limited woodland, presents a unique opportunity to address this uncertainty, but can best do so if key model uncertainties have been identified in advance. We applied seven vegetation models, which have previously been comprehensively assessed against earlier forest FACE experiments, to simulate a priori possible outcomes from EucFACE. Our goals were to provide quantitative projections against which to evaluate data as they are collected, and to identify key measurements that should be made in the experiment to allow discrimination among alternative model assumptions in a postexperiment model intercomparison. Simulated responses of annual net primary productivity (NPP) to elevated Ca ranged from 0.5 to 25% across models. The simulated reduction of NPP during a low-rainfall year also varied widely, from 24 to 70%. Key processes where assumptions caused disagreement among models included nutrient limitations to growth; feedbacks to nutrient uptake; autotrophic respiration; and the impact of low soil moisture availability on plant processes. Knowledge of the causes of variation among models is now guiding data collection in the experiment, with the expectation that the experimental data can optimally inform future model improvements. © 2016 John Wiley & Sons Ltd.

  4. Testing the hypothesis of neurodegeneracy in respiratory network function with a priori transected arterially perfused brain stem preparation of rat

    PubMed Central

    Jones, Sarah E.

    2016-01-01

    Degeneracy of respiratory network function would imply that anatomically discrete aspects of the brain stem are capable of producing respiratory rhythm. To test this theory we a priori transected brain stem preparations before reperfusion and reoxygenation at 4 rostrocaudal levels: 1.5 mm caudal to obex (n = 5), at obex (n = 5), and 1.5 (n = 7) and 3 mm (n = 6) rostral to obex. The respiratory activity of these preparations was assessed via recordings of phrenic and vagal nerves and lumbar spinal expiratory motor output. Preparations with a priori transection at level of the caudal brain stem did not produce stable rhythmic respiratory bursting, even when the arterial chemoreceptors were stimulated with sodium cyanide (NaCN). Reperfusion of brain stems that preserved the pre-Bötzinger complex (pre-BötC) showed spontaneous and sustained rhythmic respiratory bursting at low phrenic nerve activity (PNA) amplitude that occurred simultaneously in all respiratory motor outputs. We refer to this rhythm as the pre-BötC burstlet-type rhythm. Conserving circuitry up to the pontomedullary junction consistently produced robust high-amplitude PNA at lower burst rates, whereas sequential motor patterning across the respiratory motor outputs remained absent. Some of the rostrally transected preparations expressed both burstlet-type and regular PNA amplitude rhythms. Further analysis showed that the burstlet-type rhythm and high-amplitude PNA had 1:2 quantal relation, with burstlets appearing to trigger high-amplitude bursts. We conclude that no degenerate rhythmogenic circuits are located in the caudal medulla oblongata and confirm the pre-BötC as the primary rhythmogenic kernel. The absence of sequential motor patterning in a priori transected preparations suggests that pontine circuits govern respiratory pattern formation. PMID:26888109

  5. A priori identifiability of a one-compartment model with two input functions for liver blood flow measurements

    NASA Astrophysics Data System (ADS)

    Becker, Georg A.; Müller-Schauenburg, Wolfgang; Spilker, Mary E.; Machulla, Hans-Jürgen; Piert, Morand

    2005-04-01

    An extended dual-input Kety-Schmidt model can be applied to positron emission tomography data for the quantification of local arterial (fa) and local portal-venous blood flow (fp) in the liver by freely diffusible tracers (e.g., [15O]H2O). We investigated the a priori identifiability of the three-parameter model (fa, fp and distribution volume (Vd)) under ideal (noise-free) conditions. The results indicate that the full identifiability of the model depends on the form of the portal-venous input function (cp(t)), which is assumed to be a sum of m exponentials convolved with the arterial input function (ca(t)). When m >= 2, all three model parameters are uniquely identifiable. For m = 1, identifiability of fp fails if cp(t) coincides with the tissue concentration (q(t)/Vd), which occurs if cp(t) is generated from an intestinal compartment with transit time Vd/fa. Any portal input, fp cp(t), is then balanced by the portal contribution, fp q(t)/Vd, to the liver efflux, leaving q(t) unchanged by fp, so that only fa and Vd are a priori uniquely identifiable. An extension to this condition of unidentifiability is obtained if we leave the assumption of a generating intestinal compartment system and allow for an arbitrary proportionality constant between cp(t) and q(t). In this case, only fa remains a priori uniquely identifiable. These findings provide important insights into the behaviour and identifiability of the model applied to the unique liver environment.
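
    A plausible way to write the model, consistent with the influx and efflux terms named above (this reconstruction is an assumption, not an equation quoted from the paper):

```latex
\frac{dq(t)}{dt}
  \;=\; f_a\, c_a(t) \;+\; f_p\, c_p(t) \;-\; \frac{f_a + f_p}{V_d}\, q(t).
```

    If c_p(t) = q(t)/V_d, the portal influx f_p c_p(t) cancels the portal share f_p q(t)/V_d of the efflux term by term, so q(t) carries no information about f_p; this is exactly the unidentifiability condition described above.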

  6. Regional Travel-Time Uncertainty and Seismic Location Improvement Using a Three- Dimensional a priori Velocity Model

    NASA Astrophysics Data System (ADS)

    Flanagan, M. P.; Myers, S. C.; Koper, K. D.

    2006-12-01

    We demonstrate our ability to improve regional travel-time prediction and seismic event location accuracy using an a priori, three-dimensional velocity model of Western Eurasia and North Africa (WENA1.0). Travel-time residuals are assessed relative to the iasp91 model for approximately 6,000 Pg, Pn, and P arrivals from seismic events having 2-sigma epicenter accuracy between 1 km and 25 km (GT1 and GT25, respectively), recorded at 39 stations throughout the model region. Ray paths range in length between 0 and 40 degrees epicentral distance (local, regional, and near-teleseismic), providing depth sounding that spans the crust and upper mantle. The dataset also provides representative geographic sampling across Eurasia and North Africa, including aseismic areas. The WENA1.0 model markedly improves travel-time predictions for most stations, with an average variance reduction of 29% for all ray paths from the GT25 events; when we consider GT5 and better events alone, the variance reduction is 49%. For location tests we use 196 geographically distributed GT5 and better events. In 134 cases (68% of the events), locations are improved, and the average mislocation is reduced from 24.9 km to 17.7 km. We develop a travel-time uncertainty model that is used to calculate location coverage ellipses. The coverage ellipses for WENA1.0 are validated to be representative of epicenter error and are smaller than those for iasp91 by 37%. We conclude that a priori models are directly applicable where data coverage limits tomographic and empirical approaches, and that the development of the uncertainty model enables merging of a priori and data-driven approaches using Bayesian techniques. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-JRNL-220179.
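
    The variance-reduction statistic quoted above has a standard definition: one minus the ratio of residual variances under the candidate and reference models. A minimal sketch with synthetic residuals (the arrays and their spreads are assumptions, not the study's data):

```python
import numpy as np

def variance_reduction(res_ref, res_new):
    """Fractional reduction in travel-time residual variance of a candidate
    model (e.g. WENA1.0) relative to a reference model (e.g. iasp91)."""
    return 1.0 - np.var(res_new) / np.var(res_ref)

rng = np.random.default_rng(2)
res_iasp91 = rng.normal(0.0, 2.0, 6000)     # synthetic reference residuals, s
res_wena = rng.normal(0.0, 1.7, 6000)       # synthetic candidate residuals, s
vr = variance_reduction(res_iasp91, res_wena)
```

    A value of 0.29 means the candidate model's residuals have 29% less variance than the reference model's over the same arrivals.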

  7. Sensitivity and a priori uncertainty analysis of the CFRMF central flux spectrum

    SciTech Connect

    Ryskamp, J.M.; Anderl, R.A.; Broadhead, B.L.; Ford, W.E. III; Lucius, J.L.; Marable, J.H.

    1980-01-01

    The Coupled Fast Reactivity Measurements Facility (CFRMF), located at the Idaho National Engineering Laboratory, is a zoned-core critical assembly with a fast-neutron-spectrum zone in the center of an enriched 235U, water-moderated thermal driver. An accurate knowledge of the central neutron spectrum is important to data-testing analyses which utilize integral reaction-rate data measured for samples placed in the CFRMF field. The purpose of this paper is to present the results of a study made with the AMPX-II and FORSS code systems to determine the central-spectrum flux covariance matrix due to uncertainties and correlations in the nuclear data for the materials which comprise the facility.

  8. Parametric Study of Urban-Like Topographic Statistical Moments Relevant to a Priori Modelling of Bulk Aerodynamic Parameters

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaowei; Iungo, G. Valerio; Leonardi, Stefano; Anderson, William

    2017-02-01

    For a horizontally homogeneous, neutrally stratified atmospheric boundary layer (ABL), aerodynamic roughness length, z_0, is the effective elevation at which the streamwise component of mean velocity is zero. A priori prediction of z_0 based on topographic attributes remains an open line of inquiry in planetary boundary-layer research. Urban topographies - the topic of this study - exhibit spatial heterogeneities associated with variability of building height, width, and proximity with adjacent buildings; such variability renders a priori, prognostic z_0 models appealing. Here, large-eddy simulation (LES) has been used in an extensive parametric study to characterize the ABL response (and z_0) to a range of synthetic, urban-like topographies wherein statistical moments of the topography have been systematically varied. Using LES results, we determined the hierarchical influence of topographic moments relevant to setting z_0. We demonstrate that standard deviation and skewness are important, while kurtosis is negligible. This finding is reconciled with a model recently proposed by Flack and Schultz (J Fluids Eng 132:041203-1-041203-10, 2010), who demonstrate that z_0 can be modelled with standard deviation and skewness, and two empirical coefficients (one for each moment). We find that the empirical coefficient related to skewness is not constant, but exhibits a dependence on standard deviation over certain ranges. For idealized, quasi-uniform cubic topographies and for complex, fully random urban-like topographies, we demonstrate strong performance of the generalized Flack and Schultz model against contemporary roughness correlations.
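
    The two-coefficient roughness model discussed above can be sketched directly. The commonly quoted fitted constants from Flack and Schultz (2010) are used here as assumptions; as the study notes, the skewness coefficient is not in fact constant over all ranges of the standard deviation:

```python
def ks_flack_schultz(sigma, skew, A=4.43, B=1.37):
    """Equivalent sand-grain roughness from the rms roughness height (sigma)
    and surface skewness (skew), in the two-coefficient Flack-Schultz form
    k_s = A * sigma * (1 + skew)**B. Constants A and B are the commonly
    quoted fits and are assumptions here."""
    return A * sigma * (1.0 + skew) ** B

# Positively skewed (peak-dominated) surfaces act rougher than symmetric ones:
ks_symmetric = ks_flack_schultz(sigma=1.0, skew=0.0)
ks_peaky = ks_flack_schultz(sigma=1.0, skew=0.5)
```

    Note that kurtosis does not appear at all, consistent with the finding above that it is negligible for setting z_0.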

  9. A priori evaluation of two-stage cluster sampling for accuracy assessment of large-area land-cover maps

    USGS Publications Warehouse

    Wickham, J.D.; Stehman, S.V.; Smith, J.H.; Wade, T.G.; Yang, L.

    2004-01-01

    Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, within-cluster correlation may reduce the precision of the accuracy estimates. The detailed population information needed to quantify a priori the effect of within-cluster correlation on precision is typically unavailable. Consequently, a convenient, practical approach to evaluate the likely performance of a two-stage cluster sample is needed. We describe such an a priori evaluation protocol, focusing on the spatial distribution of the sample by land-cover class across different cluster sizes and the costs of different sampling options, including options not imposing clustering. This protocol also assesses the two-stage design's adequacy for estimating the precision of accuracy estimates for rare land-cover classes. We illustrate the approach using two large-area, regional accuracy assessments from the National Land-Cover Data (NLCD), and describe how the a priori evaluation was used as a decision-making tool when implementing the NLCD design.
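
    The precision cost of within-cluster correlation is conventionally summarized by the Kish design effect, which a quick a priori calculation can apply before any reference data are collected. A minimal sketch (the cluster size and correlation values are illustrative):

```python
def design_effect(mean_cluster_size, icc):
    """Kish design effect for cluster sampling: the factor by which the
    variance of an estimate is inflated relative to simple random sampling,
    given the within-cluster (intraclass) correlation `icc`."""
    return 1.0 + (mean_cluster_size - 1.0) * icc

def effective_sample_size(n, mean_cluster_size, icc):
    """Number of independent samples the clustered sample is worth."""
    return n / design_effect(mean_cluster_size, icc)

# 1000 reference pixels in clusters of 25 with modest spatial correlation:
n_eff = effective_sample_size(1000, 25, 0.1)
```

    Even a modest intraclass correlation of 0.1 with 25 pixels per cluster inflates variance by a factor of 3.4, so the 1000 clustered pixels carry the information of roughly 294 independent ones. This is the kind of trade-off the a priori evaluation protocol weighs against the reduced collection cost.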

  10. Use of a priori spectral information in the measurement of x-ray flux with filtered diode arrays

    NASA Astrophysics Data System (ADS)

    Marrs, R. E.; Widmann, K.; Brown, G. V.; Heeter, R. F.; MacLaren, S. A.; May, M. J.; Moore, A. S.; Schneider, M. B.

    2015-10-01

    Filtered x-ray diode (XRD) arrays are often used to measure x-ray spectra vs. time from spectrally continuous x-ray sources such as hohlraums. A priori models of the incident x-ray spectrum enable a more accurate unfolding of the x-ray flux as compared to the standard technique of modifying a thermal Planckian with spectral peaks or dips at the response energy of each filtered XRD channel. A model x-ray spectrum consisting of a thermal Planckian, a Gaussian at higher energy, and (in some cases) a high energy background provides an excellent fit to XRD-array measurements of x-ray emission from laser heated hohlraums. If high-resolution measurements of part of the x-ray emission spectrum are available, that information can be included in the a priori model. In cases where the x-ray emission spectrum is not Planckian, candidate x-ray spectra can be allowed or excluded by fitting them to measured XRD voltages. Examples are presented from the filtered XRD arrays, named Dante, at the National Ignition Facility and the Laboratory for Laser Energetics.
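
    The a priori spectral model described above (thermal Planckian plus a higher-energy Gaussian) and the resulting channel signals can be sketched as a forward model. All parameter values and the flat channel response below are assumptions for illustration:

```python
import numpy as np

def planckian(E, T):
    """Unnormalized thermal (Planckian) spectral shape vs photon energy E (keV)
    at radiation temperature T (keV)."""
    return E ** 3 / np.expm1(E / T)

def model_spectrum(E, T, amp, A_g, E0, w):
    """A priori spectral model: Planckian plus a higher-energy Gaussian."""
    return amp * planckian(E, T) + A_g * np.exp(-0.5 * ((E - E0) / w) ** 2)

def channel_voltage(response, spectrum, E):
    """One XRD channel's signal: the spectrum integrated against the channel's
    (assumed known) spectral response, via the trapezoid rule."""
    y = response * spectrum
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(E)))

E = np.linspace(0.05, 20.0, 2000)               # photon energy grid, keV
spec = model_spectrum(E, T=0.25, amp=1.0, A_g=0.02, E0=3.0, w=0.5)
v = channel_voltage(np.ones_like(E), spec, E)   # flat-response channel, illustrative
```

    Fitting the few model parameters (T, amp, A_g, E0, w) to the measured channel voltages replaces the channel-by-channel peak/dip modification of a Planckian that the standard unfold uses.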

  11. Use of a priori spectral information in the measurement of x-ray flux with filtered diode arrays.

    PubMed

    Marrs, R E; Widmann, K; Brown, G V; Heeter, R F; MacLaren, S A; May, M J; Moore, A S; Schneider, M B

    2015-10-01

    Filtered x-ray diode (XRD) arrays are often used to measure x-ray spectra vs. time from spectrally continuous x-ray sources such as hohlraums. A priori models of the incident x-ray spectrum enable a more accurate unfolding of the x-ray flux as compared to the standard technique of modifying a thermal Planckian with spectral peaks or dips at the response energy of each filtered XRD channel. A model x-ray spectrum consisting of a thermal Planckian, a Gaussian at higher energy, and (in some cases) a high energy background provides an excellent fit to XRD-array measurements of x-ray emission from laser heated hohlraums. If high-resolution measurements of part of the x-ray emission spectrum are available, that information can be included in the a priori model. In cases where the x-ray emission spectrum is not Planckian, candidate x-ray spectra can be allowed or excluded by fitting them to measured XRD voltages. Examples are presented from the filtered XRD arrays, named Dante, at the National Ignition Facility and the Laboratory for Laser Energetics.

  12. A Priori and a Posteriori Dietary Patterns during Pregnancy and Gestational Weight Gain: The Generation R Study.

    PubMed

    Tielemans, Myrte J; Erler, Nicole S; Leermakers, Elisabeth T M; van den Broek, Marion; Jaddoe, Vincent W V; Steegers, Eric A P; Kiefte-de Jong, Jessica C; Franco, Oscar H

    2015-11-12

    Abnormal gestational weight gain (GWG) is associated with adverse pregnancy outcomes. We examined whether dietary patterns are associated with GWG. Participants included 3374 pregnant women from a population-based cohort in the Netherlands. Dietary intake during pregnancy was assessed with food-frequency questionnaires. Three a posteriori-derived dietary patterns were identified using principal component analysis: a "Vegetable, oil and fish", a "Nuts, high-fiber cereals and soy", and a "Margarine, sugar and snacks" pattern. The a priori-defined dietary pattern was based on national dietary recommendations. Weight was repeatedly measured around 13, 20 and 30 weeks of pregnancy; pre-pregnancy and maximum weight were self-reported. Normal weight women with high adherence to the "Vegetable, oil and fish" pattern had higher early-pregnancy GWG than those with low adherence (43 g/week (95% CI 16; 69) for highest vs. lowest quartile (Q)). Adherence to the "Margarine, sugar and snacks" pattern was associated with a higher prevalence of excessive GWG (OR 1.45 (95% CI 1.06; 1.99) Q4 vs. Q1). Normal weight women with higher scores on the "Nuts, high-fiber cereals and soy" pattern had more moderate GWG than women with lower scores (-0.01 (95% CI -0.02; -0.00) per SD). The a priori-defined pattern was not associated with GWG. To conclude, specific dietary patterns may play a role in early pregnancy but are not consistently associated with GWG.

  13. A Priori Analysis of a Compressible Flamelet Model using RANS Data for a Dual-Mode Scramjet Combustor

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Drozda, Tomasz G.; McDaniel, James C.; Lacaze, Guilhem; Oefelein, Joseph

    2015-01-01

    In an effort to make large eddy simulation of hydrocarbon-fueled scramjet combustors more computationally accessible using realistic chemical reaction mechanisms, a compressible flamelet/progress variable (FPV) model was proposed that extends current FPV model formulations to high-speed, compressible flows. Development of this model relied on observations garnered from an a priori analysis of the Reynolds-Averaged Navier-Stokes (RANS) data obtained for the Hypersonic International Flight Research and Experimentation (HIFiRE) dual-mode scramjet combustor. The RANS data were obtained using a reduced chemical mechanism for the combustion of a JP-7 surrogate and were validated using available experimental data. These RANS data were then post-processed to obtain, in an a priori fashion, the scalar fields corresponding to an FPV-based modeling approach. In the current work, in addition to the proposed compressible flamelet model, a standard incompressible FPV model was also considered. Several candidate progress variables were investigated for their ability to recover static temperature and major and minor product species. The effects of pressure and temperature on the tabulated progress variable source term were characterized, and model coupling terms embedded in the Reynolds-averaged Navier-Stokes equations were studied. Finally, results for the novel compressible flamelet/progress variable model were presented to demonstrate the improvement attained by modeling the effects of pressure and flamelet boundary conditions on combustion.
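
    A candidate progress variable of the kind evaluated above is typically a sum of product mass fractions, normalized so it runs from 0 (unburnt) to 1 (fully burnt). The particular combination below is a common generic choice, not the one selected for the HIFiRE analysis:

```python
def progress_variable(Y_CO2, Y_CO, Y_H2O, Y_H2):
    """Candidate progress variable as a sum of product mass fractions
    (a common FPV choice; illustrative only)."""
    return Y_CO2 + Y_CO + Y_H2O + Y_H2

# Normalize by the fully burnt (equilibrium) value so C runs from 0 to 1;
# the mass fractions below are made-up illustrative numbers.
C_raw = progress_variable(0.10, 0.02, 0.08, 0.001)
C_eq = progress_variable(0.12, 0.03, 0.09, 0.002)
C = C_raw / C_eq
```

    A good candidate must vary monotonically along each flamelet so that the table lookup (mixture fraction, progress variable) to (temperature, species) is single-valued; that monotonicity is one of the properties the a priori analysis checks.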

  14. Snow Water Equivalent estimation from AMSR-E data based on priori snow properties in Xinjiang province of China

    NASA Astrophysics Data System (ADS)

    Dai, L.; Che, T.

    2011-12-01

    A novel snow water equivalent (SWE) retrieval algorithm for AMSR-E data is established based on a priori snow conditions (such as snow grain size and density, as well as the stratigraphy of snow) in Xinjiang Province of China. Within the retrieval algorithm, a microwave radiative transfer model (MEMLS) is used to simulate brightness temperature (TB) datasets at 18 and 36 GHz under all kinds of snow conditions, including snow grain size, density, depth, and stratigraphy (namely, the snow layering). Therefore, a series of relationships between snow depth and the difference of TB at these two frequencies can be obtained for different snow grain size, density and stratigraphic conditions. These snow conditions were measured along a fixed route in the study area for a complete snow season. Furthermore, a layering scheme was established based on the snow depth (estimated a priori by an existing snow depth algorithm), while the depth of each layer and its grain size and density were parameterized according to the age of the snow cover. Finally, the SWE can be calculated from snow depth and density. SWE retrieved by this new algorithm at seven meteorological stations in Xinjiang Province from 2003 to 2008 is compared to two existing SWE products, from NSIDC (National Snow and Ice Data Center) and WESTDC (Environmental and Ecological Science Data Center for West China). Three groups of root-mean-squared error (RMSE) and mean error (ME) are calculated between the observed SWE and the three estimated SWE(s), respectively (Table 1). The three groups of RMSE have the same tendency of increasing with increasing mean SWE, while the RMSE(s) from the new algorithm and from the WESTDC are much smaller, and the ME(s) much closer to zero, than those from the NSIDC at all seven stations (see Table 1). The RMSE(s) from the new algorithm are smaller than those from the WESTDC at all seven stations, and the ME(s) are closer to zero at five stations. At the other two stations, the ME(s) are −2.02 mm and −3.05 mm.
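
    The spectral-difference retrieval family this algorithm refines can be sketched with the classic Chang-type form; the fixed coefficient below is the widely cited historical value and stands in for the MEMLS-simulated, condition-specific relationships the study derives:

```python
def snow_depth_cm(tb18h, tb36h, a=1.59, b=0.0):
    """Spectral-difference snow-depth retrieval of the classic Chang-type form
    SD = a * (TB18H - TB36H) + b. The study above replaces the single fixed
    coefficient with relationships simulated by MEMLS for each a priori
    grain-size/density/stratigraphy condition."""
    return a * (tb18h - tb36h) + b

def swe_mm(depth_cm, density_g_cm3):
    """SWE in mm of water: depth (cm) x density (g/cm^3) x 10."""
    return depth_cm * density_g_cm3 * 10.0

sd = snow_depth_cm(245.0, 215.0)      # a 30 K spectral difference
swe = swe_mm(sd, 0.24)                # with an assumed bulk density of 0.24 g/cm^3
```

    Because the scattering signal at 36 GHz depends strongly on grain size and layering, a single global coefficient biases the retrieval; conditioning the depth-TB relationship on a priori snow properties is what reduces the RMSE relative to the NSIDC product.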

  15. A Priori and a Posteriori Dietary Patterns during Pregnancy and Gestational Weight Gain: The Generation R Study

    PubMed Central

    Tielemans, Myrte J.; Erler, Nicole S.; Leermakers, Elisabeth T. M.; van den Broek, Marion; Jaddoe, Vincent W. V.; Steegers, Eric A. P.; Kiefte-de Jong, Jessica C.; Franco, Oscar H.

    2015-01-01

    Abnormal gestational weight gain (GWG) is associated with adverse pregnancy outcomes. We examined whether dietary patterns are associated with GWG. Participants included 3374 pregnant women from a population-based cohort in the Netherlands. Dietary intake during pregnancy was assessed with food-frequency questionnaires. Three a posteriori-derived dietary patterns were identified using principal component analysis: a “Vegetable, oil and fish”, a “Nuts, high-fiber cereals and soy”, and a “Margarine, sugar and snacks” pattern. The a priori-defined dietary pattern was based on national dietary recommendations. Weight was repeatedly measured around 13, 20 and 30 weeks of pregnancy; pre-pregnancy and maximum weight were self-reported. Normal weight women with high adherence to the “Vegetable, oil and fish” pattern had higher early-pregnancy GWG than those with low adherence (43 g/week (95% CI 16; 69) for highest vs. lowest quartile (Q)). Adherence to the “Margarine, sugar and snacks” pattern was associated with a higher prevalence of excessive GWG (OR 1.45 (95% CI 1.06; 1.99) Q4 vs. Q1). Normal weight women with higher scores on the “Nuts, high-fiber cereals and soy” pattern had more moderate GWG than women with lower scores (−0.01 (95% CI −0.02; −0.00) per SD). The a priori-defined pattern was not associated with GWG. To conclude, specific dietary patterns may play a role in early pregnancy but are not consistently associated with GWG. PMID:26569303

  16. Estimating a-priori kinematic wave model parameters based on regionalization for flash flood forecasting in the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Vergara, Humberto; Kirstetter, Pierre-Emmanuel; Gourley, Jonathan J.; Flamig, Zachary L.; Hong, Yang; Arthur, Ami; Kolar, Randall

    2016-10-01

    This study presents a methodology for estimating a priori parameters of the widely used kinematic wave approximation to the unsteady, 1-D Saint-Venant equations for hydrologic flow routing. The approach is based on multi-dimensional statistical modeling of the macro-scale spatial variability of rating curve parameters using a set of geophysical factors, including geomorphology, hydro-climatology and land cover/land use, over the Conterminous United States. The main goal of this study was to enable prediction at ungauged locations through regionalization of model parameters. The results highlight the importance of regional and local geophysical factors in uniquely defining the characteristics of each stream reach, in agreement with the physical theory of fluvial hydraulics. The application of the estimates is demonstrated through a hydrologic modeling evaluation of a deterministic forecasting system performed on 1672 gauged basins and 47,563 events extracted from a 10-year simulation. Considering the mean concentration time of the basins in the study and the target application of flash flood forecasting, the skill of the flow routing simulations is high for peakflow and peakflow-timing estimation, and shows consistency as indicated by the large-sample verification. The resulting a priori estimates can be used in any hydrologic model that employs the kinematic wave model for flow routing. Furthermore, probabilistic estimates of the kinematic wave parameters are enabled by the uncertainty information generated during the multi-dimensional statistical modeling. More importantly, the methodology presented in this study enables the estimation of the kinematic wave model parameters anywhere on the globe, thus allowing flood modeling in ungauged basins at regional to global scales.
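
    A minimal sketch of the regionalization idea, assuming (for illustration only) that the log of a rating-curve parameter is a linear function of geophysical predictors; the predictor set, coefficients, and data are synthetic, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical geophysical predictors per gauged reach
# (e.g. slope, drainage area, land-cover index), standardized.
X = rng.normal(size=(200, 3))
# Synthetic "true" log of a rating-curve parameter (Q = alpha * A**beta).
log_alpha = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.2 * X[:, 2] \
            + rng.normal(scale=0.1, size=200)

# Multi-dimensional statistical model: ordinary least squares with intercept.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, log_alpha, rcond=None)

# A priori parameter estimate at an ungauged reach with known predictors.
x_new = np.array([1.0, 0.5, -0.2, 0.1])
alpha_prior = np.exp(x_new @ coef)
```

    The same fitted model evaluated at any reach's predictors yields an a priori parameter there, which is what enables prediction in ungauged basins; the residual variance of the fit provides the uncertainty information for probabilistic estimates.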

  17. Identification of the flexural stiffness parameters of an orthotropic plate from the local dynamic equilibrium without a priori knowledge of the principal directions

    NASA Astrophysics Data System (ADS)

    Ablitzer, Frédéric; Pézerat, Charles; Lascoup, Bertrand; Brocail, Julien

    2017-09-01

    This paper proposes an inverse method to characterize orthotropic material properties from vibratory measurements on plate-like structures. The method is an adaptation of the Force Analysis Technique (FAT), which was originally developed to identify the external force distribution acting on a structure using its local discretized equation of motion. This method was recently adapted to the identification of elastic and damping properties of isotropic plates. In the present approach, the equation of motion of an orthotropic plate with respect to an arbitrary set of orthogonal axes is considered. The angle between the axes of the measurement mesh and the principal directions of orthotropy therefore explicitly appears in the equation and constitutes an unknown. A procedure to identify this angle together with the flexural stiffness parameters is proposed, as well as an automatic regularization procedure to overcome the high sensitivity of the inverse problem to measurement noise. The method is illustrated using simulated data. Experimental results shown on various structures demonstrate the ability of the method to simultaneously identify the principal orthotropy directions and the flexural stiffness parameters.

  18. Layering ratios: a systematic approach to the inversion of surface wave data in the absence of a priori information

    NASA Astrophysics Data System (ADS)

    Cox, Brady R.; Teague, David P.

    2016-10-01

    Surface wave methods provide a cost-effective means of developing shear wave velocity (Vs) profiles for applications such as dynamic site characterization and seismic site response analyses. However, the inverse problem involved in obtaining a realistic layered earth model from surface wave dispersion data is inherently ill-posed, non-linear and mixed-determined, without a unique solution. When available, a priori information such as geotechnical boreholes or geologic well logs should be used to aid in constraining site-specific inversion parameters. Unfortunately, a priori information is often unavailable, particularly at significant depths, and a `blind analysis' must be performed. In these situations, the analyst must decide on an appropriate number of layers and ranges for their corresponding inversion parameters (i.e. a trial number of layers and ranges in their respective thicknesses, shear wave velocities, compression wave velocities and mass densities). The selection of these parameters has been shown to significantly impact the results of an inversion. This paper presents a method for conducting multiple inversions using systematically varied layering parametrizations in order to identify and encompass the most reasonable layered earth models for a site. Each parametrization is defined by a unique layering ratio, a multiplier that systematically increases the potential thickness of each layer in the inversion parametrization based on the potential thickness of the layer directly above it. The layering ratio method is demonstrated at two sites associated with the InterPacific Project, where it is shown to significantly aid in selecting reasonable Vs profiles that closely represent the subsurface. While the goal of the layering ratio inversion methodology is not necessarily to find the `optimal' or `best' Vs profile for a site, it may be successful at doing so for certain sites/datasets. However, the primary reason for using
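
    The layering-ratio scheme described above can be sketched as follows, assuming the first layer takes a minimum resolvable thickness and each subsequent layer's potential thickness is the layering ratio times that of the layer above (function and variable names are ours, not the authors'):

```python
def layering_parametrization(layering_ratio, min_thickness, max_depth):
    """Generate per-layer potential thicknesses for one inversion
    parametrization, following the layering-ratio idea: each layer's
    potential thickness is the ratio times the layer above it."""
    thicknesses = []
    t, total = min_thickness, 0.0
    while total + t < max_depth:
        thicknesses.append(t)
        total += t
        t *= layering_ratio
    thicknesses.append(max_depth - total)  # final layer reaches max depth
    return thicknesses

layers = layering_parametrization(layering_ratio=2.0, min_thickness=1.0, max_depth=30.0)
```

    Running this for several layering ratios (e.g. 1.5, 2.0, 3.0, ...) produces the family of systematically varied parametrizations that the method inverts and compares.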

  19. Priori mask guided image reconstruction (p-MGIR) for ultra-low dose cone-beam computed tomography.

    PubMed

    Park, Justin C; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Kahler, Darren L; Liu, Chihray; Lu, Bo

    2015-11-07

    Recently, the compressed sensing (CS) based iterative reconstruction method has received attention because of its ability to reconstruct cone beam computed tomography (CBCT) images of good quality from sparsely sampled or noisy projections, thus enabling dose reduction. However, some challenges remain. In particular, there is always a tradeoff between image resolution and noise/streak artifact reduction, governed by the amount of regularization weighting that is applied uniformly across the CBCT volume. The purpose of this study is to develop a novel low-dose CBCT reconstruction algorithm framework, called priori mask guided image reconstruction (p-MGIR), that allows reconstruction of high-quality low-dose CBCT images while preserving image resolution. In p-MGIR, the unknown CBCT volume is mathematically modeled as a combination of two regions: (1) where anatomical structures are complex, and (2) where intensities are relatively uniform. The priori mask, the key concept of the p-MGIR algorithm, is defined as the matrix that distinguishes between the two separate CBCT regions: where the resolution needs to be preserved, and where streaks or noise need to be suppressed. We then alternately update each part of the image by iteratively solving two sub-minimization problems, where one minimization focuses on preserving the edge information of the first part while the other concentrates on removing noise/artifacts from the latter part. To evaluate the performance of the p-MGIR algorithm, a numerical head-and-neck phantom, a Catphan 600 physical phantom, and a clinical head-and-neck cancer case were used for analysis. The results were compared with the standard Feldkamp-Davis-Kress algorithm as well as conventional CS-based algorithms. Examination of the p-MGIR algorithm showed that high-quality low-dose CBCT images can be reconstructed without compromising image resolution. For both the phantom and the patient cases, p-MGIR is able to achieve a clinically
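
    A toy 1-D analogue of the two-region alternating minimization, assuming an identity forward model and a known mask; the weights, step size, and iteration count are illustrative, not p-MGIR's actual CBCT formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D "volume": an edge-rich half (complex anatomy) next to a flat
# half (uniform intensities), observed through noise.
n = 64
truth = np.concatenate([np.sin(np.linspace(0.0, 6.0, n // 2)), np.ones(n // 2)])
b = truth + rng.normal(scale=0.2, size=n)

# Priori mask: 1 = preserve resolution (weak smoothing),
# 0 = suppress noise (strong smoothing). Assumed known here.
mask = np.concatenate([np.ones(n // 2), np.zeros(n // 2)])
lam = np.where(mask == 1, 0.05, 2.0)   # region-dependent regularization weight

# Alternate between the two sub-problems: each pass takes one gradient
# step of ||x - b||^2 + lam * ||grad x||^2, restricted to one region.
x = b.copy()
for _ in range(300):
    for region in (1, 0):
        idx = mask == region
        lap = np.zeros(n)
        lap[1:-1] = 2.0 * x[1:-1] - x[:-2] - x[2:]   # discrete Laplacian
        grad = (x - b) + lam * lap
        x[idx] -= 0.1 * grad[idx]
```

    The uniform half ends up strongly denoised while the structured half stays close to the data, which is the qualitative behavior the mask is meant to produce.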

  20. Priori mask guided image reconstruction (p-MGIR) for ultra-low dose cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Park, Justin C.; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Kahler, Darren L.; Liu, Chihray; Lu, Bo

    2015-11-01

    Recently, the compressed sensing (CS) based iterative reconstruction method has received attention because of its ability to reconstruct cone beam computed tomography (CBCT) images of good quality from sparsely sampled or noisy projections, thus enabling dose reduction. However, some challenges remain. In particular, there is always a tradeoff between image resolution and noise/streak artifact reduction, governed by the amount of regularization weighting that is applied uniformly across the CBCT volume. The purpose of this study is to develop a novel low-dose CBCT reconstruction algorithm framework, called priori mask guided image reconstruction (p-MGIR), that allows reconstruction of high-quality low-dose CBCT images while preserving image resolution. In p-MGIR, the unknown CBCT volume is mathematically modeled as a combination of two regions: (1) where anatomical structures are complex, and (2) where intensities are relatively uniform. The priori mask, the key concept of the p-MGIR algorithm, is defined as the matrix that distinguishes between the two separate CBCT regions: where the resolution needs to be preserved, and where streaks or noise need to be suppressed. We then alternately update each part of the image by iteratively solving two sub-minimization problems, where one minimization focuses on preserving the edge information of the first part while the other concentrates on removing noise/artifacts from the latter part. To evaluate the performance of the p-MGIR algorithm, a numerical head-and-neck phantom, a Catphan 600 physical phantom, and a clinical head-and-neck cancer case were used for analysis. The results were compared with the standard Feldkamp-Davis-Kress algorithm as well as conventional CS-based algorithms. Examination of the p-MGIR algorithm showed that high-quality low-dose CBCT images can be reconstructed without compromising image resolution. For both the phantom and the patient cases, p-MGIR is able to achieve a clinically

  1. A Novel Principal Component Analysis-Based Acceleration Scheme for LES-ODT: An A Priori Study

    NASA Astrophysics Data System (ADS)

    Echekki, Tarek; Mirgolbabaei, Hessan

    2012-11-01

    A parameterization of the composition space based on principal component analysis (PCA) is proposed to represent the transport equations within the one-dimensional turbulence (ODT) solutions of a hybrid large-eddy simulation (LES) and ODT scheme. An a priori validation of the proposed approach is implemented based on stand-alone ODT solutions of Sandia Flame F, which is characterized by different regimes of combustion, from pilot stabilization through extinction and reignition to self-stabilized combustion. The PCA is carried out with the full thermo-chemical scalars' vector as well as a subset of this vector, made up primarily of the major species and temperature. The results show that the different regimes are reproduced using only three principal components, based on either the full thermo-chemical scalars' vector or the subset. Reproducing the source terms of the principal components represents a greater challenge. It is found that, using the subset of the thermo-chemical scalars' vector, both the minor species and the source terms of the first three principal components are reasonably well predicted.
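
    The dimensionality-reduction step can be sketched with a plain SVD-based PCA on a synthetic stand-in for the thermo-chemical scalars' vector (the data, sizes, and low-rank structure below are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for the thermo-chemical scalars' vector sampled at
# many ODT grid points: 500 samples of 10 scalars lying close to a
# 3-dimensional subspace (mimicking a low-dimensional manifold).
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 10))
scalars = latent @ mixing + 0.01 * rng.normal(size=(500, 10))

# PCA via SVD of the centered data matrix.
centered = scalars - scalars.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
pcs = centered @ Vt[:3].T                         # first three principal components
explained = (s[:3] ** 2).sum() / (s ** 2).sum()   # variance captured by 3 PCs
```

    When the composition space really is close to three-dimensional, as the a priori test suggests for this flame, the three retained components capture essentially all of the variance.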

  2. Enhancing the performance of model-based elastography by incorporating additional a priori information in the modulus image reconstruction process

    NASA Astrophysics Data System (ADS)

    Doyley, Marvin M.; Srinivasan, Seshadri; Dimidenko, Eugene; Soni, Nirmal; Ophir, Jonathan

    2006-01-01

    Model-based elastography is fraught with problems owing to the ill-posed nature of the inverse elasticity problem. To overcome this limitation, we have recently developed a novel inversion scheme that incorporates a priori information concerning the mechanical properties of the underlying tissue structures, and the variance incurred during displacement estimation in the modulus image reconstruction process. The information was procured by employing standard strain imaging methodology, and introduced in the reconstruction process through the generalized Tikhonov approach. In this paper, we report the results of experiments conducted on gelatin phantoms to evaluate the performance of modulus elastograms computed with the generalized Tikhonov (GTK) estimation criterion relative to those computed by employing the un-weighted least-squares estimation criterion, the weighted least-squares estimation criterion and the standard Tikhonov method (i.e., the generalized Tikhonov method with no modulus prior). The results indicate that modulus elastograms computed with the generalized Tikhonov approach had superior elastographic contrast discrimination and contrast recovery. In addition, image reconstruction was more resilient to structural decorrelation noise when additional constraints were imposed on the reconstruction process through the GTK method.
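
    The generalized Tikhonov (GTK) estimate with a prior can be sketched on a toy linear inverse problem; the closed form below is the standard GTK formulation with an identity regularization operator, using a synthetic operator and prior rather than the elastographic model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ill-conditioned linear inverse problem A x = b.
n = 20
A = rng.normal(size=(n, n)) @ np.diag(np.logspace(0, -6, n))
x_true = rng.normal(size=n)
b = A @ x_true + 1e-4 * rng.normal(size=n)

# A priori estimate (standing in for the strain-derived modulus prior)
# and data weights (standing in for inverse displacement variances).
x_prior = x_true + 0.05 * rng.normal(size=n)
W = np.eye(n)

# Generalized Tikhonov: minimize ||A x - b||_W^2 + lam * ||x - x_prior||^2,
# whose normal equations are (A^T W A + lam I) x = A^T W b + lam x_prior.
lam = 1e-3
x_gtk = np.linalg.solve(A.T @ W @ A + lam * np.eye(n), A.T @ W @ b + lam * x_prior)

# Standard Tikhonov (same functional with no prior, i.e. x_prior = 0).
x_tik = np.linalg.solve(A.T @ W @ A + lam * np.eye(n), A.T @ W @ b)
```

    On the poorly observed directions the standard Tikhonov solution is pulled toward zero, while the GTK solution is pulled toward the prior, which is why incorporating a reasonable modulus prior improves contrast recovery.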

  3. A priori testing of subgrid-scale models for the velocity-pressure and vorticity-velocity formulations

    NASA Technical Reports Server (NTRS)

    Winckelmans, G. S.; Lund, T. S.; Carati, D.; Wray, A. A.

    1996-01-01

    Subgrid-scale models for Large Eddy Simulation (LES) in both the velocity-pressure and the vorticity-velocity formulations were evaluated and compared in a priori tests using spectral Direct Numerical Simulation (DNS) databases of isotropic turbulence: 128(exp 3) DNS of forced turbulence (Re(sub(lambda))=95.8) filtered, using the sharp cutoff filter, to both 32(exp 3) and 16(exp 3) synthetic LES fields; 512(exp 3) DNS of decaying turbulence (Re(sub(lambda))=63.5) filtered to both 64(exp 3) and 32(exp 3) LES fields. Gaussian and top-hat filters were also used with the 128(exp 3) database. Different LES models were evaluated for each formulation: eddy-viscosity models, hyper eddy-viscosity models, mixed models, and scale-similarity models. Correlations between exact and modeled subgrid-scale quantities were measured at three levels: tensor (traceless), vector (solenoidal 'force'), and scalar (dissipation), for both uniform and variable coefficient(s). Different choices for the 1/T scaling appearing in the eddy viscosity were also evaluated. It was found that the models for the vorticity-velocity formulation produce higher correlations with the filtered DNS data than their counterparts in the velocity-pressure formulation. It was also found that the hyper eddy-viscosity model performs better than the eddy-viscosity model in both formulations.
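
    At the scalar (dissipation) level, an a priori correlation of this kind reduces to a correlation coefficient between the exact and modeled fields over the filtered DNS box; sketched here on synthetic data standing in for the two fields:

```python
import numpy as np

rng = np.random.default_rng(6)

# Stand-ins for exact vs. modeled subgrid-scale dissipation sampled over
# a 32^3 filtered DNS box (synthetic; a real a priori test would compute
# both fields from the filtered DNS velocities).
exact = rng.normal(size=32 ** 3)
model = 0.6 * exact + 0.8 * rng.normal(size=32 ** 3)   # deliberately imperfect model

# Scalar-level a priori correlation between exact and modeled quantities.
rho = np.corrcoef(exact, model)[0, 1]
```

    Repeating this for the tensor and vector levels simply means correlating the corresponding component-wise quantities instead of the scalar dissipation.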

  4. Wiener filtering of surface EMG with a priori SNR estimation toward myoelectric control for neurological injury patients.

    PubMed

    Liu, Jie; Ying, Dongwen; Zhou, Ping

    2014-12-01

    Voluntary surface electromyogram (EMG) signals from neurological injury patients are often corrupted by involuntary background interference or spikes, imposing difficulties for myoelectric control. We present a novel framework to suppress involuntary background spikes during voluntary surface EMG recordings. The framework applies a Wiener filter to restore voluntary surface EMG signals based on tracking a priori signal to noise ratio (SNR) by using the decision-directed method. Semi-synthetic surface EMG signals contaminated by different levels of involuntary background spikes were constructed from a database of surface EMG recordings in a group of spinal cord injury subjects. After the processing, the onset detection of voluntary muscle activity was significantly improved against involuntary background spikes. The magnitude of voluntary surface EMG signals can also be reliably estimated for myoelectric control purpose. Compared with the previous sample entropy analysis for suppressing involuntary background spikes, the proposed framework is characterized by quick and simple implementation, making it more suitable for application in a myoelectric control system toward neurological injury rehabilitation.
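
    A generic sketch of Wiener filtering with decision-directed a priori SNR tracking, applied to synthetic frame powers; the smoothing constant and initialization are common textbook choices, not necessarily the authors':

```python
import numpy as np

def wiener_decision_directed(noisy_power, noise_power, alpha=0.98):
    """Frame-by-frame Wiener gain with decision-directed a priori SNR
    tracking (a generic sketch of the approach named in the abstract,
    not the authors' exact implementation)."""
    gains = []
    xi_prev, gamma_prev = 1.0, 1.0
    for p in noisy_power:
        gamma = p / noise_power                        # a posteriori SNR
        # Decision-directed a priori SNR estimate: blend the previous
        # filtered estimate with the current instantaneous SNR.
        xi = alpha * (gains[-1] ** 2 * gamma_prev if gains else xi_prev) \
             + (1.0 - alpha) * max(gamma - 1.0, 0.0)
        gains.append(xi / (1.0 + xi))                  # Wiener gain
        gamma_prev = gamma
    return np.array(gains)

# Noise-only frames (power near the noise floor) then a voluntary burst.
frames = np.array([1.0] * 20 + [50.0] * 20)
g = wiener_decision_directed(frames, noise_power=1.0)
```

    The gain collapses toward zero during background-only frames (suppressing involuntary spikes) and recovers quickly once a strong voluntary burst begins, which is the behavior needed for reliable onset detection.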

  5. Almost half of the Danish general practitioners have negative a priori attitudes towards a mandatory accreditation programme.

    PubMed

    Waldorff, Frans Boch; Nicolaisdóttir, Dagný Rós; Kousgaard, Marius Brostrøm; Reventlow, Susanne; Søndergaard, Jens; Thorsen, Thorkil; Andersen, Merethe Kirstine; Pedersen, Line Bjørnskov; Bisgaard, Louise; Hutters, Cecilie Lybeck; Bro, Flemming

    2016-09-01

    The objective of this study was to analyse Danish general practitioners' (GPs) a priori attitudes and expectations towards a nationwide mandatory accreditation programme. This study is based on a nationwide electronic survey comprising all Danish GPs (n = 3,403). A total of 1,906 (56%) GPs completed the questionnaire. In all, 861 (45%) had a negative attitude towards accreditation, whereas 429 (21%) were very positive or positive. Negative attitudes towards accreditation were associated with being older, male and working in a single-handed practice. A regional difference was observed as well. GPs with negative expectations were more likely to agree that accreditation was a tool meant for external control (odds ratio (OR) = 1.87 (95% confidence interval (CI): 1.18-2.95)), less likely to agree that accreditation was a tool for quality improvement (OR = 0.018 (95% CI: 0.013-0.025)), more likely to agree that it would affect job satisfaction negatively (OR = 21.88 (95% CI: 16.10-29.72)), and generally less satisfied with their present job situation (OR = 2.51 (95% CI: 1.85-3.41)). Almost half of the GPs had negative attitudes towards accreditation. The three Research Units for General Practice in Odense, Aarhus and Copenhagen initiated and funded this study. The survey was recommended by the Danish Multipractice Committee (MPU 02-2015) and evaluated by the Danish Data Agency (2015-41-3684).

  6. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    PubMed

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

    Development of reference intervals is difficult, time consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire, and exclusion was based on conditions such as pregnancy, diabetes, and renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECT ci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices, and those deemed unsuitable after clinical evaluation, were removed from the database. Reference intervals were partitioned based on the method of Harris and Boyd into three scenarios: combined gender; males and females; and age and gender. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.
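
    In a direct a priori study, once the health-based exclusions have been applied, the reference interval itself is commonly taken as the central 95% of the observed values. A minimal sketch on synthetic analyte data (the analyte, units, and distribution are invented):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic analyte results from "healthy" volunteers after exclusion
# (arbitrary units); the sample size matches the study's 1856 volunteers.
values = rng.normal(loc=5.0, scale=0.5, size=1856)

# Nonparametric central 95% reference interval (2.5th-97.5th percentiles),
# the usual default when enough reference individuals are available.
lower, upper = np.percentile(values, [2.5, 97.5])
```

    Partitioning in the Harris and Boyd sense amounts to repeating this computation within subgroups (by sex, or by age and sex) and testing whether the subgroup intervals differ enough to warrant reporting separately.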

  7. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all

    PubMed Central

    Antonoglou, Georgios N.; Sándor, George K.; Eliades, Theodore

    2017-01-01

    A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. The aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. From the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remain unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date. PMID:28777820

  8. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all.

    PubMed

    Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore

    2017-01-01

    A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. The aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. From the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remain unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date.

  9. An A Priori Hot-Tearing Indicator Applied to Die-Cast Magnesium-Rare Earth Alloys

    NASA Astrophysics Data System (ADS)

    Easton, Mark A.; Gibson, Mark A.; Zhu, Suming; Abbott, Trevor B.

    2014-07-01

    Hot-tearing susceptibility is an important consideration for alloy design. Based on a review of previous research, an a priori indicator for predicting an alloy's hot-tearing susceptibility is proposed in this article and applied to a range of magnesium-rare earth (RE)-based alloys. The indicator is the integral over the solid fraction/temperature curve between the temperature at which feeding becomes restricted (coherency) and that at which a three-dimensional network of solid is formed (coalescence). The hot-tearing propensity of Mg-RE alloys is found to vary greatly depending on which RE is primarily used, due to the difference in solidification range. Mg-Nd alloys are the most susceptible to hot tearing, followed by Mg-Ce-based alloys, while Mg-La alloys show almost no hot tearing. The proposed indicator correlates well with the hot-tearing propensity of the Mg-RE alloys. It is expected that the indicator could be used to estimate relative hot-tearing propensity in other alloy systems as well.
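
    One plausible reading of the indicator, the integral of the solid-fraction/temperature curve between coherency and coalescence, can be sketched numerically; the fs(T) curve and the solid fractions chosen for coherency and coalescence below are illustrative, not data for any Mg-RE alloy:

```python
import numpy as np

# Hypothetical solid-fraction/temperature curve (fs rises monotonically
# as T falls through the freezing range); the functional form is assumed.
T = np.linspace(640.0, 540.0, 101)                 # degC, 1-degree steps, cooling
fs = 1.0 - np.exp(-(640.0 - T) / 25.0)             # solid fraction vs. T

# Coherency and coalescence picked at assumed solid fractions.
fs_coh, fs_coal = 0.5, 0.95
i_coh = np.searchsorted(fs, fs_coh)
i_coal = np.searchsorted(fs, fs_coal)

# Indicator: trapezoidal integral of fs over T between the two events.
# A larger area corresponds to a wider vulnerable solidification range,
# hence (per the abstract) a higher hot-tearing susceptibility.
indicator = np.sum((fs[i_coh:i_coal] + fs[i_coh + 1:i_coal + 1]) * 0.5)
```

    Comparing this number across alloys (each with its own fs(T) curve) ranks their relative hot-tearing propensity, which is how the abstract orders Mg-Nd above Mg-Ce above Mg-La.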

  10. Tomographic image via background subtraction using an x-ray projection image and a priori computed tomography

    PubMed Central

    Zhang, Jin; Yi, Byongyong; Lasio, Giovanni; Suntharalingam, Mohan; Yu, Cedric

    2009-01-01

    Kilovoltage x-ray projection images (kV images for brevity) are increasingly available in image guided radiotherapy (IGRT) for patient positioning. These images are two-dimensional (2D) projections of a three-dimensional (3D) object along the x-ray beam direction. Projecting a 3D object onto a plane may lead to ambiguities in the identification of anatomical structures and to poor contrast in kV images. Therefore, the use of kV images in IGRT is mainly limited to bony landmark alignments. This work proposes a novel subtraction technique that isolates a slice of interest (SOI) from a kV image with the assistance of a priori information from a previous CT scan. The method separates structural information within a preselected SOI by suppressing contributions to the unprocessed projection from out-of-SOI-plane structures. Up to a five-fold increase in the contrast-to-noise ratios (CNRs) was observed in selected regions of the isolated SOI, when compared to the original unprocessed kV image. The tomographic image via background subtraction (TIBS) technique aims to provide a quick snapshot of the slice of interest with greatly enhanced image contrast over conventional kV x-ray projections for fast and accurate image guidance of radiation therapy. With further refinements, TIBS could, in principle, provide real-time tumor localization using gantry-mounted x-ray imaging systems without the need for implanted markers. PMID:19928074
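
    The core subtraction in TIBS can be illustrated with a toy parallel-beam model, where the "kV image" is an axis sum of a 3-D volume and the prior CT is assumed perfectly registered (both strong simplifications of the real cone-beam geometry):

```python
import numpy as np

# Toy 3-D "CT volume" projected along one axis to make a kV-like image.
vol = np.zeros((8, 16, 16))
vol[3] = 1.0                     # structure in slice 3 (the slice of interest)
vol[6, 4:8, 4:8] = 2.0           # confounding out-of-slice structure

kv = vol.sum(axis=0)             # measured projection (noise-free here)

# TIBS idea: forward-project the prior CT with the SOI removed, then
# subtract that background from the measured projection.
soi = 3
background = np.delete(vol, soi, axis=0).sum(axis=0)
tibs = kv - background           # isolated slice-of-interest image
```

    In this idealized setting the out-of-slice structure cancels exactly; in practice, registration errors and cone-beam geometry make the cancellation approximate, which is why the paper reports contrast-to-noise gains rather than perfect isolation.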

  11. Knowledge Management.

    ERIC Educational Resources Information Center

    1999

    The first of the four papers in this symposium, "Knowledge Management and Knowledge Dissemination" (Wim J. Nijhof), presents two case studies exploring the strategies companies use in sharing and disseminating knowledge and expertise among employees. "A Theory of Knowledge Management" (Richard J. Torraco), develops a conceptual…

  13. Unequal Knowledge.

    ERIC Educational Resources Information Center

    Tilly, Charles

    2003-01-01

    Discusses how the persistence of knowledge inequalities influences higher education. Explores how the control of and access to knowledge affects human well being (i.e., control over production of knowledge, control over its distribution, and access to knowledge by people whose well being it will or could affect). (EV)

  15. LETTER TO THE EDITOR: Essentially all Gaussian two-party quantum states are a priori nonclassical but classically correlated

    NASA Astrophysics Data System (ADS)

    Slater, Paul B.

    2000-08-01

    Duan et al (Duan L-M, Giedke G, Cirac J I and Zoller P 2000 Phys. Rev. Lett. 84 2722) and, independently, Simon (Simon R 2000 Phys. Rev. Lett. 84 2726) have recently found necessary and sufficient conditions for the separability (classical correlation) of the Gaussian two-party (continuous variable) states. Duan et al remark that their criterion is based on a `much stronger bound' on the total variance of a pair of Einstein-Podolsky-Rosen-type operators than is required simply by the uncertainty relation. Here, we seek to formalize and test this particular assertion in both classical and quantum-theoretic frameworks. We first attach to these states the classical a priori probability (Jeffreys' prior), proportional to the volume element of the Fisher information metric on the Riemannian manifold of Gaussian (quadrivariate normal) probability distributions. Then, numerical evidence indicates that more than 99% of the Gaussian two-party states do, in fact, meet the more stringent criterion for separability. We collaterally note that the prior probability assigned to the classical states, that is those having positive Glauber-Sudarshan P-representations, is less than 0.001%. We, then, seek to attach as a measure to the Gaussian two-party states the volume element of the associated (quantum-theoretic) Bures (minimal monotone) metric. Our several extensive analyses, then, persistently yield probabilities of separability and classicality that are, to very high orders of accuracy, unity and zero, respectively, so the two quite distinct (classical and quantum-theoretic) forms of analysis are rather remarkably consistent in their findings.

  16. High-resolution teleseismic tomography of upper-mantle structure using an a priori three-dimensional crustal model

    NASA Astrophysics Data System (ADS)

    Waldhauser, Felix; Lippitsch, Regina; Kissling, Edi; Ansorge, Jörg

    2002-08-01

    The effect of an a priori known 3-D crustal model in teleseismic tomography of upper-mantle structure is investigated. We developed a 3-D crustal P-wave velocity model for the greater Alpine region, encompassing the central and western Alps and the northern Apennines, to estimate the crustal contribution to teleseismic traveltimes. The model is constructed by comparative use of published information from active and passive seismic surveys. The model components are chosen to represent the present large-scale Alpine crustal structure and for their significant effect on the propagation of seismic wavefields. They are first-order structures such as the crust-mantle boundary, sedimentary basins and the high-velocity Ivrea body. Teleseismic traveltime residuals are calculated for a realistic distribution of azimuths and distances by coupling a finite-difference technique to the IASP91 traveltime tables. Residuals are produced for a synthetic upper-mantle model featuring two slab structures and the 3-D crustal model on top of it. The crustal model produces traveltime residuals in the range between -0.7 and 1.5 s that vary strongly as a function of backazimuth and epicentral distance. We find that the non-linear inversion of the synthetic residuals without correcting for the 3-D crustal structure erroneously maps the crustal anomalies into the upper mantle. Correction of the residuals for crustal structure before inversion properly recovers the synthetic slab structures placed in the upper mantle. We conclude that with the increasing amount of high-quality seismic traveltime data, correction for near-surface structure is essential for increasing resolution in tomographic images of upper-mantle structure.

  17. A fast 3D surface reconstruction and volume estimation method for grain storage based on priori model

    NASA Astrophysics Data System (ADS)

    Liang, Xian-hua; Sun, Wei-dong

    2011-06-01

    Inventory checking is one of the most significant tasks for grain reserves, and plays a very important role in the macro-control of food supplies and food security. A simple, fast and accurate method is needed to obtain internal structure information and, from it, to estimate the volume of the grain in storage. In our system, a specially designed multi-site laser scanning system acquires range data clouds of the internal structure of the grain storage. Because the range data are severely unevenly distributed, they are first preprocessed by an adaptive re-sampling method to reduce data redundancy as well as noise. The range data are then segmented and useful features, such as plane and cylinder information, are extracted. With these features, a coarse registration between all of the single-site range data is performed, followed by an Iterative Closest Point (ICP) algorithm for fine registration. Taking advantage of the fact that grain storage structures are well defined and of limited types, a fast automatic registration method based on a priori models is proposed to register the multi-site range data more efficiently. After integration of the multi-site range data, the grain surface is finally reconstructed by a Delaunay-based algorithm and the grain volume is estimated by numerical integration. The proposed method has been applied to two common types of grain storage; experimental results show that it is effective and accurate, and that it avoids the cumulative error of pair-wise registration of overlapped areas.
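
    The fine-registration stage can be sketched as a bare-bones 2-D ICP loop (nearest-neighbour matching plus closed-form Kabsch alignment); this omits the adaptive re-sampling, feature-based coarse registration, and a priori model matching described above:

```python
import numpy as np

def icp_step(src, dst):
    """One Iterative Closest Point iteration: nearest-neighbour matching
    followed by the closed-form rigid alignment (Kabsch/SVD). A minimal
    2-D sketch, not the paper's full coarse-to-fine pipeline."""
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    matched = dst[d2.argmin(axis=1)]          # closest-point correspondences
    mu_s, mu_d = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_d)     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t

rng = np.random.default_rng(5)
dst = rng.normal(size=(100, 2))               # reference scan
theta = 0.1                                   # small residual misalignment
R0 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
src = dst @ R0.T + np.array([0.05, -0.02])    # misaligned second scan

err0 = np.linalg.norm(src - dst, axis=1).mean()
for _ in range(20):
    src = icp_step(src, dst)
err = np.linalg.norm(src - dst, axis=1).mean()
```

    ICP only converges reliably from a small initial misalignment, which is exactly why the paper performs a feature-based coarse registration before this step.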

  18. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    PubMed

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.

  19. Local digital control of power electronic converters in a dc microgrid based on a-priori derivation of switching surfaces

    NASA Astrophysics Data System (ADS)

    Banerjee, Bibaswan

    In power-electronic-based microgrids, the computational requirements needed to implement an optimized online control strategy can be prohibitive. The work presented in this dissertation proposes a generalized method of deriving geometric manifolds in a dc microgrid that is based on the a priori computation of the optimal reactions and trajectories for classes of events in the microgrid. The proposed states are the stored energies in all the energy storage elements of the dc microgrid and the power flowing into them. It is anticipated that calculating a large enough set of dissimilar transient scenarios will also span many scenarios not specifically used to develop the surface. These geometric manifolds are then used as reference surfaces in any type of controller, such as a sliding-mode hysteretic controller. The presence of switched power converters in microgrids requires different control actions for different system events. Control of the switch states of the converters is essential for steady-state and transient operation. A digital memory look-up based controller that uses a hysteretic sliding-mode control strategy is an effective technique to generate the proper switch states for the converters. An example dc microgrid with three dc-dc boost converters and resistive loads is considered in this work. The geometric manifolds are successfully generated for transient events, such as step changes in the loads and the sources. The surfaces corresponding to a specific case of step change in the loads are then used as reference surfaces in an EEPROM for experimental validation of the control strategy. The required switch states corresponding to this transient scenario are programmed into the EEPROM as a memory table. This controls the switching of the dc-dc boost converters and drives the system states to the reference manifold.
In this work, it is shown that this strategy effectively controls the system for a transient condition such as step changes
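The memory look-up hysteretic control idea can be sketched as follows (the state grid, the switching-surface values, and the hysteresis band are invented for illustration; a real implementation would store the table in EEPROM and run on the converter's digital controller):

```python
import numpy as np

class SurfaceController:
    """Look-up-table hysteretic controller: the precomputed switching
    surface sigma(e, p) is stored on a grid of (stored energy, power);
    the switch command follows the sign of sigma with a hysteresis band."""

    def __init__(self, energy_grid, power_grid, surface, band=0.05):
        self.e_grid, self.p_grid = energy_grid, power_grid
        self.surface = surface      # sigma sampled on the grid (2-D array)
        self.band = band            # hysteresis half-width around sigma = 0
        self.state = 0              # last switch command

    def switch_state(self, energy, power):
        """Quantize the measured state, look up sigma, apply hysteresis."""
        i = np.abs(self.e_grid - energy).argmin()
        j = np.abs(self.p_grid - power).argmin()
        sigma = self.surface[i, j]
        if sigma > self.band:
            self.state = 1
        elif sigma < -self.band:
            self.state = 0
        return self.state           # inside the band: hold the last state
```

The hysteresis band plays the same role as in any sliding-mode implementation: it bounds the switching frequency while the trajectory chatters about the stored reference surface.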

  20. Visual Knowledge.

    ERIC Educational Resources Information Center

    Chipman, Susan F.

    Visual knowledge is an enormously important part of our total knowledge. The psychological study of learning and knowledge has focused almost exclusively on verbal materials. Today, the advance of technology is making the use of visual communication increasingly feasible and popular. However, this enthusiasm involves the illusion that visual…

  1. Preserving Knowledge

    ERIC Educational Resources Information Center

    Taintor, Spence

    2008-01-01

    Every year, teachers leave the profession and take valuable experience and knowledge with them. An increasing retirement rate makes schools vulnerable to a significant loss of knowledge. This article describes how implementing a knowledge management process will ensure that valuable assets are captured and shared. (Contains 3 online resources.)

  2. Knowledge Management

    NASA Technical Reports Server (NTRS)

    Shariq, Syed Z.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The emergence of rapidly expanding technologies for the distribution and dissemination of information and knowledge has brought into focus the opportunities for the development of knowledge-based networks, knowledge dissemination and knowledge management technologies, and their potential applications for enhancing the productivity of knowledge work. The challenging and complex problems of the future can best be addressed by developing knowledge management as a new discipline based on an integrative synthesis of the hard and soft sciences. A knowledge management professional society can provide a framework for catalyzing the development of the proposed synthesis, as well as serve as a focal point for the coordination of professional activities in the strategic areas of education, research and technology development. Preliminary concepts for the development of the knowledge management discipline and the professional society are explored. Within this context, potential opportunities can be explored for applying information technologies to more effectively deliver or transfer information and knowledge (e.g., resulting from NASA's Mission to Planet Earth) for the development of policy options in critical areas of national and global importance (e.g., policy decisions in economic and environmental areas), particularly for those policy areas where a global collaborative knowledge network is likely to be critical to the acceptance of the policies.

  4. Insights into organogelation and its kinetics from Hansen solubility parameters. Toward a priori predictions of molecular gelation.

    PubMed

    Diehn, Kevin K; Oh, Hyuntaek; Hashemipour, Reza; Weiss, Richard G; Raghavan, Srinivasa R

    2014-04-21

    Many small molecules can self-assemble by non-covalent interactions into fibrous networks and thereby induce gelation of organic liquids. However, no capability currently exists to predict whether a molecule in a given solvent will form a gel, a low-viscosity solution (sol), or an insoluble precipitate. Gelation has been recognized as a phenomenon that reflects a balance between solubility and insolubility; however, the distinction between these regimes has not been quantified in a systematic fashion. In this work, we focus on a well-known gelator, 1,3:2,4-dibenzylidene sorbitol (DBS), and study its self-assembly in various solvents. From these data, we build a framework for DBS gelation based on Hansen solubility parameters (HSPs). While the HSPs for DBS are not known a priori, the HSPs are available for each solvent, and they quantify the solvent's ability to interact via dispersion, dipole-dipole, and hydrogen-bonding interactions. Using the three HSPs, we construct three-dimensional plots showing regions of solubility (S), slow gelation (SG), instant gelation (IG), and insolubility (I) for DBS in the different solvents at a given temperature and concentration. Our principal finding is that the above regions radiate out as concentric shells: a central solubility (S) sphere, followed in order by shells corresponding to the SG, IG, and I regions. The distance (R0) of a solvent from the origin of the central sphere quantifies the incompatibility between DBS and that solvent: the larger this distance, the more incompatible the pair. The elastic modulus of the final gel increases with R0, while the time required for a super-saturated sol to form a gel decreases with R0. Importantly, if R0 is too small, the gels are weak, but if R0 is too large, insolubility occurs; thus, strong gels fall within an optimal window of incompatibility between the gelator and the solvent. Our approach can be used to design organogels of desired strength and gelation time by judicious choice of a
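The concentric-shell picture can be sketched numerically with the standard Hansen distance (note the conventional factor of 4 on the dispersion term); the gelator-sphere centre and shell radii below are invented placeholders, not the fitted DBS values from the paper:

```python
import numpy as np

# Hypothetical HSP-space centre for the gelator (dispersion, polar,
# hydrogen-bonding components, MPa^0.5) and illustrative shell radii.
GELATOR_CENTRE = np.array([18.0, 8.0, 12.0])
SHELLS = [(4.0, "S"), (7.0, "SG"), (10.0, "IG")]  # beyond the last -> "I"

def hsp_distance(solvent, centre=GELATOR_CENTRE):
    """Hansen-style distance; the factor 4 on the dispersion term
    is the conventional weighting in HSP theory."""
    dd, dp, dh = solvent - centre
    return np.sqrt(4 * dd**2 + dp**2 + dh**2)

def classify(solvent):
    """Map a solvent's distance R0 onto the concentric S/SG/IG/I regions."""
    r = hsp_distance(np.asarray(solvent, float))
    for radius, label in SHELLS:
        if r <= radius:
            return label
    return "I"
```

With real fitted radii, the same few lines give the "optimal window" reading directly: gel strength grows with R0 inside the IG shell and collapses to insolubility once R0 exceeds the outermost radius.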

  5. The fetal fraction of cell-free DNA in maternal plasma is not affected by a priori risk of fetal trisomy

    PubMed Central

    2013-01-01

    Objective: To determine the relationship between a priori risk for fetal trisomy and the fraction of fetal cell-free DNA (cfDNA) in maternal blood. Methods: A comparative analysis on fetal cfDNA amounts was performed in subjects stratified into a priori risk groups based on maternal age, prenatal screening results, or nuchal translucency measurement. Results: Across the highest and lowest deciles within each group, there were no significant differences in the fetal cfDNA fraction. Conclusions: These data support the concept that non-invasive prenatal test performance as determined by fetal cfDNA fraction is not predicted to be different based on patient risk classification. PMID:22913322

  6. Heavy metal and polycyclic aromatic hydrocarbon concentrations in Quercus ilex L. leaves fit an a priori subdivision in site typologies based on human management.

    PubMed

    De Nicola, Flavia; Baldantoni, Daniela; Maisto, Giulia; Alfani, Anna

    2017-05-01

    Concentrations of four heavy metals (HMs) (Cd, Cr, Fe, Pb) and four polycyclic aromatic hydrocarbons (PAHs) (fluoranthene, phenanthrene, chrysene, benzo[a]pyrene) in Quercus ilex L. leaves collected in the Campania Region (Southern Italy) in previous air biomonitoring studies were employed to (1) test the correspondence with an a priori site subdivision (remote, periurban, and urban) and (2) evaluate long-term trends of HM (approximately 20 years) and PAH (approximately 10 years) air contamination. Overall, Q. ilex leaf HM and PAH concentrations followed the gradient remote < periurban < urban sites, reflecting the a priori subdivision based on human management. Over the long term, although leaf Pb, chrysene, fluoranthene, and phenanthrene concentrations clearly decreased at the urban sites, a high contamination level persists.

  7. [An a priori risk analysis study. Securisation of transfusion of blood product in a hospital: from the reception in the medical unit to its administration].

    PubMed

    Bertrand, E; Lévy, R; Boyeldieu, D

    2013-12-01

    Following an ABO accident after transfusion of red blood cells, an a priori risk analysis study was performed in a hospital. The analysis covers the path of the blood product from its reception in the medical unit to its administration. The risk analysis makes it possible to identify potentially dangerous situations and to evaluate the risks, in order to propose corrective measures (precautionary or protective) and bring the system back to an acceptable risk level. The innovative use of an a priori risk analysis in the medical field allows this analysis of transfusion risk to be extended to other hospitals, and the approach itself to be extended to other medical fields.

  8. Bonner sphere measurements of 241Am-B and 241Am-F neutron energy spectra unfolded using high-resolution a priori data.

    PubMed

    Roberts, N J; Jones, L N; Liu, Z Z; Tagziria, H; Thomas, D J

    2014-10-01

    High-resolution neutron energy spectra, covering the entire energy range of interest, for two standard radionuclide neutron sources ((241)Am-B and (241)Am-F) have been derived from Bonner sphere measurements by using high-resolution a priori data in the unfolding process. In each case, two a priori spectra were used, one from a two-stage calculation and also one from a combination of the calculated spectrum with a high-resolution measured spectrum. The unfolded spectra are compared with those published elsewhere and show significant differences from the ISO- and IAEA-recommended spectra for (241)Am-B and (241)Am-F, respectively. Values for the fluence-average energy and fluence-to-dose-equivalent conversion coefficients are presented for the new spectra, and the implications of the new spectra for the emission rates of the sources when measured by the manganese bath technique are also determined. © Crown copyright 2013.
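The way the a priori spectrum enters the unfolding can be sketched with a multiplicative, EM-style update (in the spirit of the GRAVEL/SAND-II family, but not the specific code used for these measurements; the toy response matrix below is invented for illustration):

```python
import numpy as np

def unfold(R, counts, phi_prior, iters=500):
    """Multiplicative (EM-style) unfolding: start from the a priori
    spectrum phi_prior and iteratively rescale each energy bin so the
    folded spectrum R @ phi reproduces the measured sphere counts.
    Features of the a priori that the spheres cannot resolve are
    preserved, which is why its quality matters so much."""
    phi = phi_prior.astype(float).copy()
    norm = R.sum(axis=0)                 # column sums, R^T 1
    for _ in range(iters):
        predicted = R @ phi              # folded counts for current guess
        phi *= (R.T @ (counts / predicted)) / norm
    return phi
```

Because the update is multiplicative, bins that the Bonner-sphere responses barely constrain stay close to the a priori shape, so a high-resolution a priori carries its structure into the result.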

  9. A gridded version of the US EPA inventory of methane emissions for use as a priori and reference in methane source inversions

    NASA Astrophysics Data System (ADS)

    Maasakkers, J. D.; Jacob, D. J.; Payer Sulprizio, M.; Turner, A. J.; Weitz, M.; Wirth, T. C.; Hight, C.; DeFigueiredo, M.; Desai, M.; Schmeltz, R.; Hockstad, L.; Bloom, A. A.; Bowman, K. W.

    2015-12-01

    The US EPA produces annual estimates of national anthropogenic methane emissions in the Inventory of US Greenhouse Gas Emissions and Sinks (EPA inventory). These are reported to the UN and inform national climate policy. The EPA inventory uses best available information on emitting processes (IPCC Tier 2/3 approaches). However, inversions of atmospheric observations suggest that the inventory could be too low. These inversions rely on crude bottom-up estimates as a priori because the EPA inventory is only available as national totals for most sources. Reliance on an incorrect a priori greatly limits the value of inversions for testing and improving the EPA inventory, as the allocation of methane emissions by source type and region can vary greatly between different bottom-up inventories. Here we present a 0.1° × 0.1° monthly version of the EPA inventory to serve as the a priori for inversions of atmospheric data and to interpret inversion results. We use a wide range of process-specific information to allocate emissions, incorporating facility-level data reported through the EPA Greenhouse Gas Reporting Program where possible. As an illustration of the gridding strategies used, gridded livestock emissions are based on EPA emission data per state, USDA livestock inventories per county, and USDA weighted land cover maps for sub-county localization. Allocation of emissions from natural gas systems incorporates monthly well-level production data, EIA compressor station and processing plant databases, and information on pipelines. Our gridded EPA inventory shows large differences in spatial emission patterns compared to the EDGAR v4.2 global inventory used as the a priori in previous inverse studies. Our work greatly enhances the potential of future inversions to test and improve the EPA inventory and, more broadly, to improve understanding of the factors controlling methane concentrations and their trends. Preliminary inversion results using GOSAT satellite data will be presented.
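The basic downscaling step — a regional emission total spread over grid cells in proportion to a spatial proxy such as county livestock counts — can be sketched as follows; the region names and numbers are placeholders, not EPA data:

```python
import numpy as np

def allocate(total_emission, weights):
    """Distribute a regional emission total over grid cells in
    proportion to a spatial proxy (e.g. livestock counts mapped onto
    the grid); cells with zero weight receive nothing, and the sum
    over cells conserves the regional total exactly."""
    w = np.asarray(weights, float)
    return total_emission * w / w.sum()

# Placeholder state totals (Gg CH4/yr) and per-cell proxy weights.
state_totals = {"A": 120.0, "B": 45.0}
proxies = {"A": [5.0, 1.0, 4.0], "B": [2.0, 2.0]}

gridded = {s: allocate(state_totals[s], proxies[s]) for s in state_totals}
```

Mass conservation per region is the key invariant: the gridded product must still sum back to the reported national/state totals, whatever proxy is used for the spatial pattern.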

  10. Comparative analysis of a-priori and a-posteriori dietary patterns using state-of-the-art classification algorithms: a case/case-control study.

    PubMed

    Kastorini, Christina-Maria; Papadakis, George; Milionis, Haralampos J; Kalantzi, Kallirroi; Puddu, Paolo-Emilio; Nikolaou, Vassilios; Vemmos, Konstantinos N; Goudevenos, John A; Panagiotakos, Demosthenes B

    2013-11-01

    To compare the accuracy of a-priori and a-posteriori dietary patterns in the prediction of acute coronary syndrome (ACS) and ischemic stroke. This is actually the first study to employ state-of-the-art classification methods for this purpose. During 2009-2010, 1000 participants were enrolled; 250 consecutive patients with a first ACS and 250 controls (60±12 years, 83% males), as well as 250 consecutive patients with a first stroke and 250 controls (75±9 years, 56% males). The controls were population-based and age-sex matched to the patients. The a-priori dietary patterns were derived from the validated MedDietScore, whereas the a-posteriori ones were extracted from principal components analysis. Both approaches were modeled using six classification algorithms: multiple logistic regression (MLR), naïve Bayes, decision trees, repeated incremental pruning to produce error reduction (RIPPER), artificial neural networks and support vector machines. The classification accuracy of the resulting models was evaluated using the C-statistic. For the ACS prediction, the C-statistic varied from 0.587 (RIPPER) to 0.807 (MLR) for the a-priori analysis, while for the a-posteriori one, it fluctuated between 0.583 (RIPPER) and 0.827 (MLR). For the stroke prediction, the C-statistic varied from 0.637 (RIPPER) to 0.767 (MLR) for the a-priori analysis, and from 0.617 (decision tree) to 0.780 (MLR) for the a-posteriori. Both dietary pattern approaches achieved equivalent classification accuracy over most classification algorithms. The choice, therefore, depends on the application at hand. Copyright © 2013 Elsevier B.V. All rights reserved.
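The C-statistic used to rank the classifiers is the area under the ROC curve, which can be computed directly from score ranks; this is a generic sketch, independent of the six models in the study:

```python
import numpy as np

def c_statistic(y_true, scores):
    """C-statistic (ROC AUC) via the Mann-Whitney formulation: the
    probability that a randomly chosen case receives a higher score
    than a randomly chosen control (ties count one half)."""
    y = np.asarray(y_true)
    s = np.asarray(scores, float)
    pos, neg = s[y == 1], s[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

Applied to held-out predicted probabilities from any of the six algorithms, this yields values directly comparable to the 0.583-0.827 range reported above.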

  11. Seismicity patterns along the Ecuadorian subduction zone: new constraints from earthquake location in a 3-D a priori velocity model

    NASA Astrophysics Data System (ADS)

    Font, Yvonne; Segovia, Monica; Vaca, Sandro; Theunissen, Thomas

    2013-04-01

    To improve earthquake location, we create a 3-D a priori P-wave velocity model (3-DVM) that approximates the large velocity variations of the Ecuadorian subduction system. The 3-DVM is constructed from the integration of geophysical and geological data that depend on the structural geometry and velocity properties of the crust and the upper mantle. In addition, specific station selection is carried out to compensate for the high station density on the Andean Chain. 3-D synthetic experiments are then designed to evaluate the network capacity to recover the event position using only P arrivals and the MAXI technique. Three synthetic earthquake location experiments are proposed: (1) noise-free and (2) noisy arrivals used in the 3-DVM, and (3) noise-free arrivals used in a 1-DVM. Synthetic results indicate that, under the best conditions (exact arrival data set and 3-DVM), the spatiotemporal configuration of the Ecuadorian network can accurately locate 70 per cent of events in the frontal part of the subduction zone (average azimuthal gap is 289° ± 44°). With noisy P arrivals (up to ±0.3 s), 50 per cent of earthquakes can still be accurately located. Processing earthquake location within a 1-DVM almost never yields accurate hypocentre positions for offshore earthquakes (15 per cent), which highlights the importance of using a 3-DVM in subduction zones. For the application to real data, the seismicity distribution from the 3-D-MAXI catalogue is also compared to the determinations obtained in a 1-D layered VM. In addition to good-quality location uncertainties, the clustering and the depth distribution confirm the reliability of the 3-D-MAXI catalogue. The pattern of the seismicity distribution (a 13 yr record during the inter-seismic period of the seismic cycle) is compared to the pattern of rupture zones and asperities of the Mw = 7.9 1942 and the Mw = 7.7 1958 events (the Mw = 8.8 1906 asperity patch is not defined). We observe that the nucleation of the 1942, 1958 and 1906 events coincides with

  12. Improvement of Tidal Analysis Results by a Priori Rain Fall Modelling at the Vienna and Membach stations

    NASA Astrophysics Data System (ADS)

    Meurers, B.; van Camp, M.; Petermans, T.

    2005-12-01

    We investigate how far tidal analysis results can be improved when a rainfall admittance model is applied to the superconducting gravimeter (SG) data. For that purpose, both the Vienna and Membach data have been analysed with and without a priori rainfall correction. In Membach the residual drop for most events (80%) can be explained by the rain water load, while in Vienna only 50% of all events fit the model in detail. In the other cases the Newtonian effect of vertical air mass redistribution (vertical density variation without air pressure change), predominantly connected with strong vertical convection activity, e.g. thunderstorms, plays an essential role: short-term atmospheric signals show steep gravity residual decreases of a few nm s-2 within 10-60 min, well correlated with outdoor air temperature in most cases. However, even in those cases the water load model is able to explain the dominant part of the residual drop, especially during heavy rainfall. In Vienna more than 110 events have been detected over 10 years; 84% of them are associated with heavy rain starting at, or up to 10 min later than, the residual drop, while the rest (16%) show no or only little rainfall. The magnitude of the gravity drop depends on the total amount of rainfall accumulated during the meteorological event. Step-like signals deteriorate frequency spectrum estimates; this even holds for tidal analysis. As the drops are of physical origin, they should not be eliminated blindly but corrected using water load modelling constrained by high-temporal-resolution (1 min) rain data. 3-D modelling of the water mass load due to a rain event is based on the following assumptions: (1) Rain water intrudes into the uppermost soil layer (close to the topography surface) and remains there at least until the rain has stopped. This is justified for a period of some hours after the rainfall, as evapotranspiration is not yet effective.
(2) There is no run-off except from sealed areas and building roofs, where water can
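The order of magnitude of the rain-induced drops can be checked with a simple infinite-slab (Bouguer) approximation for a thin water layer above the gravimeter; this is a back-of-the-envelope sketch, whereas the study performs full 3-D load modelling on the real topography:

```python
import math

# Gravity effect of a thin sheet of rain water, infinite-slab
# (Bouguer) approximation: delta_g = 2 * pi * G * rho * h.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0   # density of water, kg m^-3

def rain_gravity_nms2(rain_mm):
    """Gravity change in nm s^-2 for a water layer of rain_mm millimetres."""
    h = rain_mm * 1e-3                              # mm -> m
    return 2 * math.pi * G * RHO_WATER * h * 1e9    # m s^-2 -> nm s^-2
```

For 10 mm of rain this gives about 4.2 nm s-2, consistent with the few-nm s-2 residual drops reported above; the sign depends on whether the load sits above or below the sensor.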

  13. Knowledge Management

    ERIC Educational Resources Information Center

    Deepak

    2005-01-01

    Knowledge Management (KM) is the process through which organizations generate value from their intellectual and knowledge-based assets. Frequently generating value from such assets means sharing them among employees, divisions and even with other companies in order to develop best practices. This article discusses three basic aspects of…

  14. A priori estimate of the quality of a data compression system based on statistical characteristics of the sensors used

    NASA Technical Reports Server (NTRS)

    Khodarev, Y. K.; Yevdokimov, V. P.; Pokras, V. M.

    1974-01-01

    Knowledge of the composition and certain statistical characteristics of the output signals of the scientific instruments on board a space vehicle makes it possible to determine suitable signal processing algorithms for the system and to evaluate the prospects for data reduction. A method is presented for estimating the compression factor of an on-board data collection and processing system when the mean activities of the sensors are known but the mutual correlation of their output signals is unknown.
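A minimal illustration of such an a priori estimate, under the assumed, simplified model that each sensor's samples are transmitted only when significant, so the expected output rate is the activity-weighted input rate (the formula and numbers are illustrative, not those of the paper):

```python
def compression_factor(rates, activities):
    """A priori compression factor K = total input rate / expected
    output rate, assuming each sensor's samples are transmitted with
    probability equal to its mean activity (fraction of samples that
    actually change and must be sent)."""
    total_in = sum(rates)
    total_out = sum(r * a for r, a in zip(rates, activities))
    return total_in / total_out

# Three hypothetical sensors: sample rates (samples/s) and mean activities.
K = compression_factor([100.0, 50.0, 10.0], [0.05, 0.2, 1.0])
```

Slowly varying sensors (low activity) dominate the achievable gain, which is why the per-sensor statistics matter even before any mutual correlations are considered.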

  15. Knowledge representation system for assembly using robots

    NASA Technical Reports Server (NTRS)

    Jain, A.; Donath, M.

    1987-01-01

    Assembly robots combine the benefits of speed and accuracy with the capability of adapting to changes in the work environment. However, an impediment to the use of robots is the complexity of the man-machine interface. This interface can be improved by providing a means of using a priori knowledge and reasoning capabilities for controlling and monitoring the tasks performed by robots. Robots ought to be able to perform complex assembly tasks with the help of only supervisory guidance from human operators. For such supervisory guidance, it is important to express commands in terms of the effects desired, rather than in terms of the motions the robot must undertake to achieve those effects. A suitable knowledge representation can facilitate the conversion of task-level descriptions into explicit instructions to the robot. Such a system would use symbolic relationships describing the a priori information about the robot, its environment, and the tasks specified by the operator to generate the commands for the robot.

  16. Using models to guide field experiments: a priori predictions for the CO 2 response of a nutrient- and water-limited native Eucalypt woodland

    SciTech Connect

    Medlyn, Belinda E.; De Kauwe, Martin G.; Zaehle, Sönke; Walker, Anthony P.; Duursma, Remko A.; Luus, Kristina; Mishurov, Mikhail; Pak, Bernard; Smith, Benjamin; Wang, Ying-Ping; Yang, Xiaojuan; Crous, Kristine Y.; Drake, John E.; Gimeno, Teresa E.; Macdonald, Catriona A.; Norby, Richard J.; Power, Sally A.; Tjoelker, Mark G.; Ellsworth, David S.

    2016-05-09

    One major uncertainty in Earth System models is the response of terrestrial ecosystems to rising atmospheric CO2 concentration (Ca), particularly under nutrient-limited conditions. The Eucalyptus Free-Air CO2 Enrichment (EucFACE) experiment, recently established in a nutrient- and water-limited woodland, presents a unique opportunity to address this uncertainty, but can best do so if key model uncertainties have been identified in advance. We applied seven vegetation models, which have previously been comprehensively assessed against earlier forest FACE experiments, to simulate a priori possible outcomes from EucFACE. Our goals were to provide quantitative projections against which to evaluate data as they are collected, and to identify key measurements that should be made in the experiment to allow discrimination among alternative model assumptions in a post-experiment model intercomparison. Simulated responses of annual net primary productivity (NPP) to elevated Ca ranged from 0.5 to 25% across models. The simulated reduction of NPP during a low-rainfall year also varied widely, from 24 to 70%. Key processes where assumptions caused disagreement among models included nutrient limitations to growth; feedbacks to nutrient uptake; autotrophic respiration; and the impact of low soil moisture availability on plant processes. Knowledge of the causes of variation among models is now guiding data collection in the experiment, with the expectation that the experimental data can optimally inform future model improvements.

  18. Prediction of extinction and reignition in nonpremixed turbulent flames using a flamelet/progress variable model. 1. A priori study and presumed PDF closure

    SciTech Connect

    Ihme, Matthias; Pitsch, Heinz

    2008-10-15

    Previously conducted studies of the flamelet/progress variable model for the prediction of nonpremixed turbulent combustion processes identified two areas for model improvements: the modeling of the presumed probability density function (PDF) for the reaction progress parameter and the consideration of unsteady effects [Ihme et al., Proc. Combust. Inst. 30 (2005) 793]. These effects are of particular importance during local flame extinction and subsequent reignition. Here, the models for the presumed PDFs for conserved and reactive scalars are re-examined and a statistically most likely distribution (SMLD) is employed and tested in a priori studies using direct numerical simulation (DNS) data and experimental results from the Sandia flame series. In the first part of the paper, the SMLD model is employed for a reactive scalar distribution. Modeling aspects of the a priori PDF, accounting for the bias in composition space, are discussed. The convergence of the SMLD with increasing number of enforced moments is demonstrated. It is concluded that information about more than two moments is beneficial to accurately represent the reactive scalar distribution in turbulent flames with strong extinction and reignition. In addition to the reactive scalar analysis, the potential of the SMLD for the representation of conserved scalar distributions is also analyzed. In the a priori study using DNS data it is found that the conventionally employed beta distribution provides a better representation for the scalar distribution. This is attributed to the fact that the beta-PDF implicitly enforces higher moment information that is in excellent agreement with the DNS data. However, the SMLD outperforms the beta distribution in free shear flow applications, which are typically characterized by strongly skewed scalar distributions, in the case where higher moment information can be enforced. (author)

  19. Developing framework to constrain the geometry of the seismic rupture plane on subduction interfaces a priori - A probabilistic approach

    USGS Publications Warehouse

    Hayes, G.P.; Wald, D.J.

    2009-01-01

    A key step in many earthquake source inversions requires knowledge of the geometry of the fault surface on which the earthquake occurred. Our knowledge of this surface is often uncertain, however, and as a result fault geometry misinterpretation can map into significant error in the final temporal and spatial slip patterns of these inversions. Relying solely on an initial hypocentre and CMT mechanism can be problematic when establishing rupture characteristics needed for rapid tsunami and ground shaking estimates. Here, we attempt to improve the quality of fast finite-fault inversion results by combining several independent and complementary data sets to more accurately constrain the geometry of the seismic rupture plane of subducting slabs. Unlike previous analyses aimed at defining the general form of the plate interface, we require mechanisms and locations of the seismicity considered in our inversions to be consistent with their occurrence on the plate interface, by limiting events to those with well-constrained depths and with CMT solutions indicative of shallow-dip thrust faulting. We construct probability density functions about each location based on formal assumptions of their depth uncertainty and use these constraints to solve for the ‘most-likely’ fault plane. Examples are shown for the trench in the source region of the Mw 8.6 Southern Sumatra earthquake of March 2005, and for the Northern Chile Trench in the source region of the November 2007 Antofagasta earthquake. We also show examples using only the historic catalogues in regions without recent great earthquakes, such as the Japan and Kamchatka Trenches. In most cases, this method produces a fault plane that is more consistent with all of the data available than is the plane implied by the initial hypocentre and CMT mechanism. Using the aggregated data sets, we have developed an algorithm to rapidly determine more accurate initial fault plane geometries for source inversions of future
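The "most-likely" plane idea can be illustrated with a toy version: give each interface event a Gaussian depth PDF and score candidate planes by their joint log-likelihood, searching over dip. This sketch uses hypothetical event data and a simple planar interface (depth = distance × tan(dip)); it is not the authors' implementation:

```python
import math

# Hypothetical interface seismicity: (distance from trench in km,
# catalog depth in km, formal depth uncertainty in km).
events = [(30.0, 8.2, 2.0), (60.0, 15.8, 2.5),
          (90.0, 24.5, 3.0), (120.0, 31.9, 3.5)]

def log_likelihood(dip_deg):
    """Joint log-likelihood that every event lies on a plane of this
    dip, given each event's Gaussian depth PDF."""
    t = math.tan(math.radians(dip_deg))
    ll = 0.0
    for x, z, sd in events:
        ll += -0.5 * ((z - x * t) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
    return ll

# Grid search over dip for the most-likely plane.
grid = [5.0 + 0.25 * i for i in range(81)]  # 5 to 25 degrees
best_dip = max(grid, key=log_likelihood)
```

With these synthetic events (generated near a 15° dip), the maximum-likelihood search recovers a dip of about 15°.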

  20. Procedural knowledge

    NASA Technical Reports Server (NTRS)

    Georgeff, Michael P.; Lansky, Amy L.

    1986-01-01

    Much of commonsense knowledge about the real world is in the form of procedures or sequences of actions for achieving particular goals. In this paper, a formalism is presented for representing such knowledge using the notion of process. A declarative semantics for the representation is given, which allows a user to state facts about the effects of doing things in the problem domain of interest. An operational semantics is also provided, which shows how this knowledge can be used to achieve particular goals or to form intentions regarding their achievement. Given both semantics, the formalism additionally serves as an executable specification language suitable for constructing complex systems. A system based on this formalism is described, and examples involving control of an autonomous robot and fault diagnosis for NASA's Space Shuttle are provided.

  1. Knowledge River

    ERIC Educational Resources Information Center

    Berry, John N., III

    2004-01-01

    One of the most promising of all diversity initiatives in library and information studies (LIS) is Knowledge River (KR) at the School of Information Resources and Library Science (SIRLS) at the University of Arizona, Tucson. Created and directed by Patricia A. Tarin, the program has already recruited some 42 students into the profession, 20 of…

  2. Working Knowledge.

    ERIC Educational Resources Information Center

    Beckett, David

    The resurgence of "lifelong learning" has renewed consideration of the nature of "working knowledge." Lifelong learning has many aspects, including construction and distribution of individuals' very self-hood, educational institutions' role in capturing informal experiences, and the juggling required between family and…

  3. The utility of evolutionary psychology for generating novel, specific, and a priori hypotheses about psychopathology in a parsimonious fashion: reply to Hankin (2013).

    PubMed

    Martel, Michelle M

    2013-11-01

    The comment of Hankin (2013) elucidated several strengths of the target article (Martel, 2013), in which I reviewed extant literature on sex differences in common childhood-onset externalizing and adolescent-onset internalizing disorders. Hankin also raised important questions about the utility of evolutionary psychological principles, particularly those of sexual selection, to generate novel, specific, and a priori hypotheses about sex differences in common forms of psychopathology. I acknowledge these points, and I contend that a metatheory derived from evolutionary psychological principles is quite useful in 2 ways. First, it provides a parsimonious framework for understanding sex differences across multiple levels of analysis (e.g., hormones, gene by environment interactions, dispositional traits, behavioral and emotional symptoms). Second, it provides a framework for the generation of novel, specific, and a priori hypotheses such as those elucidated in my review. Existing disorder-specific theories cannot do as well in serving these functions. The pursuit of metatheories that both organize existing findings and generate novel cross-disorder hypotheses is crucial for ongoing progress in psychological science. Evolutionary psychology provides one such fruitful metatheory. © 2013 American Psychological Association

  4. Combined use of a priori data for fast system self-calibration of a non-rigid multi-camera fringe projection system

    NASA Astrophysics Data System (ADS)

    Stavroulakis, Petros I.; Chen, Shuxiao; Sims-Waterhouse, Danny; Piano, Samanta; Southon, Nicholas; Bointon, Patrick; Leach, Richard

    2017-06-01

    In non-rigid fringe projection 3D measurement systems, where either the camera or projector setup can change significantly between measurements or the object needs to be tracked, self-calibration has to be carried out frequently to keep the measurements accurate. In fringe projection systems, it is common to use methods developed initially for photogrammetry for the calibration of the camera(s) in the system in terms of extrinsic and intrinsic parameters. To calibrate the projector(s), an extra correspondence between a pre-calibrated camera and an image created by the projector is performed. These recalibration steps are usually time consuming and involve the measurement of calibrated patterns on planes, before the actual object can continue to be measured after a motion of a camera or projector has been introduced in the setup and hence do not facilitate fast 3D measurement of objects when frequent experimental setup changes are necessary. By employing and combining a priori information via inverse rendering, on-board sensors, deep learning and leveraging a graphics processor unit (GPU), we assess a fine camera pose estimation method which is based on optimising the rendering of a model of a scene and the object to match the view from the camera. We find that the success of this calibration pipeline can be greatly improved by using adequate a priori information from the aforementioned sources.

  5. A simulation study of the variability of indocyanine green kinetics and using structural a priori information in dynamic contrast enhanced diffuse optical tomography (DCE-DOT)

    NASA Astrophysics Data System (ADS)

    Burcin Unlu, Mehmet; Birgul, Ozlem; Gulsen, Gultekin

    2008-06-01

    We investigated (1) the variability of indocyanine green (ICG) kinetics between different cases in the presence of random noise, with changes in the size of the imaging region and the location and size of the inclusion, and (2) the use of structural a priori information to reduce that variability. We performed two-dimensional simulation studies for this purpose. In the simulations, we used a two-compartmental model to describe ICG transport and obtained pharmacokinetic parameters. The transfer constant and the rate constant showed a wide variation, i.e. 60% and 95%, respectively, when random Gaussian noise with a standard deviation of 1% in amplitude and 0.4° in phase was added to the data. Moreover, the recovered peak ICG concentration and the time to reach it were different between cases. When structural a priori information was used in the reconstructions, the variations in the transfer constant and the rate constant were reduced to 29% and 15%, respectively. As a result, although the recovered peak concentration was still case dependent, the variability of the shape of the kinetic curve was reduced.
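A two-compartment description of the kind used above can be sketched as a single ODE for tissue ICG driven by a decaying plasma input, dC/dt = K_trans·Cp(t) − k_ep·C. The rate constants and the mono-exponential plasma curve below are illustrative only, not the paper's values:

```python
import math

K_trans, k_ep = 0.2, 0.4  # transfer and rate constants, per minute (illustrative)
tau = 1.5                 # plasma clearance time constant, minutes (illustrative)

# Forward-Euler integration of dC/dt = K_trans * Cp(t) - k_ep * C
dt, T = 0.01, 30.0
c, t = 0.0, 0.0
curve = []
while t < T:
    cp = math.exp(-t / tau)          # mono-exponential plasma input
    c += dt * (K_trans * cp - k_ep * c)
    t += dt
    curve.append((t, c))

peak_t, peak_c = max(curve, key=lambda p: p[1])
```

The tissue curve rises to a peak a couple of minutes after injection and then washes out, which is the kinetic shape whose case-to-case variability the study quantifies.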

  6. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    PubMed

    Tavernier, Elsa; Giraudeau, Bruno

    2015-01-01

    We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power was < 60%, as compared with the 80% nominal power); 41%, 16% and 6%, respectively, were overpowered (i.e., with real power > 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
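The "retro-fitting" step can be sketched for a continuous outcome with a two-sample z-approximation: size the trial with an assumed standard deviation, then compute the power actually achieved under the true one. A stdlib-only sketch (n kept continuous for clarity; real trials round up):

```python
import math
from statistics import NormalDist

def sample_size(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-sample z-test of mean difference delta."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return 2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2

def real_power(delta, sigma_assumed, sigma_true, alpha=0.05, power=0.80):
    """Achieved power when the trial was sized with sigma_assumed but
    the true standard deviation is sigma_true (the retro-fit step)."""
    n = sample_size(delta, sigma_assumed, alpha, power)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(delta * math.sqrt(n / 2) / sigma_true - z_a)
```

For example, underestimating the standard deviation by 20% (sigma_true = 1.2 × assumed) drops the nominal 80% power to roughly 65%.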

  7. Comparison of a priori versus provisional heparin therapy on radial artery occlusion after transradial coronary angiography and patent hemostasis (from the PHARAOH Study).

    PubMed

    Pancholy, Samir B; Bertrand, Olivier F; Patel, Tejas

    2012-07-15

    Systemic anticoagulation decreases the risk of radial artery occlusion (RAO) after transradial catheterization and standard occlusive hemostasis. We compared the efficacy and safety of provisional heparin use only when the technique of patent hemostasis was not achievable to standard a priori heparin administration after radial sheath introduction. Patients referred for coronary angiography were randomized in 2 groups. In the a priori group, 200 patients received intravenous heparin (50 IU/kg) immediately after sheath insertion. In the provisional group, 200 patients did not receive heparin during the procedure. After sheath removal, hemostasis was obtained using a TR band (Terumo Corporation, Tokyo, Japan) with a plethysmography-guided patent hemostasis technique. In the provisional group, no heparin was given if radial artery patency could be obtained and maintained. If radial patency was not achieved, a bolus of heparin (50 IU/kg) was given. Radial artery patency was evaluated at 24 hours (early RAO) and 30 days after the procedure (late RAO) by plethysmography. Patent hemostasis was obtained in 67% in the a priori group and 74% in the provisional group (p = 0.10). Incidence of RAO remained similar in the 2 groups at the early (7.5% vs 7.0%, p = 0.84) and late (4.5% vs 5.0%, p = 0.83) evaluations. Women, patients with diabetes, patients having not received heparin, and patients without radial artery patency during hemostasis had more RAO. By multivariate analysis, patent radial artery during hemostasis (odds ratio [OR] 0.03, 95% confidence interval [CI] 0.004 to 0.28, p = 0.002) and diabetes (OR 11, 95% CI 3 to 38, p < 0.0001) were independent predictors of late RAO, whereas heparin was not (OR 0.45, 95% CI 0.13 to 1.54, p = 0.20). In conclusion, our results suggest that maintenance of radial artery patency during hemostasis is the most important parameter to decrease the risk of RAO. In selected cases, provisional use of heparin appears feasible and safe when

  8. Extending Wireless Rechargeable Sensor Network Life without Full Knowledge.

    PubMed

    Najeeb, Najeeb W; Detweiler, Carrick

    2017-07-17

    When extending the life of Wireless Rechargeable Sensor Networks (WRSN), one challenge is charging networks as they grow larger. Overcoming this limitation will render a WRSN more practical and highly adaptable to growth in the real world. Most charging algorithms require a priori full knowledge of sensor nodes' power levels in order to determine the nodes that require charging. In this work, we present a probabilistic algorithm that extends the life of scalable WRSN without a priori power knowledge and without full network exploration. We develop a probability bound on the power level of the sensor nodes and utilize this bound to make decisions while exploring a WRSN. We verify the algorithm by simulating a wireless power transfer unmanned aerial vehicle, and charging a WRSN to extend its life. Our results show that, without knowledge, our proposed algorithm extends the life of a WRSN on average 90% of what an optimal full knowledge algorithm can achieve. This means that the charging robot does not need to explore the whole network, which enables the scaling of WRSN. We analyze the impact of network parameters on our algorithm and show that it is insensitive to a large range of parameter values.
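The paper's probability bound is not reproduced here, but the decision it supports can be sketched: maintain a confidence lower bound on each unvisited node's power level and visit only when that bound drops below the charging threshold. A stand-in Gaussian-drain version (all parameter names and values are hypothetical, not the paper's model):

```python
import math
from statistics import NormalDist

def power_lower_bound(last_level, elapsed, mean_drain, sd_drain, confidence=0.95):
    """Confidence lower bound on a node's current power, assuming its
    per-unit-time drain is i.i.d. Normal(mean_drain, sd_drain) -- a
    stand-in for the paper's probability bound."""
    z = NormalDist().inv_cdf(confidence)
    return last_level - mean_drain * elapsed - z * sd_drain * math.sqrt(elapsed)

def should_visit(last_level, elapsed, mean_drain, sd_drain, threshold):
    """Visit (and possibly charge) only when the worst plausible power
    level has fallen below the charging threshold."""
    return power_lower_bound(last_level, elapsed, mean_drain, sd_drain) < threshold
```

Because the bound tightens with recent observations, the charging robot can skip nodes that almost certainly still hold charge, avoiding full network exploration.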

  9. Extending Wireless Rechargeable Sensor Network Life without Full Knowledge

    PubMed Central

    Najeeb, Najeeb W.; Detweiler, Carrick

    2017-01-01

    When extending the life of Wireless Rechargeable Sensor Networks (WRSN), one challenge is charging networks as they grow larger. Overcoming this limitation will render a WRSN more practical and highly adaptable to growth in the real world. Most charging algorithms require a priori full knowledge of sensor nodes’ power levels in order to determine the nodes that require charging. In this work, we present a probabilistic algorithm that extends the life of scalable WRSN without a priori power knowledge and without full network exploration. We develop a probability bound on the power level of the sensor nodes and utilize this bound to make decisions while exploring a WRSN. We verify the algorithm by simulating a wireless power transfer unmanned aerial vehicle, and charging a WRSN to extend its life. Our results show that, without knowledge, our proposed algorithm extends the life of a WRSN on average 90% of what an optimal full knowledge algorithm can achieve. This means that the charging robot does not need to explore the whole network, which enables the scaling of WRSN. We analyze the impact of network parameters on our algorithm and show that it is insensitive to a large range of parameter values. PMID:28714936

  10. A priori calculations of pKa's for organic compounds in water. The pKa of ethane

    SciTech Connect

    Jorgensen, W.L.; Briggs, J.M.; Gao, J.

    1987-10-28

    The enduring fascination of organic chemists with acidities and basicities reflects the fundamental importance of these concepts in understanding organic reactivity. Developing scales of aqueous acidities for weak organic acids is challenging in view of the need for extrapolations from organic solvents to water, ion-pairing and aggregation effects for organometallic compounds, and the derivation of thermodynamic quantities from kinetic measurements. The problems are reflected in the experimental ranges for the pKa's of the simplest alkanes, methane and ethane, which cover from 40 to 60. In the present communication, the authors demonstrate how simulation methodology can be used to obtain a priori predictions for the relative pKa's of organic compounds in water. The first applications are for the pKa's of acetonitrile and ethane relative to methanethiol.
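The thermodynamic link behind such predictions: a simulated relative deprotonation free energy ΔΔG shifts the pKa from a reference acid's value by ΔΔG/(2.303 RT). The numbers below are illustrative only, not the paper's results:

```python
import math

R = 1.987e-3  # gas constant, kcal mol^-1 K^-1
T = 298.15    # temperature, K

def relative_pka(pka_ref, ddg_kcal):
    """pKa predicted from a reference acid's pKa and the simulated
    free-energy difference ddG (kcal/mol) for the relative
    deprotonation: dpKa = ddG / (2.303 RT)."""
    return pka_ref + ddg_kcal / (math.log(10) * R * T)

# Illustrative: a 27 kcal/mol free-energy difference relative to an
# acid of pKa 10.3 predicts a pKa near 30.
pka = relative_pka(10.3, 27.0)
```

At room temperature 2.303 RT is about 1.36 kcal/mol, so each 1.36 kcal/mol of computed free-energy difference shifts the predicted pKa by one unit.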

  11. Using a priori contrasts for multivariate repeated-measures ANOVA to analyze thermoregulatory responses of the dibbler (Parantechinus apicalis; Marsupialia, Dasyuridae).

    PubMed

    Withers, Philip C; Cooper, Christine E

    2011-01-01

    Physiological studies often involve the repeated measurement of individuals over a range of ordered categorical conditions, for example, varying ambient temperature. We illustrate here the use of a priori contrasts for multivariate repeated-measures ANOVA by analyzing the thermal responses of various physiological variables for a small marsupial, the dibbler (Parantechinus apicalis). Our analyses showed that dibblers conform closely to the Scholander-Irving model of endothermy. Body temperature was constant at low air temperatures, was 36.3 ± 0.24°C at thermoneutrality (30°C), and increased at 35°C. Metabolic rate decreased with increasing ambient temperature to a basal rate of 0.619 ± 0.036 mL O₂ g⁻¹ h⁻¹ at 30°C; it extrapolated closely to thermoneutral body temperature. Increased oxygen demand at lower ambient temperature was met by increased respiratory minute volume, achieved by increased respiratory frequency and tidal volume; oxygen extraction was constant at about 19%. Evaporative water loss and wet and dry thermal conductance increased markedly at high ambient temperatures but not sufficiently to maintain constant body temperature. Relative water economy was similar to that of other small marsupials, increasing linearly at lower air temperatures with a point of relative water economy of 20.3°C. We conclude that a priori contrasts provide a statistically appropriate and powerful analysis that can be used routinely to statistically describe the pattern of response of physiological variables to a categorical factor and are especially useful for repeated-measures ANOVA designs common to many physiological studies.
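An a priori polynomial contrast for a repeated-measures design reduces each subject's response profile to a single score, which is then tested against zero. A stdlib-only sketch with hypothetical data (the coefficients are the standard orthogonal polynomials for five ordered levels):

```python
import math
from statistics import mean, stdev

# Orthogonal polynomial contrast coefficients for 5 ordered levels.
LINEAR = [-2, -1, 0, 1, 2]
QUADRATIC = [2, -1, -2, -1, 2]

def contrast_t(data, coeffs):
    """One-sample t statistic for a within-subject a priori contrast:
    each row is one subject measured at every level."""
    scores = [sum(c * x for c, x in zip(coeffs, row)) for row in data]
    return mean(scores) / (stdev(scores) / math.sqrt(len(scores)))

# Hypothetical metabolic rates for 6 subjects at 5 air temperatures,
# falling roughly linearly with temperature.
data = [[5.1, 4.2, 3.0, 2.1, 1.0],
        [4.8, 3.9, 3.1, 1.9, 1.2],
        [5.3, 4.0, 2.9, 2.2, 0.9],
        [5.0, 4.1, 3.2, 2.0, 1.1],
        [4.9, 4.3, 2.8, 2.1, 1.0],
        [5.2, 3.8, 3.0, 1.8, 1.2]]

t_lin = contrast_t(data, LINEAR)
t_quad = contrast_t(data, QUADRATIC)
```

For this synthetic data the linear contrast is highly significant while the quadratic one is not, which is exactly the kind of "pattern of response" statement the abstract argues a priori contrasts make possible.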

  12. Feasibility of improving a priori regional climate model estimates of Greenland ice sheet surface mass loss through assimilation of measured ice surface temperatures

    NASA Astrophysics Data System (ADS)

    Navari, M.; Margulis, S. A.; Bateni, S. M.; Tedesco, M.; Alexander, P.; Fettweis, X.

    2016-01-01

    The Greenland ice sheet (GrIS) has been the focus of climate studies due to its considerable impact on sea level rise. Accurate estimates of surface mass fluxes would contribute to understanding the cause of its recent changes and would help to better estimate the past, current and future contribution of the GrIS to sea level rise. Though the estimates of the GrIS surface mass fluxes have improved significantly over the last decade, there is still considerable disparity between the results from different methodologies (e.g., Rae et al., 2012; Vernon et al., 2013). The data assimilation approach can merge information from different methodologies in a consistent way to improve the GrIS surface mass fluxes. In this study, an ensemble batch smoother data assimilation approach was developed to assess the feasibility of generating a reanalysis estimate of the GrIS surface mass fluxes via integrating remotely sensed ice surface temperature measurements with a regional climate model (a priori) estimate. The performance of the proposed methodology for generating an improved posterior estimate was investigated within an observing system simulation experiment (OSSE) framework using synthetically generated ice surface temperature measurements. The results showed that assimilation of ice surface temperature time series were able to overcome uncertainties in near-surface meteorological forcing variables that drive the GrIS surface processes. Our findings show that the proposed methodology is able to generate posterior reanalysis estimates of the surface mass fluxes that are in good agreement with the synthetic true estimates. The results also showed that the proposed data assimilation framework improves the root-mean-square error of the posterior estimates of runoff, sublimation/evaporation, surface condensation, and surface mass loss fluxes by 61, 64, 76, and 62 %, respectively, over the nominal a priori climate model estimates.
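The ensemble update at the core of such a smoother can be sketched in scalar form: a prior ensemble of the flux state is corrected with a perturbed observation through the ensemble Kalman gain. The linear observation operator and all numbers below are a toy illustration, not the study's configuration:

```python
import random
from statistics import stdev

random.seed(0)

# Toy scalar ensemble update: a surface-mass-flux state is corrected
# with one ice-surface-temperature observation. Linear observation
# operator obs = 2 * state; all numbers are illustrative.
truth = 5.0
obs_err = 0.5
y_obs = 2.0 * truth + random.gauss(0.0, obs_err)

ens = [random.gauss(3.0, 2.0) for _ in range(500)]  # biased prior ensemble
pred = [2.0 * x for x in ens]                       # predicted observations

mx = sum(ens) / len(ens)
my = sum(pred) / len(pred)
cov_xy = sum((x - mx) * (y - my) for x, y in zip(ens, pred)) / (len(ens) - 1)
var_y = sum((y - my) ** 2 for y in pred) / (len(pred) - 1)
gain = cov_xy / (var_y + obs_err ** 2)

# Perturbed-observation update of every ensemble member.
post = [x + gain * (y_obs + random.gauss(0.0, obs_err) - y)
        for x, y in zip(ens, pred)]
post_mean = sum(post) / len(post)
```

The posterior ensemble mean moves from the biased prior toward the truth and its spread shrinks, which is how assimilating surface temperatures can correct errors in the meteorological forcing.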

  13. Measuring unconscious knowledge: distinguishing structural knowledge and judgment knowledge.

    PubMed

    Dienes, Zoltán; Scott, Ryan

    2005-06-01

    This paper investigates the dissociation between conscious and unconscious knowledge in an implicit learning paradigm. Two experiments employing the artificial grammar learning task explored the acquisition of unconscious and conscious knowledge of structure (structural knowledge). Structural knowledge was contrasted to knowledge of whether an item has that structure (judgment knowledge). For both structural and judgment knowledge, conscious awareness was assessed using subjective measures. It was found that unconscious structural knowledge could lead to both conscious and unconscious judgment knowledge. When structural knowledge was unconscious, there was no tendency for judgment knowledge to become more conscious over time. Furthermore, conscious rather than unconscious structural knowledge produced more consistent errors in judgments, was facilitated by instructions to search for rules, and after such instructions was harmed by a secondary task. The dissociations validate the use of these subjective measures of conscious awareness.

  14. Nursing knowledge development: where to from here?

    PubMed

    Geanellos, R

    1997-01-01

    Issues related to nursing epistemology are reviewed. This review includes discussion of logical positivism, empiricism and interpretive-emancipatory paradigms, their influence on the construction of knowledge and on its methods of derivation and verification. Changes in the conceptualisation of science are explored, and scientific realism is introduced as a contemporary philosophy of science through which the discipline of nursing can develop. Questions surrounding the development of nursing knowledge are examined; for example, the implications of theory construction through the use of borrowed theory and the acceptance of external philosophies of science. Argument is offered for and against borrowing external theories and philosophies, or developing theories and philosophies from research into nursing practice. The relationship between research method and the phenomenon under study is discussed. The need to develop a broad base of nursing knowledge through diverse research methods is addressed. Links are created between the development of non-practice-based theories, the derivation of knowledge a priori, and the poor use of nursing theory and research in nursing practice. It is suggested that nursing science should develop through a dialectic between nursing research and practice, and that such a dialectic could assist the forward movement of nursing through the evolution of meaningful nursing theories and philosophies of nursing science.

  15. Physician knowledge of nuclear medicine radiation exposure.

    PubMed

    Riley, Paul; Liu, Hongjie; Wilson, John D

    2013-01-01

    Because physician knowledge of patient exposure to ionizing radiation from computed tomography (CT) procedures previously has been recognized as poor, the purpose of this systematic review is to determine whether physician or physician trainee knowledge of patient exposure to radiation from nuclear medicine procedures is similarly insufficient. Online databases and printed literature were systematically searched to acquire peer-reviewed published research studies involving assessment of physician or physician trainee knowledge of patient radiation exposure levels incurred during nuclear medicine and CT procedures. A priori inclusion/exclusion criteria for study selection were used as a review protocol aimed at extracting information pertaining to participants, collection methods, comparisons within studies, outcomes, and study design. Fourteen studies from 8 countries were accepted into the review and revealed similar insufficiencies in physician knowledge of nuclear medicine and CT patient radiation exposures. Radiation exposure estimates for both modalities similarly featured a strong tendency toward physician underestimation. Comparisons were made and ratios established between physician estimates of patient radiation exposure from nuclear medicine procedures and estimates of CT procedures. A theoretical median of correct physician exposure estimates was used to examine factors affecting lower and higher estimates. The tendency for ordering physicians to underestimate patient radiation exposures from nuclear medicine and CT procedures could lead to their overuse and contribute to increasing the public's exposure to ionizing radiation.

  16. Constructing Knowledge

    NASA Astrophysics Data System (ADS)

    Blanton, Patricia

    2003-02-01

    Schools are expected to lay the foundation upon which knowledge can be built and equip students with the tools necessary to accomplish the construction. The role of the teacher in this building process is crucial to the type of structure the student can build. Whether you call it constructivism, discussion teaching, project-based learning, inquiry learning, or any of the other names given to the instructional strategies being suggested by education researchers, the key is getting students to become active participants in the process. While some students may be able to learn from eloquently delivered lectures and dynamic demonstrations, the majority of students cannot effectively retain and apply ideas communicated in this manner.

  17. Knowledge-based image bandwidth compression and enhancement

    NASA Astrophysics Data System (ADS)

    Saghri, John A.; Tescher, Andrew G.

    1987-01-01

    Techniques for incorporating a priori knowledge in the digital coding and bandwidth compression of image data are described and demonstrated. An algorithm for identifying and highlighting thin lines and point objects prior to coding is presented, and the precoding enhancement of a slightly smoothed version of the image is shown to be more effective than enhancement of the original image. Also considered are readjustment of the local distortion parameter and variable-block-size coding. The line-segment criteria employed in the classification are listed in a table, and sample images demonstrating the effectiveness of the enhancement techniques are presented.

  18. Star Identification Without Attitude Knowledge: Testing with X-Ray Timing Experiment Data

    NASA Technical Reports Server (NTRS)

    Ketchum, Eleanor

    1997-01-01

    As the budget for the scientific exploration of space shrinks, the need for more autonomous spacecraft increases. For a spacecraft with a star tracker, the ability to determine attitude autonomously from a lost-in-space state requires the capability to identify the stars in the field of view of the tracker. Although there have been efforts to produce autonomous star trackers which perform this function internally, many programs cannot afford these sensors. The author previously presented a method for identifying stars without a priori attitude knowledge specifically targeted for onboard computers, as it minimizes the necessary computer storage. The method has previously been tested with simulated data. This paper provides results of star identification without a priori attitude knowledge using flight data from two 8 by 8 degree charge-coupled device star trackers onboard the X-Ray Timing Experiment.
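The core of lost-in-space identification is matching measured inter-star angles against a stored catalog of pair angles, which is why catalog size (not imagery) dominates onboard storage. A toy three-star version, with hypothetical catalog vectors rather than any flight catalog:

```python
import math

def angle_deg(v1, v2):
    """Angular separation in degrees between two unit vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

# Tiny hypothetical catalog of star unit vectors.
catalog = {
    "A": (1.0, 0.0, 0.0),
    "B": (0.9848, 0.1736, 0.0),  # about 10 deg from A
    "C": (0.9397, 0.0, 0.342),   # about 20 deg from A
}

# Precomputed pair-angle table -- the part that is actually stored.
pairs = {frozenset((i, j)): angle_deg(u, v)
         for i, u in catalog.items() for j, v in catalog.items() if i < j}

def match_pair(measured_angle, tol=0.1):
    """Candidate catalog pairs whose separation matches a measured
    inter-star angle within tolerance."""
    return [set(p) for p, a in pairs.items() if abs(a - measured_angle) <= tol]
```

With several stars in the field of view, intersecting the candidate lists for each measured pair typically leaves a unique identification without any attitude prior.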

  20. Identification of dietary patterns associated with obesity in a nationally representative survey of Canadian adults: application of a priori, hybrid, and simplified dietary pattern techniques.

    PubMed

    Jessri, Mahsa; Wolfinger, Russell D; Lou, Wendy Y; L'Abbé, Mary R

    2017-03-01

    Background: Analyzing the effects of dietary patterns is an important approach for examining the complex role of nutrition in the etiology of obesity and chronic diseases. Objectives: The objectives of this study were to characterize the dietary patterns of Canadians with the use of a priori, hybrid, and simplified dietary pattern techniques, and to compare the associations of these patterns with obesity risk in individuals with and without chronic diseases (unhealthy and healthy obesity). Design: Dietary recalls from 11,748 participants (≥18 y of age) in the cross-sectional, nationally representative Canadian Community Health Survey 2.2 were used. A priori dietary pattern was characterized with the use of the previously validated 2015 Dietary Guidelines for Americans Adherence Index (DGAI). Weighted partial least squares (hybrid method) was used to derive an energy-dense (ED), high-fat (HF), low-fiber density (LFD) dietary pattern with the use of 38 food groups. The associations of derived dietary patterns with disease outcomes were then tested with the use of multinomial logistic regression. Results: An ED, HF, and LFD dietary pattern had high positive loadings for fast foods, carbonated drinks, and refined grains, and high negative loadings for whole fruits and vegetables (≥|0.17|). Food groups with a high loading were summed to form a simplified dietary pattern score. Moving from the first (healthiest) to the fourth (least healthy) quartiles of the ED, HF, and LFD pattern and the simplified dietary pattern scores was associated with increasingly elevated ORs for unhealthy obesity, with individuals in quartile 4 having an OR of 2.57 (95% CI: 1.75, 3.76) and 2.73 (95% CI: 1.88, 3.98), respectively (P-trend < 0.0001). Individuals who adhered the most to the 2015 DGAI recommendations (quartile 4) had a 53% lower OR of unhealthy obesity (P-trend < 0.0001). The associations of dietary patterns with healthy obesity and unhealthy nonobesity were weaker, albeit
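The simplified dietary pattern score described above, a signed sum over the food groups that loaded at least |0.17|, can be sketched as follows (the group names and signs are illustrative stand-ins for the paper's 38 groups):

```python
# High-loading food groups with the sign of their loading (illustrative).
HIGH_LOADING = {"fast_food": 1, "carbonated_drinks": 1,
                "refined_grains": 1, "whole_fruit": -1, "vegetables": -1}

def simplified_score(servings):
    """Signed sum of daily servings over the high-loading groups."""
    return sum(sign * servings.get(group, 0.0)
               for group, sign in HIGH_LOADING.items())

def quartile(score, all_scores):
    """Quartile rank within the sample: 1 (healthiest) to 4."""
    below = sum(s < score for s in all_scores)
    return min(4, below * 4 // len(all_scores) + 1)
```

The appeal of the simplified score is that it needs only a handful of food-group intakes, yet in the study it tracked the full partial-least-squares pattern closely enough to show a comparable obesity gradient across quartiles.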

  1. Knowledge Management: An Introduction.

    ERIC Educational Resources Information Center

    Mac Morrow, Noreen

    2001-01-01

    Discusses issues related to knowledge management and organizational knowledge. Highlights include types of knowledge; the knowledge economy; intellectual capital; knowledge and learning organizations; knowledge management strategies and processes; organizational culture; the role of technology; measuring knowledge; and the role of the information…

  2. A coupled radiative transfer and diffusion approximation model for the solution of the forward problem and the a-priori fluorophore distribution estimation in fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Gorpas, D.; Yova, D.; Politopoulos, K.

    2009-02-01

    Although fluorescence imaging has been applied to tumour diagnosis since the early 1990s, only in the last few years has it attracted increasing scientific interest, owing to advances in the biophotonics field and the combined technological progress of acquisition and computational systems. In addition, there are expectations that fluorescence imaging will be further developed and applied to deep tumour diagnosis in the years to come. However, this evolving field of imaging science still has important challenges to overcome. Among them is the formulation of an accurate forward model for the solution of the reconstruction problem. The scope of this work is to introduce a three-dimensional coupled radiative transfer and diffusion approximation model applicable to fluorescence imaging. Furthermore, the solver incorporates super-ellipsoid models and sophisticated image-processing algorithms to additionally provide an a priori estimation of the fluorophore distribution, information that is very important for the solution of the inverse problem. Simulation experiments have shown that the proposed methodology preserves the accuracy of the radiative transfer equation and the time efficiency of the diffusion approximation, while at the same time achieving considerable success in the registration between acquired and simulated images.

  3. Repeated evolution of net venation and fleshy fruits among monocots in shaded habitats confirms a priori predictions: evidence from an ndhF phylogeny

    PubMed Central

    Givnish, Thomas J; Pires, J. Chris; Graham, Sean W; McPherson, Marc A; Prince, Linda M; Patterson, Thomas B; Rai, Hardeep S; Roalson, Eric H; Evans, Timothy M; Hahn, William J; Millam, Kendra C; Meerow, Alan W; Molvray, Mia; Kores, Paul J; O'Brien, Heath E; Hall, Jocelyn C; Kress, W. John; Sytsma, Kenneth J

    2005-01-01

    We present a well-resolved, highly inclusive phylogeny for monocots, based on ndhF sequence variation, and use it to test a priori hypotheses that net venation and vertebrate-dispersed fleshy fruits should undergo concerted convergence, representing independent but often concurrent adaptations to shaded conditions. Our data demonstrate that net venation arose at least 26 times and was lost eight times over the past 90 million years; fleshy fruits arose at least 21 times and disappeared 11 times. Both traits show a highly significant pattern of concerted convergence (p < 10⁻⁹), arising 16 times and disappearing four times in tandem. This phenomenon appears driven by even stronger tendencies for both traits to evolve in shade and be lost in open habitats (p < 10⁻¹³–10⁻²⁹). These patterns are among the strongest ever demonstrated for evolutionary convergence in individual traits and the predictability of evolution, and the strongest evidence yet uncovered for concerted convergence. The rate of adaptive shifts per taxon has declined exponentially over the past 90 million years, as expected when large-scale radiations fill adaptive zones. PMID:16011923

  4. A Priori Analysis of Subgrid Mass Flux Vectors from Massively Parallel Direct Numerical Simulations of High Pressure H2/O2 Reacting Shear Layers

    NASA Astrophysics Data System (ADS)

    Foster, Justin; Miller, Richard

    2011-11-01

    Direct Numerical Simulations (DNS) are conducted for temporally developing reacting H2/O2 shear layers at an ambient pressure of 100 atm. The compressible form of the governing equations is coupled with the Peng-Robinson real-gas equation of state and solved using eighth-order central finite differences and fourth-order Runge-Kutta time integration with resolutions up to ~3/4 billion grid points. The formulation includes a detailed pressure-dependent kinetics mechanism with 8 species and 19 steps, detailed property models, and generalized forms of the multicomponent heat and mass diffusion vectors derived from nonequilibrium thermodynamics and fluctuation theory. The DNS is performed over a range of Reynolds numbers up to 4500 based on the free-stream velocity difference and initial vorticity thickness. The results are then analyzed in an a priori manner to illustrate the role of the subgrid mass flux vector within the filtered form of the governing equations relevant to Large Eddy Simulations. The subgrid mass flux vector is found to be a significant term, particularly within localized regions of the flame. Research supported by NSF Grant CBET-0965624 and Clemson University's Palmetto Cluster.
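
    The a priori analysis described above can be illustrated with a minimal one-dimensional sketch in which fully resolved fields are filtered and the subgrid mass flux is measured directly. The synthetic fields, filter width, and top-hat filter below are illustrative assumptions, not the study's configuration:

```python
import numpy as np

def box_filter(f, w):
    """Top-hat (box) filter of width w grid points, periodic boundaries."""
    fp = np.concatenate([f[-w:], f, f[:w]])          # periodic padding
    return np.convolve(fp, np.ones(w) / w, mode="same")[w:-w]

# Synthetic 1D "DNS" fields: density, velocity, species mass fraction
rng = np.random.default_rng(0)
n, w = 512, 16
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
rho = 1.0 + 0.3 * np.sin(3 * x) + 0.05 * rng.standard_normal(n)
u = np.sin(x) + 0.1 * rng.standard_normal(n)
Y = 0.5 + 0.4 * np.cos(2 * x) + 0.05 * rng.standard_normal(n)

# Favre (density-weighted) filtered quantities: tilde(f) = bar(rho*f)/bar(rho)
rho_bar = box_filter(rho, w)
u_tilde = box_filter(rho * u, w) / rho_bar
Y_tilde = box_filter(rho * Y, w) / rho_bar

# Subgrid mass flux: exactly the term the filtered equations leave unclosed
tau = box_filter(rho * u * Y, w) - rho_bar * u_tilde * Y_tilde
print(f"max |tau| = {np.abs(tau).max():.4f}")
```

    In a real a priori study the same subtraction is evaluated on 3-D DNS data and compared term by term against candidate subgrid models.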

  5. Advancing techniques to constrain the geometry of the seismic rupture plane on subduction interfaces a priori: Higher-order functional fits

    USGS Publications Warehouse

    Hayes, G.P.; Wald, D.J.; Keranen, K.

    2009-01-01

    Ongoing developments in earthquake source inversions incorporate nonplanar fault geometries as inputs to the inversion process, improving previous approaches that relied solely on planar fault surfaces. This evolution motivates advancing the existing framework for constraining fault geometry, particularly in subduction zones where plate boundary surfaces that host highly hazardous earthquakes are clearly nonplanar. Here, we improve upon the existing framework for the constraint of the seismic rupture plane of subduction interfaces by incorporating active seismic and seafloor sediment thickness data with existing independent data sets and inverting for the most probable nonplanar subduction geometry. Constraining the rupture interface a priori with independent geological and seismological information reduces the uncertainty in the derived earthquake source inversion parameters over models that rely on simpler assumptions, such as the moment-tensor-inferred fault plane. Examples are shown for a number of well-constrained global locations. We expand the coverage of previous analyses to a more uniform global data set and show that even in areas of sparse data this approach is able to accurately constrain the approximate subduction geometry, particularly when aided with the addition of data from local active seismic surveys. In addition, we show an example of the integration of many two-dimensional profiles into a three-dimensional surface for the Sunda subduction zone and introduce the development of a new global three-dimensional subduction interface model: Slab1.0. © 2009 by the American Geophysical Union.
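
    The gain from higher-order functional fits over planar fault surfaces can be sketched with a toy profile; the synthetic slab geometry and polynomial degrees below are illustrative assumptions, not the Slab1.0 procedure:

```python
import numpy as np

# Hypothetical depth profile of a subduction interface: depth (km, negative
# down) versus trench-perpendicular distance (km), steepening downdip.
dist = np.linspace(0.0, 300.0, 31)
depth = -(5.0 + 0.08 * dist + 6e-4 * dist**2)

# Planar (degree-1) fit versus a higher-order (degree-3) fit
res_planar = depth - np.polyval(np.polyfit(dist, depth, 1), dist)
res_cubic = depth - np.polyval(np.polyfit(dist, depth, 3), dist)

rms_planar = float(np.sqrt(np.mean(res_planar**2)))
rms_cubic = float(np.sqrt(np.mean(res_cubic**2)))
print(f"RMS misfit, planar fit: {rms_planar:.2f} km")
print(f"RMS misfit, cubic fit:  {rms_cubic:.3f} km")
```

    A planar surface leaves kilometre-scale misfit on a curving interface that a higher-order fit removes, which is the motivation for nonplanar geometry inputs to source inversions.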

  6. Gathering Knowledge for Your Knowledge Management System.

    ERIC Educational Resources Information Center

    Cowley-Durst, Barbara

    1999-01-01

    Discusses knowledge management that seeks to minimize information overload in order to enhance performance. Highlights include the differences between data, information, and knowledge; the relationship between learning, knowledge, and performance; the use of focus groups; documenting results; and knowledge classification. (LRW)

  7. A priori and a posteriori investigations for developing large eddy simulations of multi-species turbulent mixing under high-pressure conditions

    SciTech Connect

    Borghesi, Giulio; Bellan, Josette

    2015-03-15

    A Direct Numerical Simulation (DNS) database was created representing mixing of species under high-pressure conditions. The configuration considered is that of a temporally evolving mixing layer. The database was examined and analyzed for the purpose of modeling some of the unclosed terms that appear in the Large Eddy Simulation (LES) equations. Several metrics are used to understand the LES modeling requirements. First, a statistical analysis of the DNS-database large-scale flow structures was performed to provide a metric for probing the accuracy of the proposed LES models, as the flow fields obtained from accurate LESs should contain structures of morphology statistically similar to those observed in the filtered-and-coarsened DNS (FC-DNS) fields. To characterize the morphology of the large-scale structures, the Minkowski functionals of the iso-surfaces were evaluated for two different fields: the second invariant of the rate-of-deformation tensor and the irreversible entropy production rate. To remove the presence of the small flow scales, both of these fields were computed using the FC-DNS solutions. It was found that the large-scale structures of the irreversible entropy production rate exhibit higher morphological complexity than those of the second invariant of the rate-of-deformation tensor, indicating that the burden of modeling will be on recovering the thermodynamic fields. Second, to evaluate the physical effects which must be modeled at the subfilter scale, an a priori analysis was conducted. This a priori analysis, conducted in the coarse-grid LES regime, revealed that standard closures for the filtered pressure, the filtered heat flux, and the filtered species mass fluxes, in which a filtered function of a variable is equal to the function of the filtered variable, may no longer be valid for the high-pressure flows considered in this study. The terms requiring modeling are the filtered pressure, the filtered heat flux, the filtered pressure work

  8. Genetic basis of delay discounting in frequent gamblers: examination of a priori candidates and exploration of a panel of dopamine-related loci

    PubMed Central

    Gray, Joshua C; MacKillop, James

    2014-01-01

    Introduction Delay discounting is a behavioral economic index of impulsivity that reflects preferences for small immediate rewards relative to larger delayed rewards. It has been consistently linked to pathological gambling and other forms of addictive behavior, and has been proposed to be a behavioral characteristic that may link genetic variation and risk of developing addictive disorders (i.e., an endophenotype). Studies to date have revealed significant associations with polymorphisms associated with dopamine neurotransmission. The current study examined associations between delay discounting and both previously linked variants and a novel panel of dopamine-related variants in a sample of frequent gamblers. Methods Participants were 175 weekly gamblers of European ancestry who completed the Monetary Choice Questionnaire to assess delay discounting preferences and provided DNA via saliva. Results In a priori tests, two loci previously associated with delayed reward discounting (rs1800497 and rs4680) were not replicated; however, the long form of the DRD4 VNTR was significantly associated with lower discounting of delayed rewards. Exploratory analysis of the dopamine-related panel revealed 11 additional significant associations in genes associated with dopamine synthesis, breakdown, reuptake, and receptor function (DRD3, SLC6A3, DDC, DBH, and SLC18A2). An aggregate genetic risk score from the nominally significant loci accounted for 17% of the variance in discounting. Mediational analyses largely supported the presence of indirect effects between the associated loci, delay discounting, and pathological gambling severity. Conclusions These findings do not replicate previously reported associations but identify several novel candidates and provide preliminary support for a systems biology approach to understand the genetic basis of delay discounting. PMID:25365808
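
    Delay discounting of the kind measured by the Monetary Choice Questionnaire is commonly summarized with Mazur's hyperbolic model, V = A / (1 + kD), where higher k means steeper discounting. A minimal sketch (the amounts, delays, and k values are illustrative, not taken from the study):

```python
def hyperbolic_value(amount, delay_days, k):
    """Mazur's hyperbolic discounting: subjective value of a delayed reward."""
    return amount / (1.0 + k * delay_days)

def prefers_immediate(immediate, delayed, delay_days, k):
    """True if the immediate reward outweighs the discounted delayed one."""
    return immediate > hyperbolic_value(delayed, delay_days, k)

# A steep discounter (high k) takes $54 now over $80 in 30 days;
# a shallow discounter (low k) waits for the larger reward.
print(prefers_immediate(54, 80, 30, k=0.05))   # True: 80/(1+1.5) = 32 < 54
print(prefers_immediate(54, 80, 30, k=0.005))  # False: 80/1.15 ~ 69.6 > 54
```

    Fitting k to a participant's pattern of such choices yields the discounting index that the study relates to genetic variants.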

  9. Chromosomal microarray analysis as a first-line test in pregnancies with a priori low risk for the detection of submicroscopic chromosomal abnormalities

    PubMed Central

    Fiorentino, Francesco; Napoletano, Stefania; Caiazzo, Fiorina; Sessa, Mariateresa; Bono, Sara; Spizzichino, Letizia; Gordon, Anthony; Nuccitelli, Andrea; Rizzo, Giuseppe; Baldi, Marina

    2013-01-01

    In this study, we aimed to explore the utility of chromosomal microarray analysis (CMA) in groups of pregnancies with a priori low risk for detection of submicroscopic chromosome abnormalities, usually not considered an indication for testing, in order to assess whether CMA improves the detection rate of prenatal chromosomal aberrations. A total of 3000 prenatal samples were processed in parallel using both whole-genome CMA and conventional karyotyping. The indications for prenatal testing included: advanced maternal age, maternal serum screening test abnormality, abnormal ultrasound findings, known abnormal fetal karyotype, parental anxiety, family history of a genetic condition and cell culture failure. The use of CMA resulted in an increased detection rate regardless of the indication for analysis. This was evident in high risk groups (abnormal ultrasound findings and abnormal fetal karyotype), in which the percentage of detection was 5.8% (7/120), and also in low risk groups, such as advanced maternal age (6/1118, 0.5%), and parental anxiety (11/1674, 0.7%). A total of 24 (0.8%) fetal conditions would have remained undiagnosed if only a standard karyotype had been performed. Importantly, 17 (0.6%) of such findings would have otherwise been overlooked if CMA was offered only to high risk pregnancies. The results of this study suggest that more widespread CMA testing of fetuses would result in a higher detection of clinically relevant chromosome abnormalities, even in low risk pregnancies. Our findings provide substantial evidence for the introduction of CMA as a first-line diagnostic test for all pregnant women undergoing invasive prenatal testing, regardless of risk factors. PMID:23211699

  10. Knowledge management: organizing nursing care knowledge.

    PubMed

    Anderson, Jane A; Willson, Pamela

    2009-01-01

    Almost everything we do in nursing is based on our knowledge. In 1984, Benner (From Novice to Expert: Excellence and Power in Clinical Nursing Practice. Menlo Park, CA: Addison-Wesley; 1984) described nursing knowledge as the culmination of practical experience and evidence from research, which over time becomes the "know-how" of clinical experience. This "know-how" knowledge asset is dynamic and initially develops in the novice critical care nurse, expands within competent and proficient nurses, and is actualized in the expert intensive care nurse. Collectively, practical "know-how" and investigational (evidence-based) knowledge culminate into the "knowledge of caring" that defines the profession of nursing. The purpose of this article is to examine the concept of knowledge management as a framework for identifying, organizing, analyzing, and translating nursing knowledge into daily practice. Knowledge management is described in a model case and implemented in a nursing research project.

  11. Overcoming knowledge stickiness in scientific knowledge transfer.

    PubMed

    Blackman, Deborah; Benson, Angela M

    2012-07-01

    This paper explores the transfer and dissemination of knowledge between scientists, the volunteers who collect the knowledge and the communities which learn from it in order to implement change. The role of knowledge "stickiness" in the reduction of knowledge transfer is outlined. The characteristics of the knowledge and the situation combine to develop a range of factors, "stickiness predictors," which can deter knowledge transfer. These stickiness predictors are used to analyse data gathered from three qualitative cases, which were developed from both participant observation and semi-structured interviews studying the interactions between the scientists, volunteers and organisations. A reconsideration of the way that knowledge and knowledge transfer are being conceptualised by scientists is proposed, in order to enable "stickiness" factors to be recognised and managed, thereby increasing the potential for scientific literacy. A move towards a more broadly constituted community of practice is proposed.

  12. The Knowledge Warehouse: Reusing Knowledge Components.

    ERIC Educational Resources Information Center

    Yacci, Michael

    1999-01-01

    Currently there is little knowledge reuse across training, documentation, and performance support. Knowledge-based materials developed for one purpose are not shared or reused in others. The Knowledge Warehouse, a conceptual solution to this problem, is discussed. Benefits and limitations are outlined and a definition of a standardized…

  13. Knowledge Management, Codification and Tacit Knowledge

    ERIC Educational Resources Information Center

    Kimble, Chris

    2013-01-01

    Introduction: This article returns to a theme addressed in Vol. 8(1) October 2002 of the journal: knowledge management and the problem of managing tacit knowledge. Method: The article is primarily a review and analysis of the literature associated with the management of knowledge. In particular, it focuses on the works of a group of economists who…

  14. A general method for sifting linguistic knowledge from structured terminologies.

    PubMed Central

    Grabar, N.; Zweigenbaum, P.

    2000-01-01

    Morphological knowledge is useful for medical language processing, information retrieval and terminology or ontology development. We show how a large volume of morphological associations between words can be learnt from existing medical terminologies by taking advantage of the semantic relations already encoded between terms in these terminologies: synonymy, hierarchy and transversal relations. The method proposed relies on no a priori linguistic knowledge. Since it can work with different relations between terms, it can be applied to any structured terminology. Tested on SNOMED and ICD in French and English, it identifies fairly reliable morphological relations (precision > 90%) with good coverage (over 88% compared to the UMLS lexical variant generation program). For English words with a stem longer than 3 characters, recall reaches 98.8% for inflection and 94.7% for derivation. PMID:11079895
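
    The flavor of the approach, harvesting word pairs from terms that the terminology already links, can be sketched as follows; the synonym pairs and the shared-prefix heuristic are illustrative assumptions and deliberately cruder than the paper's method:

```python
# Hypothetical synonym pairs as they might appear in a structured terminology
SYNONYM_PAIRS = [
    ("hepatic disease", "disease of liver"),
    ("cardiac arrest", "arrest of the heart"),
    ("gastric ulcer", "ulcers of the stomach"),
]

def common_prefix_len(a, b):
    """Length of the longest shared prefix of two words."""
    n = 0
    while n < min(len(a), len(b)) and a[n] == b[n]:
        n += 1
    return n

def candidate_relations(pairs, min_stem=4):
    """Propose morphological relations between distinct words of linked
    terms that share a prefix of at least min_stem characters; no a priori
    linguistic knowledge (affix lists, grammars) is used."""
    found = set()
    for t1, t2 in pairs:
        for w1 in t1.split():
            for w2 in t2.split():
                if w1 != w2 and common_prefix_len(w1, w2) >= min_stem:
                    found.add((w1, w2))
    return found

print(candidate_relations(SYNONYM_PAIRS))  # harvests the ulcer/ulcers pair
```

    Because the candidates come from terms the terminology asserts are related, even this naive filter surfaces mostly genuine inflections and derivations rather than accidental string overlaps.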

  15. Doctoring the Knowledge Worker

    ERIC Educational Resources Information Center

    Tennant, Mark

    2004-01-01

    In this paper I examine the impact of the new 'knowledge economy' on contemporary doctoral education. I argue that the knowledge economy promotes a view of knowledge and knowledge workers that fundamentally challenges the idea of a university as a community of autonomous scholars transmitting and adding to society's 'stock of knowledge'. The paper…

  17. Extensible knowledge-based architecture for segmenting CT data

    NASA Astrophysics Data System (ADS)

    Brown, Matthew S.; McNitt-Gray, Michael F.; Goldin, Jonathan G.; Aberle, Denise R.

    1998-06-01

    A knowledge-based system has been developed for segmenting computed tomography (CT) images. Its modular architecture includes an anatomical model, image processing engine, inference engine and blackboard. The model contains a priori knowledge of size, shape, X-ray attenuation and relative position of anatomical structures. This knowledge is used to constrain low-level segmentation routines. Model-derived constraints and segmented image objects are both transformed into a common feature space and posted on the blackboard. The inference engine then matches image to model objects, based on the constraints. The transformation to feature space allows the knowledge and image data representations to be independent. Thus a high-level model can be used, with data being stored in a frame-based semantic network. This modularity and explicit representation of knowledge allows for straightforward system extension. We initially demonstrate an application to lung segmentation in thoracic CT, with subsequent extension of the knowledge-base to include tumors within the lung fields. The anatomical model was later augmented to include basic brain anatomy including the skull and blood vessels, to allow automatic segmentation of vascular structures in CT angiograms for 3D rendering and visualization.
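
    The matching step, comparing features of segmented image objects against model-derived constraints in a shared feature space, might look roughly like this; the anatomical objects, features, and ranges are invented for illustration:

```python
# A priori anatomical model: hypothetical feature ranges for each object
# (Hounsfield-unit attenuation and volume constraints)
MODEL = {
    "lung": {"attenuation_hu": (-1000, -400), "volume_ml": (2000, 7000)},
    "tumor": {"attenuation_hu": (-100, 200), "volume_ml": (1, 500)},
}

def match(region, model=MODEL):
    """Label a segmented region with the first model object whose a priori
    feature ranges all contain the region's measured features."""
    for name, constraints in model.items():
        if all(lo <= region[feat] <= hi for feat, (lo, hi) in constraints.items()):
            return name
    return "unknown"

print(match({"attenuation_hu": -850, "volume_ml": 3000}))  # -> lung
print(match({"attenuation_hu": 40, "volume_ml": 12}))      # -> tumor
```

    Keeping the model declarative like this is what makes the architecture extensible: adding tumors or brain vessels means adding entries, not rewriting the matcher.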

  18. Earthquake relocation using a 3D a-priori geological velocity model from the western Alps to Corsica: Implication for seismic hazard

    NASA Astrophysics Data System (ADS)

    Béthoux, Nicole; Theunissen, Thomas; Beslier, Marie-Odile; Font, Yvonne; Thouvenot, François; Dessa, Jean-Xavier; Simon, Soazig; Courrioux, Gabriel; Guillen, Antonio

    2016-02-01

    The region between the inner zones of the Alps and Corsica juxtaposes an overthickened crust with an oceanic domain, which makes it difficult to ascertain the focal depth of seismic events using routine location codes and average 1D velocity models. The aim of this article is to show that, even with a rather loose monitoring network, accurate routine locations can be achieved by using realistic 3D modelling and advanced location techniques. Previous earthquake tomography studies cover the whole region with spatial resolutions of several tens of kilometres on land, but they fail to resolve the marine domain due to the absence of station coverage and sparse seismicity. To overcome these limitations, we first construct a 3D a-priori P and S velocity model integrating known geophysical and geological information. Significant progress has been achieved in the 3D numerical modelling of complex geological structures through the development of dedicated software (e.g. 3D GeoModeller), capable at once of elaborating a 3D structural model from geological and geophysical constraints and, possibly, of refining it by inversion processes (Calcagno et al., 2008). Then, we build an arrival-time catalogue of 1500 events recorded from 2000 to 2011. Hypocentres are then located in this model using a numerical code based on the maximum intersection method (Font et al., 2004), updated by Theunissen et al. (2012), as well as another 3D location technique, the NonLinLoc software (Lomax and Curtis, 2001). The reduction of arrival-time residuals and uncertainties (dh, dz) with respect to classical 1D locations demonstrates the improved accuracy allowed by our approach and confirms the coherence of the 3D geological model built and used in this study. Our results are also compared with previous works that benefitted from the installation of dense temporary networks surrounding the studied epicentre area. The resulting 3D location catalogue allows us to improve the regional seismic hazard assessment.
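
    Stripped of the 3D velocity model and the maximum intersection machinery, event location reduces to searching for the hypocentre that best explains observed arrival times. A heavily simplified sketch with a uniform toy velocity and a coarse grid search (all values invented):

```python
import numpy as np

v = 6.0  # km/s, uniform toy velocity (the study uses a 3D a-priori model)
stations = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [50, 50, 0]], float)
true_src, t0 = np.array([20.0, 30.0, 12.0]), 5.0  # km, s (synthetic "truth")
t_obs = t0 + np.linalg.norm(stations - true_src, axis=1) / v

# Grid search: the unknown origin time drops out by demeaning the residuals
best = None
for xg in np.arange(0.0, 51.0, 2.0):
    for yg in np.arange(0.0, 51.0, 2.0):
        for zg in np.arange(0.0, 21.0, 2.0):
            tt = np.linalg.norm(stations - np.array([xg, yg, zg]), axis=1) / v
            res = t_obs - tt
            rms = float(np.sqrt(np.mean((res - res.mean()) ** 2)))
            if best is None or rms < best[0]:
                best = (rms, xg, yg, zg)
print("located at", best[1:], "with RMS", best[0])
```

    In the real problem the travel times tt come from the 3D a-priori model, which is precisely where accurate focal depths are won or lost.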

  19. A priori performance prediction in pharmaceutical wet granulation: testing the applicability of the nucleation regime map to a formulation with a broad size distribution and dry binder addition.

    PubMed

    Kayrak-Talay, Defne; Litster, James D

    2011-10-14

    In this study, Hapgood's nucleation regime map (Hapgood et al., 2003) was tested for a formulation that consists of an active pharmaceutical ingredient (API) of broad size distribution and a fine dry binder. Gabapentin was used as the API and hydroxypropyl cellulose (HPC) as the dry binder with deionized water as the liquid binder. The formulation was granulated in a 6 L Diosna high-shear granulator. The effect of liquid addition method (spray, dripping), liquid addition rate (29-245 g/min), total liquid content (2, 4 and 10%), and impeller speed (250 and 500 rpm) on the granule size distribution and lump formation was investigated. Standard methods were successfully used to characterize the process parameters (spray drop size, spray geometry and powder surface velocity) for calculating the dimensionless spray flux. However, the addition of dry binder had a very strong effect on drop penetration time that could not be predicted from simple capillary flow considerations. This is most likely due to preferential liquid penetration into the fine pores related to the dry binder particles and subsequent partial softening and dissolution of the binder. For systems containing a dry binder or other amorphous powders, it is recommended that drop penetration time be measured directly for the blended formulation and then scaled to the drop size during spraying. Using these approaches to characterize the key dimensionless groups (dimensionless spray flux and drop penetration time), Hapgood's nucleation regime map was successfully used to predict a priori the effect of process conditions on the quality of the granule size distribution, as measured by lump formation and the span of the size distribution, both before and after wet massing for the range of conditions studied. Wider granule size distributions and a higher amount of lumps were obtained when moving from the intermediate to the mechanical dispersion regime. Addition of the liquid in the dripping mode gave the broadest size distribution
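
    The dimensionless spray flux at the heart of the regime map is Psi_a = 3*V_dot / (2*A_dot*d_d), with V_dot the liquid flow rate, A_dot the area flux of powder through the spray zone, and d_d the drop diameter (Hapgood et al., 2003). A short sketch with invented operating values:

```python
def spray_flux(v_dot, a_dot, d_drop):
    """Dimensionless spray flux: psi_a = 3 * V_dot / (2 * A_dot * d_drop)."""
    return 3.0 * v_dot / (2.0 * a_dot * d_drop)

# Hypothetical operating point: ~100 g/min water (1.67e-6 m^3/s), powder
# surface moving at 1 m/s under a 0.1 m wide spray band (A_dot = 0.1 m^2/s),
# 150-micron drops.
psi_a = spray_flux(1.67e-6, 0.1, 150e-6)
print(f"psi_a = {psi_a:.3f}")
# Drop-controlled nucleation is expected for psi_a well below ~0.1, provided
# the dimensionless drop penetration time is also small; larger psi_a pushes
# the system toward the intermediate and mechanical dispersion regimes.
```

    The study's point is that the second coordinate of the map, the drop penetration time, must be measured on the blended formulation when a dry binder is present, since it cannot be computed from capillary theory alone.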

  20. 3-D multiobservable probabilistic inversion for the compositional and thermal structure of the lithosphere and upper mantle. I: a priori petrological information and geophysical observables

    NASA Astrophysics Data System (ADS)

    Afonso, J. C.; Fullea, J.; Griffin, W. L.; Yang, Y.; Jones, A. G.; Connolly, J. A. D.; O'Reilly, S. Y.

    2013-05-01

    Traditional inversion techniques applied to the problem of characterizing the thermal and compositional structure of the upper mantle are not well suited to deal with the nonlinearity of the problem, the trade-off between temperature and compositional effects on wave velocities, the nonuniqueness of the compositional space, and the dissimilar sensitivities of physical parameters to temperature and composition. Probabilistic inversions, on the other hand, offer a powerful formalism to cope with all these difficulties, while allowing for an adequate treatment of the intrinsic uncertainties associated with both data and physical theories. This paper presents a detailed analysis of the two most important elements controlling the outputs of probabilistic (Bayesian) inversions for temperature and composition of the Earth's mantle, namely the a priori information on model parameters, ρ(m), and the likelihood function, L(m). The former is mainly controlled by our current understanding of lithosphere and mantle composition, while the latter conveys information on the observed data, their uncertainties, and the physical theories used to relate model parameters to observed data. The benefits of combining specific geophysical datasets (Rayleigh and Love dispersion curves, body wave tomography, magnetotelluric, geothermal, petrological, gravity, elevation, and geoid), and their effects on L(m), are demonstrated by analyzing their individual and combined sensitivities to composition and temperature as well as their observational uncertainties. The dependence of bulk density, electrical conductivity, and seismic velocities to major-element composition is systematically explored using Monte Carlo simulations. We show that the dominant source of uncertainty in the identification of compositional anomalies within the lithosphere is the intrinsic nonuniqueness in compositional space. A general strategy for defining ρ(m) is proposed based on statistical analyses of a large database
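
    In the Bayesian formalism described above, the posterior is proportional to ρ(m)·L(m) and can be explored by sampling. A toy one-parameter sketch (the prior width, the sensitivity of the forward model, and the data values are all invented, and a simple Metropolis sampler stands in for the paper's inversion machinery):

```python
import numpy as np

def log_prior(m):
    """rho(m): Gaussian prior on a mantle temperature anomaly (K)."""
    return -0.5 * (m / 100.0) ** 2

def forward(m):
    """Toy physical theory: shear-velocity perturbation (%) vs temperature."""
    return -0.005 * m  # hypothetical sensitivity, -0.5% per 100 K

def log_likelihood(m, d_obs, sigma):
    """L(m): Gaussian data misfit under the forward theory."""
    return -0.5 * ((forward(m) - d_obs) / sigma) ** 2

# Metropolis sampling of the posterior rho(m) * L(m)
rng = np.random.default_rng(1)
d_obs, sigma = -0.8, 0.2   # observed -0.8% velocity anomaly, 0.2% uncertainty
m, samples = 0.0, []
log_p = log_prior(m) + log_likelihood(m, d_obs, sigma)
for _ in range(20000):
    m_new = m + 20.0 * rng.standard_normal()
    log_p_new = log_prior(m_new) + log_likelihood(m_new, d_obs, sigma)
    if np.log(rng.random()) < log_p_new - log_p:   # accept/reject step
        m, log_p = m_new, log_p_new
    samples.append(m)
post = np.array(samples[5000:])                    # discard burn-in
print(f"posterior mean T anomaly: {post.mean():.0f} K")
```

    The posterior mean lands between the prior (centered on 0 K) and the value implied by the data alone (160 K here), weighted by their respective uncertainties, which is the trade-off the paper analyzes in many dimensions.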

  1. An approach for the long-term 30-m land surface snow-free albedo retrieval from historic Landsat surface reflectance and MODIS-based a priori anisotropy knowledge

    USDA-ARS?s Scientific Manuscript database

    Land surface albedo has been recognized by the Global Terrestrial Observing System (GTOS) as an essential climate variable crucial for accurate modeling and monitoring of the Earth’s radiative budget. While global climate studies can leverage albedo datasets from MODIS, VIIRS, and other coarse-reso...

  2. Overview of Knowledge Management.

    ERIC Educational Resources Information Center

    Serban, Andreea M.; Luan, Jing

    2002-01-01

    Defines knowledge management, its components, processes, and outcomes. Addresses the importance of knowledge management for higher education in general and for institutional research in particular. (EV)

  3. Knowledge Repository for Fmea Related Knowledge

    NASA Astrophysics Data System (ADS)

    Cândea, Gabriela Simona; Kifor, Claudiu Vasile; Cândea, Ciprian

    2014-11-01

    This paper presents an innovative use of a knowledge system in the Failure Mode and Effects Analysis (FMEA) process, using an ontology to represent the knowledge. The knowledge system is built to serve the multi-project work that is nowadays in place in any manufacturing or service provider, where knowledge must be retained and reused at the company level and not only at the project level. The system follows the FMEA methodology, and the validation of the concept is compliant with the automotive industry standards published by the Automotive Industry Action Group, among others. Collaboration is assured through a web-based GUI that supports multiple-user access at any time.

  4. Knowledge Engineering and Education.

    ERIC Educational Resources Information Center

    Lopez, Antonio M., Jr.; Donlon, James

    2001-01-01

    Discusses knowledge engineering, computer software, and possible applications in the field of education. Highlights include the distinctions between data, information, and knowledge; knowledge engineering as a subfield of artificial intelligence; knowledge acquisition; data mining; ontology development for subject terms; cognitive apprentices; and…

  5. Knowledge and Its Enemies

    ERIC Educational Resources Information Center

    Kruk, Miroslav

    2007-01-01

    As libraries are the physical manifestations of knowledge, some reflection about the concept of knowledge would not be unjustified. In modern societies, knowledge plays such a central role that it requires some effort and imagination to understand on what grounds knowledge could be rejected. Karl Popper wrote about the open society and its enemies.…

  7. Teacher Knowledge-Ability and Pupil Achievement.

    ERIC Educational Resources Information Center

    Tanner, Daniel; Celso, Nicholas

    The effectiveness of schools and the levels of investment in schooling have been in question since the 1966 Coleman report "Equality of Educational Opportunity." Based on a theory of "knowledge-ability," this study challenges the assumption that given "inputs" will yield equivalent effects or "outputs." In…

  9. Why Explicit Knowledge Cannot Become Implicit Knowledge

    ERIC Educational Resources Information Center

    VanPatten, Bill

    2016-01-01

    In this essay, I review one of the conclusions in Lindseth (2016) published in "Foreign Language Annals." That conclusion suggests that explicit learning and practice (what she called form-focused instruction) somehow help the development of implicit knowledge (or might even become implicit knowledge). I argue for a different…

  11. Transformation through Knowledge--Knowledge through Transformation.

    ERIC Educational Resources Information Center

    Cadena, Felix

    1991-01-01

    Defines systematization as the process of creating critical knowledge (conscientization), a form of transformative research. Explains how systematization contributes to popular education and presents the four components of the process: identifying the limits of research, obtaining data, interpretation, and socialization. (SK)

  12. Documentation and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel; Moseley, Warren

    1990-01-01

    Traditional approaches to knowledge acquisition have focused on interviews. An alternative focuses on the documentation associated with a domain. Adopting a documentation approach provides some advantages during familiarization. A knowledge management tool was constructed to gain these advantages.

  13. Introduction to knowledge base

    SciTech Connect

    Ohsuga, S.

    1986-01-01

    This work provides a broad range of easy-to-understand information on basic knowledge base concepts and basic element technology for the building of a knowledge base system. It also discusses various languages and networks for development of knowledge base systems. It describes applications of knowledge base utilization methodology and prospects for the future in such areas as pattern recognition, natural language processing, expert systems, and CAD/CAM.

  14. Knowledge, People, and Risk

    NASA Technical Reports Server (NTRS)

    Rogers, Edward W.

    2008-01-01

    NASA's mandate is to take risks to go into space while applying its best knowledge. NASA's knowledge is the result of scientific insights from research, engineering wisdom from experience, project management skills, safety and team consciousness, and institutional support and collaboration. This presentation highlights NASA's organizational knowledge, communication, and growth efforts.

  15. Knowledge Discovery in Databases.

    ERIC Educational Resources Information Center

    Norton, M. Jay

    1999-01-01

    Knowledge discovery in databases (KDD) revolves around the investigation and creation of knowledge, processes, algorithms, and mechanisms for retrieving knowledge from data collections. The article is an introductory overview of KDD. The rationale and environment of its development and applications are discussed. Issues related to database design…

  16. Activating Event Knowledge

    ERIC Educational Resources Information Center

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or…

  17. Seafarers Knowledge Inventory.

    ERIC Educational Resources Information Center

    Hounshell, Paul B.

    This 60-item, multiple-choice Seafarers Knowledge Inventory was developed for use in marine vocational classes (grades 9-12) to measure a student's knowledge of information that "seafarers" should know. Items measure knowledge of various aspects of boating operation, weather, safety, winds, and oceanography. Steps in the construction of…

  19. Relationships between Knowledge(s): Implications for "Knowledge Integration"

    ERIC Educational Resources Information Center

    Evering, Brigitte

    2012-01-01

    This article contributes to a critical dialogue about what is currently called "knowledge integration" in environmental research and related educational programming. Indigenous understandings in particular are seen as offering (re)new(ed) ways of thinking that have and will lead to innovative practices for addressing complex environmental issues.…

  20. A priori and a posteriori dietary patterns at the age of 1 year and body composition at the age of 6 years: the Generation R Study.

    PubMed

    Voortman, Trudy; Leermakers, Elisabeth T M; Franco, Oscar H; Jaddoe, Vincent W V; Moll, Henriette A; Hofman, Albert; van den Hooven, Edith H; Kiefte-de Jong, Jessica C

    2016-08-01

    Dietary patterns have been linked to obesity in adults; however, not much is known about this association in early childhood. We examined associations of different types of dietary patterns in 1-year-old children with body composition at school age in 2026 children participating in a population-based cohort study. Dietary intake at the age of 1 year was assessed with a food-frequency questionnaire. At the children's age of 6 years we measured their body composition with dual-energy X-ray absorptiometry and we calculated body mass index, fat mass index (FMI), and fat-free mass index (FFMI). Three dietary pattern approaches were used: (1) an a priori-defined diet quality score; (2) dietary patterns based on variation in food intake, derived from principal-component-analysis (PCA); and (3) dietary patterns based on variations in FMI and FFMI, derived with reduced-rank-regression (RRR). Both the a priori-defined diet score and a 'Health-conscious' PCA-pattern were characterized by a high intake of fruit, vegetables, grains, and vegetable oils, and, after adjustment for confounders, children with higher adherence to these patterns had a higher FFMI at 6 years [0.19 SD (95 % CI 0.08;0.30) per SD increase in diet score], but did not differ in FMI. One of the two RRR-patterns was also positively associated with FFMI and was characterized by intake of whole grains, pasta and rice, and vegetable oils. Our results suggest that different a priori- and a posteriori-derived health-conscious dietary patterns in early childhood are associated with a higher fat-free mass, but not with fat mass, in later childhood.

  1. Induction as Knowledge Integration

    NASA Technical Reports Server (NTRS)

    Smith, Benjamin D.; Rosenbloom, Paul S.

    1996-01-01

    Two key issues for induction algorithms are the accuracy of the learned hypothesis and the computational resources consumed in inducing that hypothesis. One of the most promising ways to improve performance along both dimensions is to make use of additional knowledge. Multi-strategy learning algorithms tackle this problem by employing several strategies for handling different kinds of knowledge in different ways. However, integrating knowledge into an induction algorithm can be difficult when the new knowledge differs significantly from the knowledge the algorithm already uses. In many cases the algorithm must be rewritten. This paper presents the Knowledge Integration framework for Induction (KII), which provides a uniform mechanism for integrating knowledge into induction. In theory, arbitrary knowledge can be integrated with this mechanism, but in practice the knowledge representation language determines both the knowledge that can be integrated, and the costs of integration and induction. By instantiating KII with various set representations, algorithms can be generated at different trade-off points along these dimensions. One instantiation of KII, called RS-KII, is presented that can implement hybrid induction algorithms, depending on which knowledge it utilizes. RS-KII is demonstrated to implement AQ-11, as well as a hybrid algorithm that utilizes a domain theory and noisy examples. Other algorithms are also possible.

  2. Knowledge and luck.

    PubMed

    Turri, John; Buckwalter, Wesley; Blouw, Peter

    2015-04-01

    Nearly all success is due to some mix of ability and luck. But some successes we attribute to the agent's ability, whereas others we attribute to luck. To better understand the criteria distinguishing credit from luck, we conducted a series of four studies on knowledge attributions. Knowledge is an achievement that involves reaching the truth. But many factors affecting the truth are beyond our control, and reaching the truth is often partly due to luck. Which sorts of luck are compatible with knowledge? We found that knowledge attributions are highly sensitive to lucky events that change the explanation for why a belief is true. By contrast, knowledge attributions are surprisingly insensitive to lucky events that threaten, but ultimately fail to change the explanation for why a belief is true. These results shed light on our concept of knowledge, help explain apparent inconsistencies in prior work on knowledge attributions, and constitute progress toward a general understanding of the relation between success and luck.

  3. Knowledge-based goal-driven approach for information extraction and decision making for target recognition

    NASA Astrophysics Data System (ADS)

    Wilson, Roderick D.; Wilson, Anitra C.

    1996-06-01

    This paper presents a novel goal-driven approach for designing a knowledge-based system for information extraction and decision-making for target recognition. The underlying goal-driven model uses a goal frame tree schema for target organization, a hybrid rule-based pattern-directed formalism for target structural encoding, and a goal-driven inferential control strategy. The knowledge base consists of three basic structures for the organization and control of target information: goals, target parameters, and an object-rulebase. Goal frames represent target recognition tasks as goals and subgoals in the knowledge base. Target parameters represent characteristic attributes of targets that are encoded as information atoms. Information atoms may have one or more assigned values and are used for information extraction. The object-rulebase consists of pattern/action assertional implications that describe the logical relationships existing between target parameter values. A goal realization process formulates symbolic pattern expressions whose atomic values map to target parameters contained a priori in a hierarchical database of target state information. Symbolic pattern expression creation is accomplished via the application of a novel goal-driven inference strategy that logically prunes an AND/OR tree constructed from the object-rulebase. Similarity analysis is performed via pattern matching of query symbolic patterns and a priori instantiated target parameters.
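
    The goal-driven pruning of an AND/OR tree that the abstract describes can be illustrated with a minimal sketch. This is not the authors' system; the node structure and fact names are hypothetical. Goals succeed when all (AND) or any (OR) of their subgoals succeed, and short-circuit evaluation logically prunes branches that cannot change the outcome:

    ```python
    # Minimal sketch of goal-driven AND/OR tree evaluation (hypothetical
    # structure, not the system from the abstract). A node is a tuple:
    #   ("atom", name)        -- satisfied if name is an established fact
    #   ("and", [children])   -- satisfied if every child is satisfied
    #   ("or",  [children])   -- satisfied if at least one child is satisfied

    def satisfied(node, facts):
        """Evaluate a goal node against a set of known facts.

        all()/any() short-circuit, so subtrees whose evaluation cannot
        affect the result are pruned without being visited.
        """
        kind, payload = node
        if kind == "atom":
            return payload in facts
        if kind == "and":
            return all(satisfied(child, facts) for child in payload)
        if kind == "or":
            return any(satisfied(child, facts) for child in payload)
        raise ValueError(f"unknown node kind: {kind}")

    # Toy recognition goal: silhouette must match, plus either the
    # thermal signature or the radar return.
    goal = ("and", [
        ("atom", "silhouette"),
        ("or", [("atom", "thermal"), ("atom", "radar")]),
    ])

    print(satisfied(goal, {"silhouette", "radar"}))  # True
    print(satisfied(goal, {"thermal"}))              # False
    ```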

  4. Knowledge Interaction Design for Creative Knowledge Work

    NASA Astrophysics Data System (ADS)

    Nakakoji, Kumiyo; Yamamoto, Yasuhiro

    This paper describes our approach for the development of application systems for creative knowledge work, particularly for early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design described in this paper is a framework where a set of application systems for different information design domains are developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through graceful user experience.

  5. [Knowledge production: a dialogue among different knowledge].

    PubMed

    Erdmann, Alacoque Lorenzini; Schlindwein, Betina Hömer; de Sousa, Francisca Georgina Macedo

    2006-01-01

    This text approaches the necessary dialogue among different kinds of knowledge and considers the advances within Nursing in the search for consistence and clarity within the Nursing discipline. Towards this end, the text is based upon transdisciplinarity, intersectorality, complexity, and the interaction of different pairs in health and other areas, as well as the sustenance of scientific and technological space within Nursing. It argues for perspectives that open possibilities for scientific and technological knowledge construction within a more responsible and mutual social commitment. The purpose of the paper is to amplify the aptitude to contextualize and globalize different kinds of knowledge, as well as transcend differences and peculiarities within the perspective of more qualitative policies which may overcome disciplinary barriers.

  6. Generating a mortality model from a pediatric ICU (PICU) database utilizing knowledge discovery.

    PubMed Central

    Kennedy, Curtis E.; Aoki, Noriaki

    2002-01-01

    Current models for predicting outcomes are limited by biases inherent in a priori hypothesis generation. Knowledge discovery algorithms generate models directly from databases, minimizing such limitations. Our objective was to generate a mortality model from a PICU database utilizing knowledge discovery techniques. The database contained 5067 records with 192 clinically relevant variables. It was randomly split into training (75%) and validation (25%) groups. We used decision tree induction to generate a mortality model from the training data, and validated its performance on the validation data. The original PRISM algorithm was used for comparison. The decision tree model contained 25 variables and predicted 53/88 deaths; 29 correctly (Sens:33%, Spec:98%, PPV:54%). PRISM predicted 27/88 deaths correctly (Sens:30%, Spec:98%, PPV:51%). Performance difference between models was not significant. We conclude that knowledge discovery algorithms can generate a mortality model from a PICU database, helping establish validity of such tools in the clinical medical domain. PMID:12463850
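
    The reported performance figures can be rechecked from the confusion-matrix counts given in the abstract. In this sketch the validation-set size (25% of 5067 records) is an inference from the stated split, not a figure the abstract reports per group:

    ```python
    # Recomputing the decision-tree metrics from counts in the abstract:
    # 53 deaths predicted, 29 correctly, out of 88 actual deaths.
    # Assumption: the validation set is 25% of the 5067 records.
    n_validation = round(5067 * 0.25)        # ~1267 records
    tp = 29                                  # correctly predicted deaths
    fp = 53 - tp                             # predicted deaths who survived
    fn = 88 - tp                             # actual deaths that were missed
    tn = n_validation - tp - fp - fn         # correctly predicted survivors

    sensitivity = tp / (tp + fn)             # ~0.33
    specificity = tn / (tn + fp)             # ~0.98
    ppv = tp / (tp + fp)                     # ~0.55

    print(f"Sens: {sensitivity:.0%}  Spec: {specificity:.0%}  PPV: {ppv:.0%}")
    ```

    Sensitivity and specificity reproduce the reported 33% and 98%; PPV comes out at 29/53 ≈ 54.7%, consistent with the reported 54% up to rounding.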

  7. A priori and a posteriori approaches for finding genes of evolutionary interest in non-model species: osmoregulatory genes in the kidney transcriptome of the desert rodent Dipodomys spectabilis (banner-tailed kangaroo rat).

    PubMed

    Marra, Nicholas J; Eo, Soo Hyung; Hale, Matthew C; Waser, Peter M; DeWoody, J Andrew

    2012-12-01

    One common goal in evolutionary biology is the identification of genes underlying adaptive traits of evolutionary interest. Recently next-generation sequencing techniques have greatly facilitated such evolutionary studies in species otherwise depauperate of genomic resources. Kangaroo rats (Dipodomys sp.) serve as exemplars of adaptation in that they inhabit extremely arid environments, yet require no drinking water because of ultra-efficient kidney function and osmoregulation. As a basis for identifying water conservation genes in kangaroo rats, we conducted a priori bioinformatics searches in model rodents (Mus musculus and Rattus norvegicus) to identify candidate genes with known or suspected osmoregulatory function. We then obtained 446,758 reads via 454 pyrosequencing to characterize genes expressed in the kidney of banner-tailed kangaroo rats (Dipodomys spectabilis). We also determined candidates a posteriori by identifying genes that were overexpressed in the kidney. The kangaroo rat sequences revealed nine different a priori candidate genes predicted from our Mus and Rattus searches, as well as 32 a posteriori candidate genes that were overexpressed in kidney. Mutations in two of these genes, Slc12a1 and Slc12a3, cause human renal diseases that result in the inability to concentrate urine. These genes are likely key determinants of physiological water conservation in desert rodents. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Teacher Knowledge/Ability and Pupil Achievement.

    ERIC Educational Resources Information Center

    Tanner, Daniel; Celso, Nicholas

    1982-01-01

    Studies of the relationship between increased spending for education and student achievement have failed to take into account teacher knowledge and ability. The authors' study showed wide variation within schools and concluded that research on the effects of schooling is weakened by concentrating on entire schools and school systems. (Author/WD)

  9. Knowledge to Manage the Knowledge Society

    ERIC Educational Resources Information Center

    Minati, Gianfranco

    2012-01-01

    Purpose: The purpose of this research is to make evident the inadequateness of concepts and language based on industrial knowledge still used in current practices by managers to cope with problems of the post-industrial societies characterised by non-linear process of emergence and acquisition of properties. The purpose is to allow management to…

  11. Indigenous Knowledge within a Global Knowledge System

    ERIC Educational Resources Information Center

    Durie, Mason

    2005-01-01

    Faced with globalizing forces that promote universal approaches to knowledge and understanding, indigenous peoples have reacted by abandoning the old ways or alternately seeking to re-discover ancient wisdoms as foundations for pathways to the future. Increasingly, however, a third way has been to focus on the interface between indigenous…

  12. Reasoning about procedural knowledge

    NASA Technical Reports Server (NTRS)

    Georgeff, M. P.

    1985-01-01

    A crucial aspect of automated reasoning about space operations is that knowledge of the problem domain is often procedural in nature - that is, the knowledge is often in the form of sequences of actions or procedures for achieving given goals or reacting to certain situations. In this paper a system is described that explicitly represents and reasons about procedural knowledge. The knowledge representation used is sufficiently rich to describe the effects of arbitrary sequences of tests and actions, and the inference mechanism provides a means for directly using this knowledge to reach desired operational goals. Furthermore, the representation has a declarative semantics that provides for incremental changes to the system, rich explanatory capabilities, and verifiability. The approach also provides a mechanism for reasoning about the use of this knowledge, thus enabling the system to choose effectively between alternative courses of action.

  13. Knowledge and Distributed computation

    DTIC Science & Technology

    1990-05-01

    convincing evidence that reasoning in terms of knowledge can lead to ... results about distributed computation, and we extend the standard... can be made precise in the context of computer science. In this thesis, we provide convincing evidence that reasoning in terms of knowledge can lead ... against different adversaries. We show how different adversaries lead to different definitions of probabilistic knowledge, and given a particular adversary

  14. Knowledge Resources or Decisions?

    DTIC Science & Technology

    1991-08-01

    representation as grammatical resources proposed within Systemic Functional Linguistics (SFL) (Halliday, 1978) ... In this paper, we argue that all the linguistic knowledge necessary to construct a text should be declaratively represented as resources available to a text planner. What we therefore aim at is to draw a clear borderline between linguistic knowledge to be encoded declaratively and

  15. Recording Scientific Knowledge

    SciTech Connect

    Bowker, Geof

    2006-01-09

    The way we record knowledge, and the web of technical, formal, and social practices that surrounds it, inevitably affects the knowledge that we record. The ways we hold knowledge about the past - in handwritten manuscripts, in printed books, in file folders, in databases - shape the kind of stories we tell about that past. In this talk, I look at how over the past two hundred years, information technology has affected the nature and production of scientific knowledge. Further, I explore ways in which the emergent new cyberinfrastructure is changing our relationship to scientific practice.

  16. Interactive knowledge acquisition tools

    NASA Technical Reports Server (NTRS)

    Dudziak, Martin J.; Feinstein, Jerald L.

    1987-01-01

    The problems of designing practical tools to aid the knowledge engineer and general applications used in performing knowledge acquisition tasks are discussed. A particular approach was developed for the class of knowledge acquisition problem characterized by situations where acquisition and transformation of domain expertise are often bottlenecks in systems development. An explanation is given on how the tool and underlying software engineering principles can be extended to provide a flexible set of tools that allow the application specialist to build highly customized knowledge-based applications.

  18. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).

  19. Knowledge-based vision and simple visual machines.

    PubMed Central

    Cliff, D; Noble, J

    1997-01-01

    The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684

  20. Public Knowledge Cultures

    ERIC Educational Resources Information Center

    Peters, Michael A.; Besley, A. C.

    2006-01-01

    This article first reviews claims for the knowledge economy in terms of excludability, rivalry, and transparency indicating the way that digital goods behave differently from other commodities. In the second section it discusses the theory of "public knowledge cultures" starting from the primacy of practice based on Marx, Wittgenstein and…

  1. How Knowledge Powers Reading

    ERIC Educational Resources Information Center

    Lemov, Doug

    2017-01-01

    Recent research shows that reading comprehension relies heavily on prior knowledge. Far more than generic "reading skills" like drawing inferences, making predictions, and knowing the function of subheads, how well students learn from a nonfiction text depends on their background knowledge of the text's subject matter. And in a cyclical…

  2. Children's Knowledge about Medicines.

    ERIC Educational Resources Information Center

    Almarsdottir, Anna B.; Zimmer, Catherine

    1998-01-01

    Examined knowledge about medicines and perceived benefit among 101 children, ages 7 and 10. Found that medicine knowledge was explained using age, educational environment, and degree of internal locus of control as significant predictors. The negative effect of internal locus of control predicted perceived benefit. Retention of drug advertising…

  3. Translating Facts into Knowledge

    ERIC Educational Resources Information Center

    Umewaka, Soraya

    2011-01-01

    Many education systems have a tendency to be limiting and rigid. These systems teach children to value facts over knowledge and routine and repetition over playfulness and curiosity to seek knowledge. How can we unleash our children's imagination and permit them to use play and other creative tools as a means of learning? This article proposes new…

  4. Cooperative Knowledge Bases.

    DTIC Science & Technology

    1988-02-01

    intelligent knowledge bases. The present state of our system for concurrent evaluation of a knowledge base of logic clauses using static allocation...de Kleer, J., An assumption-based TMS, Artificial Intelligence, Vol. 28, No. 2, 1986. [Doyle 79] Doyle, J., A truth maintenance system, Artificial

  5. Educating the Knowledge Worker.

    ERIC Educational Resources Information Center

    Leddick, Susan; Gharajedaghi, Jamshid

    2001-01-01

    In the new economy, knowledge (not labor, raw material, or capital) is the key resource to be converted to goods and services. Public schools will have to educate three tiers of knowledge workers (doers, problem solvers, and designers) using differentiated assessment, curricula, and instruction. Organizational action, not mantras, is needed. (MLH)

  7. Is Knowledge Like Love?

    ERIC Educational Resources Information Center

    Saussois, Jean-Michel

    2014-01-01

    The label "knowledge management" is a source of ambiguity within the education community. In fact, the role played by knowledge within economics has an impact on the education sector, especially on the nature of the teacher's job. Compared to other sectors such as engineering and health, research and development investment is still weak.…

  8. Essays on Knowledge Management

    ERIC Educational Resources Information Center

    Xiao, Wenli

    2012-01-01

    For many firms, particularly those operating in high technology and competitive markets, knowledge is cited as the most important strategic asset to the firm, which significantly drives its survival and success (Grant 1996, Webber 1993). Knowledge management (KM) impacts the firm's ability to develop process features that reduce manufacturing…

  10. Knowledge and resilience.

    PubMed

    Lau, Joe Yen-fong

    2015-01-01

    Kalisch et al. regard a positive appraisal style as the mechanism for promoting resilience. I argue that knowledge can enhance resilience without affecting appraisal style. Furthermore, the relationship between positive appraisals and resilience ought to be mediated by knowledge and is not monotonic. Finally, I raise some questions about how appraisals fit into the dual-process model of the mind.

  11. Interest and Prior Knowledge.

    ERIC Educational Resources Information Center

    Tobias, Sigmund

    This paper selectively reviews research on the relationship between topic interest and prior knowledge, and discusses the optimal association between these variables. The paper points out that interest has a facilitating impact on learning, and at least part of this effect must be ascribed to prior knowledge. While the interest-knowledge…

  12. Organizational Knowledge Management Structure

    ERIC Educational Resources Information Center

    Walczak, Steven

    2005-01-01

    Purpose: To propose and evaluate a novel management structure that encourages knowledge sharing across an organization. Design/methodology/approach: The extant literature on the impact of organizational culture and its link to management structure is examined and used to develop a new knowledge sharing management structure. Roadblocks to…

  15. Building Background Knowledge

    ERIC Educational Resources Information Center

    Fisher, Douglas; Ross, Donna; Grant, Maria

    2010-01-01

    Too often, students enter our classrooms with insufficient knowledge of physical science. As a result, they have a difficult time understanding content in texts, lectures, and laboratory activities. This lack of background knowledge can have an impact on their ability to ask questions and wonder--both key components of inquiry. In this article,…

  16. The Knowledge Race.

    ERIC Educational Resources Information Center

    Krell, Eric

    2001-01-01

    Shorter learning cycles for workers are a strategic advantage for most companies. Companies that complement product cycles with knowledge often employ four strategies: (1) early involvement in training, (2) conducive organizational structure, (3) innovative knowledge delivery, and (4) breadth of content. (JOW)

  17. Reuniting Virtue and Knowledge

    ERIC Educational Resources Information Center

    Culham, Tom

    2015-01-01

    Einstein held that intuition is more important than rational inquiry as a source of discovery. Further, he explicitly and implicitly linked the heart, the sacred, devotion and intuitive knowledge. The raison d'être of universities is the advance of knowledge; however, they have primarily focused on developing students' skills in working with…

  18. Knowledge Production and Utilization.

    ERIC Educational Resources Information Center

    Beal, George M.; Meehan, Peter

    The study of knowledge production and utilization deals with the problems of how to translate theoretical concepts and knowledge into practical solutions to people's problems. Many criticisms have been leveled at aspects of previous information system models, including their linearity, one-way communication paths, overdependence on scientific…

  19. Adding Confidence to Knowledge

    ERIC Educational Resources Information Center

    Goodson, Ludwika Aniela; Slater, Don; Zubovic, Yvonne

    2015-01-01

    A "knowledge survey" and a formative evaluation process led to major changes in an instructor's course and teaching methods over a 5-year period. Design of the survey incorporated several innovations, including: a) using "confidence survey" rather than "knowledge survey" as the title; b) completing an instructional…

  20. Facilitating Naval Knowledge Flow

    DTIC Science & Technology

    2001-07-01

    Transition ...................................................................... 9 Figure 5 Nonaka Knowledge Flow Theory...terms of the dimension reach above. [Figure 5 axes: Epistemological (Explicit, Tacit) and Ontological (Individual, Group, Organization, Inter-organization).] Figure 5 Nonaka Knowledge Flow Theory (Adapted from [48]) As depicted in Figure 5, Nonaka views the interaction between these dimensions as the principal drivers of

  1. Reuniting Virtue and Knowledge

    ERIC Educational Resources Information Center

    Culham, Tom

    2015-01-01

    Einstein held that intuition is more important than rational inquiry as a source of discovery. Further, he explicitly and implicitly linked the heart, the sacred, devotion and intuitive knowledge. The raison d'être of universities is the advance of knowledge; however, they have primarily focused on developing students' skills in working with…

  2. Marine Education Knowledge Inventory.

    ERIC Educational Resources Information Center

    Hounshell, Paul B.; Hampton, Carolyn

    This 35-item, multiple-choice Marine Education Knowledge Inventory was developed for use in upper elementary/middle schools to measure a student's knowledge of marine science. Content of test items is drawn from oceanography, ecology, earth science, navigation, and the biological sciences (focusing on marine animals). Steps in the construction of…

  3. Knowledge representation for commonality

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1990-01-01

    Domain-specific knowledge necessary for commonality analysis falls into two general classes: commonality constraints and costing information. Notations for encoding such knowledge should be powerful and flexible and should appeal to the domain expert. The notations employed by the Commonality Analysis Problem Solver (CAPS) analysis tool are described. Examples are given to illustrate the main concepts.

  4. Pedagogical Content Knowledge Taxonomies.

    ERIC Educational Resources Information Center

    Veal, William R.; MaKinster, James G.

    1999-01-01

    Presents two taxonomies that offer a relatively comprehensive categorization scheme for future studies of pedagogical content knowledge (PCK) development in teacher education. "The General Taxonomy of PCK" addresses distinctions within and between the knowledge bases of various disciplines, science subjects, and science topics. "The Taxonomy of…

  5. Building Background Knowledge

    ERIC Educational Resources Information Center

    Fisher, Douglas; Ross, Donna; Grant, Maria

    2010-01-01

    Too often, students enter our classrooms with insufficient knowledge of physical science. As a result, they have a difficult time understanding content in texts, lectures, and laboratory activities. This lack of background knowledge can have an impact on their ability to ask questions and wonder--both key components of inquiry. In this article,…

  6. The Knowledge Bluff

    ERIC Educational Resources Information Center

    Vanderburg, Willem H.

    2007-01-01

    Our knowledge "system" is built up from disciplines and specialties as its components, which are "wired" by patterns of collaboration that constitute its organization. The intellectual autonomy of these components prevents this knowledge system from adequately accounting for what we have gradually discovered during the past 50 years: In human…

  7. The Bridge of Knowledge

    ERIC Educational Resources Information Center

    Dong, Yu Ren

    2014-01-01

    Although many English language learners (ELLs) in the United States have knowledge gaps that make it hard for them to master high-level content and skills, ELLs also often have background knowledge relevant to school learning that teachers neglect to access, this author argues. In the Common Core era, with ELLs being the fastest growing population…

  8. Cultural Knowledge in Translation.

    ERIC Educational Resources Information Center

    Olk, Harald

    2003-01-01

    Describes a study exploring the influence of cultural knowledge on the translation performance of German students of English. Found that the students often lacked sufficient knowledge about British culture to deal with widely-used cultural concepts. Findings suggest that factual reference sources have an important role to play in translation…

  9. Knowledge Production and Utilization.

    ERIC Educational Resources Information Center

    Beal, George M.; Meehan, Peter

    The study of knowledge production and utilization deals with the problems of how to translate theoretical concepts and knowledge into practical solutions to people's problems. Many criticisms have been leveled at aspects of previous information system models, including their linearity, one-way communication paths, overdependence on scientific…

  10. [Acquisition of arithmetic knowledge].

    PubMed

    Fayol, Michel

    2008-01-01

    The focus of this paper is on contemporary research on the number, counting, and arithmetic competencies that emerge during infancy, the preschool years, and the elementary school years. I provide a brief overview of the evolution of children's conceptual knowledge of arithmetic, the acquisition and use of counting, and how they solve simple arithmetic problems (e.g. 4 + 3).

  11. How Knowledge Powers Reading

    ERIC Educational Resources Information Center

    Lemov, Doug

    2017-01-01

    Recent research shows that reading comprehension relies heavily on prior knowledge. Far more than generic "reading skills" like drawing inferences, making predictions, and knowing the function of subheads, how well students learn from a nonfiction text depends on their background knowledge of the text's subject matter. And in a cyclical…

  12. Associations among perceived and objective disease knowledge and satisfaction with physician communication in patients with chronic kidney disease

    PubMed Central

    Nunes, Julie A. Wright; Wallston, Kenneth A.; Eden, Svetlana K.; Shintani, Ayumi K.; Ikizler, T. Alp; Cavanaugh, Kerri L.

    2013-01-01

    It is likely that patients with chronic kidney disease (CKD) have a limited understanding of their illness. Here we studied the relationships between objective and perceived knowledge in CKD using the Kidney Disease Knowledge Survey and the Perceived Kidney Disease Knowledge Survey. We quantified perceived and objective knowledge in 399 patients at all stages of non-dialysis dependent CKD. The median patient age was 58 years, 47% were women, 77% had stages 3-5 CKD, and 83% were Caucasians. The overall median score of the perceived knowledge survey was 2.56 (range: 1-4), and this new measure exhibited excellent reliability and construct validity. In unadjusted analysis, perceived knowledge was associated with patient characteristics defined a priori, including objective knowledge and patient satisfaction with physician communication. In adjusted analysis, older age, male gender, and limited health literacy were associated with lower perceived knowledge. Additional analysis revealed that perceived knowledge was associated with significantly higher odds (2.13), and objective knowledge with lower odds (0.91), of patient satisfaction with physician communication. Thus, our results present a mechanism to evaluate distinct forms of patient kidney knowledge, and identify specific opportunities for education tailored to patients with CKD. PMID:21832984
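
    The adjusted odds ratios above (2.13 and 0.91) are standard logistic-regression quantities: an odds ratio is the exponential of a model coefficient. A hedged sketch of my own (illustrative arithmetic, not the study's model or data):

```python
# Sketch: converting logistic-regression coefficients to odds ratios,
# the form in which the adjusted analysis above reports its findings.
import math

def odds_ratio(beta):
    """Multiplicative change in the odds per one-unit increase in a predictor."""
    return math.exp(beta)

# Hypothetical coefficients implied by the odds ratios in the abstract.
beta_perceived = math.log(2.13)
beta_objective = math.log(0.91)

print(round(odds_ratio(beta_perceived), 2))  # 2.13: satisfaction odds rise
print(round(odds_ratio(beta_objective), 2))  # 0.91: satisfaction odds fall
```

    A ratio above 1 raises the odds of the outcome and one below 1 lowers them, which is why the two knowledge measures above point in opposite directions.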

  13. Teaching Knowledge Management (SIG KM).

    ERIC Educational Resources Information Center

    McInerney, Claire

    2000-01-01

    Presents an abstract of a planned session on teaching knowledge management, including knowledge management for information professionals; differences between teaching knowledge management in library schools and in business schools; knowledge practices for small groups; and current research. (LRW)

  14. Intensional reasoning about knowledge

    SciTech Connect

    Popov, O.B.

    1987-01-01

    As demands and ambitions increase in Artificial Intelligence, the need for formal systems that facilitate the study and simulation of machine cognition has become inevitable. This paper explores and develops the foundations of a formal system for propositional reasoning about knowledge. The semantics of every meaningful expression in the system is fully determined by its intension, the set of complexes in which the expression is confirmed. The knowledge system is based on three zeroth-order theories of epistemic reasoning for consciousness, knowledge, and entailed knowledge. The results presented establish the soundness and the completeness of the knowledge system. The modes of reasoning and the relations among the various epistemic notions emphasize the expressive power of the intensional paradigm.

  15. Vision, knowledge, and assertion.

    PubMed

    Turri, John

    2016-04-01

    I report two experiments studying the relationship among explicit judgments about what people see, know, and should assert. When an object of interest was surrounded by visibly similar items, it diminished people's willingness to judge that an agent sees, knows, and should tell others that it is present. This supports the claim, made by many philosophers, that inhabiting a misleading environment intuitively decreases our willingness to attribute perception and knowledge. However, contrary to stronger claims made by some philosophers, inhabiting a misleading environment does not lead to the opposite pattern whereby people deny perception and knowledge. Causal modeling suggests a specific psychological model of how explicit judgments about perception, knowledge, and assertability are made: knowledge attributions cause perception attributions, which in turn cause assertability attributions. These findings advance understanding of how these three important judgments are made, provide new evidence that knowledge is the norm of assertion, and highlight some important subtleties in folk epistemology.
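
    The causal chain reported here (knowledge attributions cause perception attributions, which in turn cause assertability attributions) has a testable statistical signature: the indirect knowledge-assertability correlation should approximate the product of the two direct links. A toy simulation of my own (not the paper's model or data):

```python
# Toy mediation sketch: in a pure causal chain K -> P -> A, the correlation
# r(K, A) is approximately r(K, P) * r(P, A), i.e. P fully mediates the link.
import random

def corr(xs, ys):
    """Pearson correlation, computed directly to stay dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
n = 5000
knowledge = [random.gauss(0, 1) for _ in range(n)]
perception = [0.8 * k + random.gauss(0, 0.5) for k in knowledge]      # caused by knowledge
assertability = [0.8 * p + random.gauss(0, 0.5) for p in perception]  # caused by perception

r_kp = corr(knowledge, perception)
r_pa = corr(perception, assertability)
r_ka = corr(knowledge, assertability)
print(f"r_kp={r_kp:.2f}  r_pa={r_pa:.2f}  r_ka={r_ka:.2f}  product={r_kp * r_pa:.2f}")
```

    Causal-modeling analyses of judgment data look for exactly this pattern when deciding among candidate orderings of the three attributions.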

  16. Nurses' knowledge of pain.

    PubMed

    Wilson, Benita

    2007-06-01

    The aim of this study was to establish whether postregistration education and clinical experience influence nurses' knowledge of pain. Inadequacies in the pain management process may not be tied to myth and bias originating from general attitudes and beliefs, but reflect inadequate pain knowledge. A pain knowledge survey of 20 true/false statements was used to measure the knowledge base of two groups of nurses. This was incorporated in a self-administered questionnaire that also addressed lifestyle factors of patients in pain, inferences of physical pain, and general attitudes and beliefs about pain management. One hundred questionnaires were distributed; 86 nurses returned the questionnaire, giving a response rate of 86%. Following selection of the sample, 72 nurses participated in the study: 35 hospice/oncology nurses (specialist) and 37 district nurses (general). Data were analysed using SPSS. The specialist nurses had a more comprehensive knowledge base than the general nurses; however, their knowledge scores did not appear to be related to their experience in terms of years within the nursing profession. Whilst educational programmes contribute to an increase in knowledge, it would appear that the working environment has an influence on the development and use of this knowledge. It is suggested that the clinical environment in which the specialist nurse works can induce feelings of reduced self-efficacy and low personal control. To ease tension, strategies are used that can result in nurses refusing to endorse their knowledge, which can increase patients' pain. Clinical supervision will serve to increase the nurses' self-awareness; however, without power and autonomy to make decisions and effect change, feelings of helplessness, reduced self-efficacy and cognitive dissonance can increase. This may explain why, despite educational efforts to increase knowledge, a concomitant change in practice has not occurred.

  17. Knowledge Convergence and Collaborative Learning

    ERIC Educational Resources Information Center

    Jeong, Heisawn; Chi, Michelene T. H.

    2007-01-01

    This paper operationalized the notion of knowledge convergence and assessed quantitatively how much knowledge convergence occurred during collaborative learning. Knowledge convergence was defined as an increase in common knowledge where common knowledge referred to the knowledge that all collaborating partners had. Twenty pairs of college students…
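
    The operational definition above (convergence as growth in the knowledge all partners share) can be expressed directly as set arithmetic. A minimal sketch, using invented knowledge items rather than the study's materials:

```python
# Sketch of the abstract's definition: common knowledge is the set of items
# every collaborating partner knows; convergence is its growth over time.
def common_knowledge(*partner_knowledge):
    """Items known by every collaborating partner."""
    sets = [set(k) for k in partner_knowledge]
    return set.intersection(*sets)

# Hypothetical pre-test and post-test knowledge for one pair of partners.
pre_a, pre_b = {"osmosis"}, {"diffusion"}
post_a = {"osmosis", "diffusion", "concentration gradient"}
post_b = {"diffusion", "concentration gradient"}

convergence = len(common_knowledge(post_a, post_b)) - len(common_knowledge(pre_a, pre_b))
print(convergence)  # 2: common knowledge grew by two items
```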

  19. Knowledge elicitation for an operator assistant system in process control tasks

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.

    1988-01-01

    A knowledge based system (KBS) methodology designed to study human machine interactions and levels of autonomy in allocation of process control tasks is presented. Users are provided with operation manuals to assist them in normal and abnormal situations. Unfortunately, operation manuals usually represent only the functioning logic of the system to be controlled. The user logic is often totally different. The focus is on a method that elicits user logic in order to refine a KBS shell called an Operator Assistant (OA). If the OA is to help the user, it is necessary to know what level of autonomy gives the optimal performance of the overall man-machine system. For example, for diagnoses that must be carried out carefully by both the user and the OA, interactions are frequent, and processing is mostly sequential. Other diagnoses can be automated, in which case the OA must be able to explain its reasoning at an appropriate level of detail. The OA structure was used to design a working KBS called HORSES (Human Orbital Refueling System Expert System). Protocol analysis of pilots interacting with this system reveals that the a-priori analytical knowledge becomes more structured with training and the situation patterns more complex and dynamic. This approach can improve the a-priori understanding of human and automatic reasoning.

  1. Unconscious knowledge: A survey

    PubMed Central

    Augusto, Luís M.

    2011-01-01

    The concept of unconscious knowledge is fundamental for an understanding of human thought processes and mentation in general; however, the psychological community at large is not familiar with it. This paper offers a survey of the main psychological research currently being carried out into cognitive processes, and examines pathways that can be integrated into a discipline of unconscious knowledge. It shows that the field already has a defined history and discusses some of the features that all kinds of unconscious knowledge seem to share at a deeper level. With the aim of promoting further research, we discuss the main challenges which the postulation of unconscious cognition faces within the psychological community. PMID:21814538

  2. Knowledge Management: A Skeptic's Guide

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2006-01-01

    A viewgraph presentation discussing knowledge management is shown. The topics include: 1) What is Knowledge Management? 2) Why Manage Knowledge? The Presenting Problems; 3) What Gets Called Knowledge Management? 4) Attempts to Rethink Assumptions about Knowledge; 5) What is Knowledge? 6) Knowledge Management and Institutional Memory; 7) Knowledge Management and Culture; 8) To solve a social problem, it's easier to call for cultural rather than organizational change; 9) Will the Knowledge Management Effort Succeed? and 10) Backup: Metrics for Valuing Intellectual Capital, i.e., Knowledge.

  3. Test Your Asthma Knowledge

    MedlinePlus

    Test Your Asthma Knowledge. NIH MedlinePlus, Fall 2007 issue. True or False? Asthma is caused by an inflammation of the inner ...

  4. Visualizing Knowledge Domains.

    ERIC Educational Resources Information Center

    Borner, Katy; Chen, Chaomei; Boyack, Kevin W.

    2003-01-01

    Reviews visualization techniques for scientific disciplines and information retrieval and classification. Highlights include historical background of scientometrics, bibliometrics, and citation analysis; map generation; process flow of visualizing knowledge domains; measures and similarity calculations; vector space model; factor analysis;…

  5. Knowledge, Understanding, and Behavior

    DTIC Science & Technology

    2003-10-04

    Knowledge, Understanding, and Behavior. James Albus, Intelligent Systems Division, National Institute of Standards and Technology, Gaithersburg, MD 20899 301... trails, woods and fields, hills and valleys, filled with tall grass, weeds, stumps, fallen... integrating and testing intelligent systems software for

  6. The Costs of Knowledge

    NASA Technical Reports Server (NTRS)

    Prusak, Laurence

    2008-01-01

    Acquiring knowledge--genuinely learning something new--requires the consent and commitment of the person you're trying to learn from. In contrast to information, which can usually be effectively transmitted in a document or diagram, knowledge comes from explaining, clarifying, questioning, and sometimes actually working together. Getting this kind of attention and commitment often involves some form of negotiation, since even the most generous person's time and energy are limited. Few experts sit around waiting to share their knowledge with strangers or casual acquaintances. In reasonably collaborative enterprises--I think NASA is one--this sort of negotiation isn't too onerous. People want to help each other and share what they know, so the "cost" of acquiring knowledge is relatively low. In many organizations (and many communities and countries), however, there are considerable costs associated with this activity, and many situations in which negotiations fail. The greatest knowledge cost is in absorbing and adapting knowledge to one's own use. Sometimes this means formally organizing what one learns in writing. Sometimes it means just taking time to reflect on someone else's thoughts and experiences--thinking about knowledge that is not exactly what you need but can lead you to develop ideas that will be useful. A long, discursive conversation, with all the back-and-forth that defines conversation, can be a mechanism of knowledge exchange. I have seen many participants at NASA APPEL Masters Forums talking, reflecting, and thinking--adapting what they are hearing to their own needs. Knowledge transfer is not a simple proposition. An enormous amount of information flows through the world every day, but knowledge is local, contextual, and "sticky"--that is, it takes real effort to move it from one place to another. There is no way around this. To really learn a subject, you have to work at it, you have to pay your "knowledge dues." So while, thanks to advances in technology

  7. Knowledge and information modeling.

    PubMed

    Madsen, Maria

    2010-01-01

    This chapter gives an educational overview of: commonly used modelling methods and what they represent; the importance of selecting the tools and methods suited to the health information system being designed; how the quality of the information or knowledge model is determined by the quality of the system requirements specification; differentiating between the purpose of information models and knowledge models; and the benefits of the openEHR approach for health care data modeling.

  8. Prior Knowledge Assessment Guide

    DTIC Science & Technology

    2014-12-01

    facts automatically requires knowledge of the fact itself. You will have to determine what levels are important for your purposes. As an example... will see the cell positions for "array 1" automatically record in your function. GUIDE FOR DEVELOPING AND USING PRIOR KNOWLEDGE ASSESSMENTS TO TAILOR TRAINING 50 ... 6. Type a comma ( , ) and "array2" will automatically show in bold type.

  9. Hybrid knowledge systems

    NASA Technical Reports Server (NTRS)

    Subrahmanian, V. S.

    1994-01-01

    An architecture called hybrid knowledge system (HKS) is described that can be used to interoperate between a specification of the control laws describing a physical system, a collection of databases, knowledge bases and/or other data structures reflecting information about the world in which the controlled physical system resides, observations (e.g. sensor information) from the external world, and actions that must be taken in response to external observations.

  10. US Spacesuit Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2011-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lends itself rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives in which videotaping occurs engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive closed-loop spacesuit knowledge capture system which includes

  11. Demonopolizing medical knowledge.

    PubMed

    Arora, Sanjeev; Thornton, Karla; Komaromy, Miriam; Kalishman, Summers; Katzman, Joanna; Duhigg, Daniel

    2014-01-01

    In the past 100 years, there has been an explosion of medical knowledge--and in the next 50 years, more medical knowledge will be available than ever before. Regrettably, current medical practice has been unable to keep pace with this explosion of medical knowledge. Specialized medical knowledge has been confined largely to academic medical centers (i.e., teaching hospitals) and to specialists in major cities; it has been disconnected from primary care clinicians on the front lines of patient care. To bridge this disconnect, medical knowledge must be demonopolized, and a platform for collaborative practice amongst all clinicians needs to be created. A new model of health care and education delivery called Project ECHO (Extension for Community Healthcare Outcomes), developed by the first author, does just this. Using videoconferencing technology and case-based learning, ECHO's medical specialists provide training and mentoring to primary care clinicians working in rural and urban underserved areas so that the latter can deliver the best evidence-based care to patients with complex health conditions in their own communities. The ECHO model increases access to care in rural and underserved areas, and it demonopolizes specialized medical knowledge and expertise.

  12. The Roles of Knowledge Professionals for Knowledge Management.

    ERIC Educational Resources Information Center

    Kim, Seonghee

    This paper starts by exploring the definition of knowledge and knowledge management; examples of acquisition, creation, packaging, application, and reuse of knowledge are provided. It then considers the partnership for knowledge management and especially how librarians as knowledge professionals, users, and technology experts can contribute to…

  13. Modelling of classification rules on metabolic patterns including machine learning and expert knowledge.

    PubMed

    Baumgartner, Christian; Böhm, Christian; Baumgartner, Daniela

    2005-04-01

    Machine learning has a great potential to mine potential markers from high-dimensional metabolic data without any a priori knowledge. Exemplarily, we investigated metabolic patterns of three severe metabolic disorders, PAHD, MCADD, and 3-MCCD, on which we constructed classification models for disease screening and diagnosis using a decision tree paradigm and logistic regression analysis (LRA). For the LRA model-building process we assessed the relevance of established diagnostic flags, which have been developed from the biochemical knowledge of newborn metabolism, and compared the models' error rates with those of the decision tree classifier. Both approaches yielded comparable classification accuracy in terms of sensitivity (>95.2%), while the LRA models built on flags showed significantly enhanced specificity. The number of false positive cases did not exceed 0.001%.
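
    The reported figures (sensitivity above 95.2%, false positives at or below 0.001%) are plain confusion-matrix ratios. A minimal illustrative sketch with hypothetical counts, not the study's data:

```python
# Sketch: screening quality as confusion-matrix ratios, the metrics the
# abstract uses to compare the decision tree and LRA classifiers.
def screening_metrics(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # fraction of affected cases detected
    specificity = tn / (tn + fp)  # fraction of unaffected cases passed
    return sensitivity, specificity

# Hypothetical newborn screen: 100 affected, 1,000,000 unaffected newborns.
sens, spec = screening_metrics(tp=96, fn=4, tn=999_990, fp=10)
print(f"sensitivity={sens:.3f}, specificity={spec:.5f}")
# sensitivity=0.960, specificity=0.99999
```

    With rare disorders screened in whole populations, even a tiny false-positive rate produces many false alarms, which is why the enhanced specificity of the flag-based LRA models matters.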

  14. Knowledge-based vision for space station object motion detection, recognition, and tracking

    NASA Technical Reports Server (NTRS)

    Symosek, P.; Panda, D.; Yalamanchili, S.; Wehner, W., III

    1987-01-01

    Computer vision, especially color image analysis and understanding, has much to offer in the area of the automation of Space Station tasks such as construction, satellite servicing, rendezvous and proximity operations, inspection, experiment monitoring, data management and training. Knowledge-based techniques improve the performance of vision algorithms for unstructured environments because of their ability to deal with imprecise a priori information or inaccurately estimated feature data and still produce useful results. Conventional techniques using statistical and purely model-based approaches lack flexibility in dealing with the variabilities anticipated in the unstructured viewing environment of space. Algorithms developed under NASA sponsorship for Space Station applications to demonstrate the value of a hypothesized architecture for a Video Image Processor (VIP) are presented. Approaches to the enhancement of the performance of these algorithms with knowledge-based techniques and the potential for deployment of highly-parallel multi-processor systems for these algorithms are discussed.

  15. The Music Educator's Professional Knowledge

    ERIC Educational Resources Information Center

    Jorquera Jaramillo, Maria Cecilia

    2008-01-01

    Professional knowledge in teaching is broadly based on personal knowledge. Hence, it is important to build teachers' development out of their initial knowledge. The idea of a sociogenesis of educational knowledge, teacher knowledge and training models as well as teaching models are the basis of this study. It aims to diagnose the knowledge…

  16. Knowledge, knowledge, who's got the knowledge? The male contraceptive career.

    PubMed

    Swanson, J M

    1980-01-01

    most common pattern among men in this study. In this instance the man becomes increasingly involved in the family planning process. This man has access to knowledge through his partner and enters a process of determining what he needs to know and, in turn, finding out how he can use what he has gained to insure a more satisfactory outcome. These men put much effort into increasing their understanding of their partner's family planning experience. The assuming pattern is based on the man's perception of himself as an outsider who makes the assumption that his partner is managing family planning alone. He is less involved than men in the other 2 patterns.

  17. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  18. Knowledge management across domains

    NASA Astrophysics Data System (ADS)

    Gilfillan, Lynne G.; Haddock, Gail; Borek, Stan

    2001-02-01

    This paper presents a secure, Internet-enabled, third-wave knowledge management system, TheResearchPlaceTM, which will facilitate a collaborative, strategic approach to analyzing public safety problems and developing interventions to reduce them. TheResearchPlace, currently being developed under government and private funding for use by the National Cancer Institute, federal agencies, and the Defense Advanced Research Projects Agency, will augment Geographic Information Systems and analytical tool capabilities by providing a synergistic workspace where teams of multidisciplinary professionals can manage portfolios of existing knowledge resources, locate and create new knowledge resources that are added to portfolios, and collaborate with colleagues to leverage evolving portfolios' capabilities on team missions. TheResearchPlace is currently in use by alpha users at selected federal sites and by the faculty of Howard University.

  19. LIS Professionals as Knowledge Engineers.

    ERIC Educational Resources Information Center

    Poulter, Alan; And Others

    1994-01-01

    Considers the role of library and information science professionals as knowledge engineers. Highlights include knowledge acquisition, including personal experience, interviews, protocol analysis, observation, multidimensional sorting, printed sources, and machine learning; knowledge representation, including production rules and semantic nets;…

  1. A Priori Calculations of Thermodynamic Functions

    DTIC Science & Technology

    1991-12-01

    The report presents a priori calculations for the hypothetical molecules (hydroxy-methyl-amino)nitro-methanol, 2,4,6,8-tetraazabicyclo[3.3.0]octane, and 1,2,3-oxadiazolo-1,2,3-oxadiazole-1,1-dioxide. The remainder of the available excerpt is a fragmentary table comparing calculated and experimental bond lengths for 1,3,4-oxadiazole, 1,3,4-thiadiazole, diazomethane, pyrazole, and dimethyl nitramine.

  2. Knowledge Query Language (KQL)

    DTIC Science & Technology

    2016-02-12

    making portability of the queries or query-dependent algorithms difficult. This report introduces an ontological declarative approach that is...Expressions using the ontology implemented in a knowledge registry, and returning query results with provenance information...

  3. Knowledge Query Language (KQL)

    DTIC Science & Technology

    2016-02-01

    portability of the queries or query-dependent algorithms difficult. This report introduces an ontological declarative approach that is independent...Expressions using the ontology implemented in a knowledge registry, and returning query results with provenance information...

  4. Hermeneutics of Integrative Knowledge.

    ERIC Educational Resources Information Center

    Shin, Un-chol

    This paper examines and compares the formation processes and structures of three types of integrative knowledge that in general represent natural sciences, social sciences, and humanities. These three types can be observed, respectively, in the philosophies of Michael Polanyi, Jurgen Habermas, and Paul Ricoeur. These types of integrative knowledge…

  5. Harvesting Cultural Knowledge.

    ERIC Educational Resources Information Center

    Keating, Joseph F.

    1997-01-01

    Describes a year-long course called Outdoor Science taught at an American Indian reservation high school that demonstrates to students the connection between traditional tribal knowledge and western science to spark student interest in science. Presents a list that contains references on the subject of ethnobotany. Includes specific references for…

  6. Assessing Knowledge of Cultures.

    ERIC Educational Resources Information Center

    Norris, Robert

    The procedures used in a study to determine how well a group of American Indian college students understood their traditional and modern cultures and a college Caucasian culture were explained in this paper. The sample consisted of 111 Indian students enrolled in the University of New Mexico. The students were tested in the areas of knowledge of…

  7. Doing Knowledge Management

    ERIC Educational Resources Information Center

    Firestone, Joseph M.; McElroy, Mark W.

    2005-01-01

    Purpose: Knowledge management (KM) as a field has been characterized by great confusion about its conceptual foundations and scope, much to the detriment of assessments of its impact and track record. The purpose of this paper is to contribute toward defining the scope of KM and ending the confusion, by presenting a conceptual framework and set of…

  8. Monitoring Knowledge Base (MKB)

    EPA Pesticide Factsheets

    The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial process descriptions, and permitting techniques, including flexible permit development. Using MKB, one can gain a comprehensive understanding of emissions sources, control devices, and monitoring techniques, enabling one to determine appropriate permit terms and conditions.

  9. Knowledge Representation in PARKA

    DTIC Science & Technology

    1990-02-01

    the color of Poodle could be restricted to being just black or white, while the color of Irish-Setter could be set to red. Note that this would allow a...subfield of knowledge representation with considerable subtlety and a history of interesting, difficult problems (see, e.g., [10]). Winston et al.

  10. Reasoning from Incomplete Knowledge.

    ERIC Educational Resources Information Center

    Collins, Allan M.; And Others

    People use a variety of plausible, but uncertain inferences to answer questions about which their knowledge is incomplete. Such inferential thinking and reasoning is being incorporated into the SCHOLAR computer-assisted instruction (CAI) system. Socratic tutorial techniques in CAI systems such as SCHOLAR are described, and examples of their…

  11. NHS clinical knowledge summaries.

    PubMed

    Richards, Derek

    2009-01-01

    The UK National Health Service (NHS) Clinical Knowledge Summaries, formerly known as PRODIGY, are part of the National Library for Health and provide a source of evidence-based information and practical know-how relating to the common conditions managed in primary care.

  12. Transforming Data into Knowledge

    ERIC Educational Resources Information Center

    Mills, Lane

    2006-01-01

    School systems can be data rich and information poor if they do not understand and manage their data effectively. The task for school leaders is to put existing data into a format that lends itself to answering questions and improving outcomes for the students. Common barriers to transforming data into knowledge in education settings often include…

  13. Anishinaabe Star Knowledge.

    ERIC Educational Resources Information Center

    Price, Michael Wassegijig

    2002-01-01

    A connection with nature constitutes the difference between Western science and indigenous perspectives of the natural world. Understanding the synchronicity of natural and astronomical cycles is integral to Anishinaabe cosmology. Examples show how the Anishinaabe cultural worldview and philosophy are reflected in their celestial knowledge and how…

  14. Keeping Knowledge in Site

    ERIC Educational Resources Information Center

    Livingstone, David N.

    2010-01-01

    Recent work on the history of education has been registering a "spatial turn" in its historiography. These reflections from a historical geographer working on the spatiality of knowledge enterprises (science in particular) review some recent developments in the field before turning to three themes--landscape agency, geographies of textuality, and…

  15. Spectral Bayesian Knowledge Tracing

    ERIC Educational Resources Information Center

    Falakmasir, Mohammad; Yudelson, Michael; Ritter, Steve; Koedinger, Ken

    2015-01-01

    Bayesian Knowledge Tracing (BKT) has been in wide use for modeling student skill acquisition in Intelligent Tutoring Systems (ITS). BKT tracks and updates student's latent mastery of a skill as a probability distribution of a binary variable. BKT does so by accounting for observed student successes in applying the skill correctly, where success is…
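
    The update the abstract describes can be made concrete. The sketch below is a standard four-parameter BKT step (prior mastery, learn, slip, and guess probabilities), written from the usual formulation in the literature; the parameter values are illustrative, not fitted to any tutor data.

```python
# One Bayesian Knowledge Tracing step: condition P(mastery) on the observed
# response, then apply the learning transition. Parameter values are
# illustrative defaults, not fitted estimates.

def bkt_update(p_mastery, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    if correct:
        num = p_mastery * (1 - p_slip)
        den = num + (1 - p_mastery) * p_guess
    else:
        num = p_mastery * p_slip
        den = num + (1 - p_mastery) * (1 - p_guess)
    conditioned = num / den
    # Transition: an unmastered skill may be learned between opportunities.
    return conditioned + (1 - conditioned) * p_learn

p = 0.3  # P(L0): prior probability the skill is already mastered
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
```

    A correct response raises the mastery estimate and an incorrect one lowers it, exactly the binary-latent-variable tracking the abstract refers to.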

  16. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  17. A Measure of Knowledge.

    ERIC Educational Resources Information Center

    Burley, Hansel

    2002-01-01

    Standardized tests cover acquired knowledge in reading, mathematics, and English. To avoid misuse of standardized tests, school districts should use tests to diagnose individual learning needs and improve programs, and administer tests once a year in the fall semester so results help guide learning plans for students. Contains a glossary of basic…

  18. Activating Event Knowledge

    PubMed Central

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or typically play a role in. We used short stimulus onset asynchrony priming to demonstrate that (1) event nouns prime people (sale-shopper) and objects (trip-luggage) commonly found at those events; (2) location nouns prime people/animals (hospital-doctor) and objects (barn-hay) commonly found at those locations; and (3) instrument nouns prime things on which those instruments are commonly used (key-door), but not the types of people who tend to use them (hose-gardener). The priming effects are not due to normative word association. On our account, facilitation results from event knowledge relating primes and targets. This has much in common with computational models like LSA or BEAGLE in which one word primes another if they frequently occur in similar contexts. LSA predicts priming for all six experiments, whereas BEAGLE correctly predicted that priming should not occur for the instrument-people relation but should occur for the other five. We conclude that event-based relations are encoded in semantic memory and computed as part of word meaning, and have a strong influence on language comprehension. PMID:19298961
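
    The distributional idea invoked above (one word primes another if the two frequently occur in similar contexts) can be sketched with raw co-occurrence vectors and cosine similarity. The toy corpus below is invented and far smaller than anything LSA or BEAGLE would be trained on.

```python
# Sketch: words get vectors of context-word counts; a prime and target are
# "related" when their context vectors have high cosine similarity.
import math
from collections import Counter

corpus = [
    "the shopper went to the sale",
    "the sale attracted every shopper",
    "the doctor worked at the hospital",
    "the hospital called the doctor",
    "the gardener watered the plants",
]

vocab = sorted({w for line in corpus for w in line.split()})

def context_vector(word, window=2):
    """Count words appearing within `window` positions of `word`."""
    counts = Counter()
    for line in corpus:
        toks = line.split()
        for i, t in enumerate(toks):
            if t == word:
                lo, hi = max(0, i - window), min(len(toks), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[toks[j]] += 1
    return [counts[v] for v in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

sim_related = cosine(context_vector("sale"), context_vector("shopper"))
sim_unrelated = cosine(context_vector("sale"), context_vector("gardener"))
```

    Even on this toy corpus, the event pair sale-shopper scores higher than the unrelated pair sale-gardener, which is the shape of prediction such models make for the priming results.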

  19. Knowledge, Labour and Education.

    ERIC Educational Resources Information Center

    Ilon, Lynn

    2000-01-01

    Argues that educational leaders must place education in a proactive role that can influence global dynamics. Examines production for changes in how value, work, and learning environments are created. Discusses emerging definitions of education, value, labor, and knowledge. Includes references. (CMK)

  20. Medical Knowledge Bases.

    ERIC Educational Resources Information Center

    Miller, Randolph A.; Giuse, Nunzia B.

    1991-01-01

    Few commonly available, successful computer-based tools exist in medical informatics. Faculty expertise can be included in computer-based medical information systems. Computers allow dynamic recombination of knowledge to answer questions unanswerable with print textbooks. Such systems can also create stronger ties between academic and clinical…

  2. National Knowledge Commission

    NASA Astrophysics Data System (ADS)

    Pitroda, Sam

    2007-04-01

    India's National Knowledge Commission (NKC), established by the Prime Minister, is focused on building institutions and infrastructure in education, science and technology, innovation, etc., to meet the challenges of the knowledge economy in the 21st century and increase India's competitive advantage in the global market. India today stands poised to reap the benefits of a rapidly growing economy and a major demographic advantage, with 550 million young people below the age of 25 years, the largest such population in the world. The NKC is focused on five critical areas of knowledge: access, concepts, creation, applications, and services. This includes a variety of subject areas such as language, translations, libraries, networks, portals, affirmative action, distance learning, intellectual property, entrepreneurship, applications in agriculture, health, small and medium-scale industries, e-governance, etc. One of the keys to this effort is to build a national gigabit broadband network of 500 nodes to connect universities, libraries, laboratories, hospitals, agricultural institutions, etc., to share resources and collaborate on multidisciplinary activities. This presentation will introduce the NKC, discuss its methodology and subject areas, present specific recommendations, and outline a plan to build knowledge networks, with specifics on network architecture, applications, and utilities.

  3. Knowledge Management Section

    DTIC Science & Technology

    2008-08-01

    Knowledge Online (www.us.army.mil) and General Dennis J. Reimer Training and Doctrine Digital Library at (www.train.army.mil). FM 6-01.1 Distribution...Stryker Symposium II – Exploiting Online Collaboration...significantly influence all aspects of the Army for the foreseeable future, partly because of the changing environment and partly due to ongoing operational

  5. Knowledge systems in agroforestry

    Treesearch

    Wieland Kunzel

    1993-01-01

    Pacific Islands agroforestry has evolved into sustainable, diverse, and productive land-use systems in many areas. We marvel at these systems, and the scientific world is trying to catch up with the traditional knowledge. At the same time, Pacific Islands farmers are abandoning their agroforestry systems in great numbers. It is mainly intensified agriculture for cash...

  6. Activating event knowledge.

    PubMed

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-05-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or typically play a role in. We used short stimulus onset asynchrony priming to demonstrate that (1) event nouns prime people (sale-shopper) and objects (trip-luggage) commonly found at those events; (2) location nouns prime people/animals (hospital-doctor) and objects (barn-hay) commonly found at those locations; and (3) instrument nouns prime things on which those instruments are commonly used (key-door), but not the types of people who tend to use them (hose-gardener). The priming effects are not due to normative word association. On our account, facilitation results from event knowledge relating primes and targets. This has much in common with computational models like LSA or BEAGLE in which one word primes another if they frequently occur in similar contexts. LSA predicts priming for all six experiments, whereas BEAGLE correctly predicted that priming should not occur for the instrument-people relation but should occur for the other five. We conclude that event-based relations are encoded in semantic memory and computed as part of word meaning, and have a strong influence on language comprehension.

  7. Knowledge Management as Enterprise

    ERIC Educational Resources Information Center

    Kutay, Cat

    2007-01-01

    Indigenous people have long been deprived of financial benefit from their knowledge. Campaigns around the stolen wages and the "Pay the Rent" campaign highlight this, as do the endemic poverty and economic disenfranchisement experienced by many Indigenous people and communities in Australia. Recent enterprises developed by…

  10. Online Knowledge Communities.

    ERIC Educational Resources Information Center

    de Vries, Sjoerd; Bloemen, Paul; Roossink, Lonneke

    This paper describes the concept of online knowledge communities. The concept is defined, and six qualities of online communities are identified: members (user roles are clearly defined); mission (generally accepted goal-statement, ideas, beliefs, etc.); commitment (members give their loyalty to the mission); social interaction (frequent…

  11. Knowledge-Based Abstracting.

    ERIC Educational Resources Information Center

    Black, William J.

    1990-01-01

    Discussion of automatic abstracting of technical papers focuses on a knowledge-based method that uses two sets of rules. Topics discussed include anaphora; text structure and discourse; abstracting techniques, including the keyword method and the indicator phrase method; and tools for text skimming. (27 references) (LRW)
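
    The indicator phrase method mentioned in this record selects sentences containing phrases that typically signal important content. A minimal sketch, with an invented phrase list and a simple count-based score:

```python
# Sketch of indicator-phrase abstracting: score each sentence by how many
# "importance-signalling" phrases it contains, keep the top scorers in
# their original order. The phrase list is invented for illustration.
INDICATOR_PHRASES = ["we conclude", "the results show", "this paper presents"]

def indicator_abstract(text, max_sentences=2):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    scored = [(sum(p in s.lower() for p in INDICATOR_PHRASES), i, s)
              for i, s in enumerate(sentences)]
    # Highest score first; ties broken by position in the document.
    top = sorted(scored, key=lambda t: (-t[0], t[1]))[:max_sentences]
    return ". ".join(s for _, _, s in sorted(top, key=lambda t: t[1])) + "."

doc = ("This paper presents a study of abstracting. "
       "Many methods exist. "
       "The results show that indicator phrases help. "
       "We conclude that hybrid methods are best.")
summary = indicator_abstract(doc)
```

    The keyword method the record also mentions works analogously, scoring sentences by high-frequency content words instead of fixed phrases.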

  12. Creating Illusions of Knowledge: Learning Errors that Contradict Prior Knowledge

    ERIC Educational Resources Information Center

    Fazio, Lisa K.; Barber, Sarah J.; Rajaram, Suparna; Ornstein, Peter A.; Marsh, Elizabeth J.

    2013-01-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks…

  13. Depth of Teachers' Knowledge: Frameworks for Teachers' Knowledge of Mathematics

    ERIC Educational Resources Information Center

    Holmes, Vicki-Lynn

    2012-01-01

    This article describes seven teacher knowledge frameworks and relates these frameworks to the teaching and assessment of elementary teacher's mathematics knowledge. The frameworks classify teachers' knowledge and provide a vocabulary and common language through which knowledge can be discussed and assessed. These frameworks are categorized into…

  14. Metalinguistic Knowledge, Metalingual Knowledge, and Proficiency in L2 Spanish

    ERIC Educational Resources Information Center

    Gutierrez, Xavier

    2013-01-01

    The role of metalinguistic knowledge of language and knowledge of technical terms (i.e. metalingual knowledge) in second language (L2) learning and use is a matter of controversy in the field of Second Language Acquisition. This paper examines the development of these two types of knowledge in adult university-level learners of L2 Spanish, and…

  17. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  18. Knowledge Acquisition Using Linguistic-Based Knowledge Analysis

    Treesearch

    Daniel L. Schmoldt

    1998-01-01

    Most knowledge-based system development efforts include acquiring knowledge from one or more sources. Difficulties associated with this knowledge acquisition task are readily acknowledged by most researchers. While a variety of knowledge acquisition methods have been reported, little has been done to organize those different methods and to suggest how to apply them...

  20. Knowledge From Pictures (KFP)

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Paterra, Frank; Bailin, Sidney

    1993-01-01

    The old maxim goes: 'A picture is worth a thousand words'. The objective of the research reported in this paper is to demonstrate this idea as it relates to the knowledge acquisition process and the automated development of an expert system's rule base. A prototype tool, the Knowledge From Pictures (KFP) tool, has been developed which configures an expert system's rule base by an automated analysis of and reasoning about a 'picture', i.e., a graphical representation of some target system to be supported by the diagnostic capabilities of the expert system under development. This rule base, when refined, could then be used by the expert system for target system monitoring and fault analysis in an operational setting. Most people, when faced with the problem of understanding the behavior of a complicated system, resort to the use of some picture or graphical representation of the system as an aid in thinking about it. This depiction provides a means of helping the individual to visualize the behavior and dynamics of the system under study. An analysis of the picture, augmented with the individual's background information, allows the problem solver to codify knowledge about the system. This knowledge can, in turn, be used to develop computer programs to automatically monitor the system's performance. The approach taken in this research was to mimic this knowledge acquisition paradigm. A prototype tool was developed which provides the user: (1) a mechanism for graphically representing sample system configurations appropriate for the domain, and (2) a linguistic device for annotating the graphical representation with the behaviors and mutual influences of the components depicted in the graphic. The KFP tool, reasoning from the graphical depiction along with user-supplied annotations of component behaviors and inter-component influences, generates a rule base that could be used in automating the fault detection, isolation, and repair of the system.
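
    The rule-generation idea described above, deriving diagnostic rules from a graph of components annotated with mutual influences, can be sketched as follows. The component names, influence graph, and rule syntax here are invented; the actual KFP tool reasons over much richer annotations.

```python
# Sketch: from user-supplied inter-component influences, derive simple
# diagnostic rules of the form "if X is faulty, suspect what X influences".
influences = {  # edge: component -> components whose behavior it affects
    "power_supply": ["sensor_A", "sensor_B"],
    "sensor_A": ["data_bus"],
    "sensor_B": ["data_bus"],
}

def derive_rules(graph):
    rules = []
    for src, targets in graph.items():
        for dst in targets:
            rules.append(f"IF fault({src}) THEN suspect({dst})")
    return rules

rules = derive_rules(influences)
```

    A refined version would follow influence chains transitively and attach the behavioral annotations to each rule, which is closer to what the paper describes.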

  1. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic links between the various knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to address problems of imprecise and inconsistent knowledge.
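
    The input-processing-output structure proposed in this record can be captured in a minimal sketch; the class and field names below are invented for illustration.

```python
# Hypothetical sketch: knowledge modeled as input -> processing -> output,
# following the three-part structure described in the record.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class KnowledgeUnit:
    inputs: list                    # the "input" component
    process: Callable[[list], Any]  # the "processing" component

    def output(self) -> Any:        # the "output" component
        return self.process(self.inputs)

# Example: procedural knowledge ("how to total a list") in this model.
unit = KnowledgeUnit(inputs=[1, 2, 3], process=sum)
result = unit.output()  # → 6
```

    The point of the model is that declarative and procedural knowledge alike fit the same three-part shape, which is what makes a unified representation possible.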

  2. Language knowledge and event knowledge in language use.

    PubMed

    Willits, Jon A; Amato, Michael S; MacDonald, Maryellen C

    2015-05-01

    This paper examines how semantic knowledge is used in language comprehension and in making judgments about events in the world. We contrast knowledge gleaned from prior language experience ("language knowledge") and knowledge coming from prior experience with the world ("world knowledge"). In two corpus analyses, we show that previous research linking verb aspect and event representations has confounded language and world knowledge. Then, using carefully chosen stimuli that remove this confound, we performed four experiments that manipulated the degree to which language knowledge or world knowledge should be salient and relevant to performing a task, finding in each case that participants use the type of knowledge most appropriate to the task. These results provide evidence for a highly context-sensitive and interactionist perspective on how semantic knowledge is represented and used during language processing.

  3. Knowledge Resources - A Knowledge Management Approach for Digital Ecosystems

    NASA Astrophysics Data System (ADS)

    Kurz, Thomas; Eder, Raimund; Heistracher, Thomas

    The paper at hand presents an innovative approach to the conception and implementation of knowledge management in Digital Ecosystems. Based on a reflection on Digital Ecosystem research of the past years, an architecture is outlined which uses Knowledge Resources as the central and simplest entities of knowledge transfer. After a discussion of the underlying conception, the results of a first prototypical implementation are described, which supports transforming implicit knowledge into explicit knowledge for wide use.

  4. Investigating the Knowledge Management Culture

    ERIC Educational Resources Information Center

    Stylianou, Vasso; Savva, Andreas

    2016-01-01

    Knowledge Management (KM) efforts aim at leveraging an organization into a knowledge organization, thereby presenting knowledge employees with a very powerful tool: organized, valuable knowledge accessible when and where needed in flexible, technologically enhanced modes. The attainment of this aim, i.e., the transformation into a knowledge…

  5. Knowledge Translation: Implications for Evaluation

    ERIC Educational Resources Information Center

    Davison, Colleen M.

    2009-01-01

    Translation theory originates in the field of applied linguistics and communication. The term knowledge translation has been adopted in health and other fields to refer to the exchange, synthesis, and application of knowledge. The logic model is a circular or iterative loop among various knowledge translation actors (knowledge producers and users)…

  6. Knowledge Creation in Constructivist Learning

    ERIC Educational Resources Information Center

    Jaleel, Sajna; Verghis, Alie Molly

    2015-01-01

    In today's competitive global economy characterized by knowledge acquisition, the concept of knowledge management has become increasingly prevalent in academic and business practices. Knowledge creation is an important factor and remains a source of competitive advantage over knowledge management. Constructivism holds that learners learn actively…

  7. Bioethics and knowledge.

    PubMed

    Bernard, J

    1990-01-01

    The acquisition of knowledge must be regarded as the first duty, as affirmed by Claude Bernard and Jacques Monod. It occupies a place in the forefront of other duties--respect for the individual and his liberty and dignity--to which the duty of knowledge must be subordinated. An attempt must be made to reconcile these diverse obligations. Three situations are described, stressing the importance of the principles that govern such studies: respect for the individual; respect for science; restriction of human experiments to what is strictly necessary; scrupulous assessment of the balance of risks and advantages; the principle that the human body is not an object of commerce; free and informed consent; and solidarity between men.

  8. [Models of self knowledge].

    PubMed

    Orlando, Eleonora

    2005-01-01

    The main purpose of this paper is to analyze some different explanatory models of self-knowledge in contemporary analytic philosophy. As a starting point, I will focus on self-ascriptions of mental states such as "I have a headache", "I am thinking about my son", and "I desire to shock my father", namely, the so-called "avowals". In the first part, I will point out what I take to be the set of characteristics of avowals that any theory of self-knowledge should account for. In the second part, I will present and contrast with one another the main theoretical explanatory models that have been put forward to give the required account.

  9. Test your troubleshooting knowledge.

    PubMed

    Snyder, E

    2001-01-01

    While troubleshooting and repairing medical instrumentation may be all that BMETs would like to do, it's just too limited in scope to perform the job effectively. Flattened organizations can require greater responsibility for BMETs--and lead to greater ambiguity. Besides electronic troubleshooting skills, mechanical ability, and the knowledge of how medical equipment normally operates, additional skills are required of the BMET to effectively facilitate a repair--such as knowledge of pertinent codes and standards, job safety laws and guidelines, politeness, and empathy for the equipment user. You will notice that many of these relate to interpersonal relations. The ability to interact with fellow health care workers in a non-threatening manner and to have an appreciation for their perspectives are valuable customer service skills--potentially more valuable than being able to do component-level troubleshooting!

  10. The Knowledge Stealing Initiative?

    NASA Technical Reports Server (NTRS)

    Goshorn, Larry

    2005-01-01

    I have the honor of being on the Academy of Program and Project Leadership (APPL) Knowledge Sharing Feedback and Assessment Team (FAA), and as such, I am privileged to receive the feedback written by many of you as attendees of the Project Management (PM) Master's Forums. It is the intent of the FAA Team and APPL leadership to use this feedback as a tool for continuous program improvement. As a retired (sort of) PM in the payload contracting industry, I'm a big supporter of NASA's Knowledge Sharing Initiative (KSI), especially the Master's Forums. I really enjoy participating in them. Unfortunately I had to miss the 8th forum in Pasadena this past Spring, but I did get the feedback package for the Assessment Team work. So here I was, reviewing twelve pages of comments, reflections, learning notes and critiques from attendees of the 8th forum.

  11. Knowledge Integration to Make Decisions About Complex Systems: Sustainability of Energy Production from Agriculture

    SciTech Connect

    Danuso, Francesco

    2008-06-18

    A major bottleneck for improving the governance of complex systems is our limited ability to integrate different forms of knowledge into a decision support system (DSS). Preliminary aspects are the classification of different types of knowledge (a priori or general, a posteriori or specific, with uncertainty, numerical, textual, algorithmic, complete/incomplete, etc.), the definition of ontologies for knowledge management, and the availability of proper tools such as continuous simulation models, event-driven models, statistical approaches, computational methods (neural networks, evolutionary optimization, rule-based systems, etc.), and procedures for textual documentation. Following these views, a computer language for knowledge integration (SEMoLa, Simple, Easy Modelling Language) has been developed at the University of Udine. SEMoLa can handle models, data, metadata, and textual knowledge; it implements and extends the system dynamics ontology (Forrester, 1968; Joergensen, 1994), in which systems are modeled by the concepts of material, group, state, rate, parameter, internal and external events, and driving variables. As an example, a SEMoLa model to improve the management and sustainability (economic, energetic, environmental) of agricultural farms is presented. The model (X-Farm) simulates a farm in which cereal and forage yield, oil seeds, milk, calves, and wastes can be sold or reused. X-Farm is composed of integrated modules describing fields (crop and soil), feed and material storage, machinery management, manpower management, animal husbandry, economic and energetic balances, seed oil extraction, manure and waste management, and biogas production from animal wastes and biomasses.
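    The system dynamics ontology this abstract invokes (states updated by rates that depend on parameters and driving variables) can be illustrated with a minimal stock-and-flow sketch. The crop/soil-water toy below is written in plain Python with invented parameter values; it is not SEMoLa code, only an illustration of the modeling style.

```python
# Illustrative stock-and-flow model in the system dynamics style:
# states (stocks) are updated by rates governed by parameters and a
# driving variable. A toy sketch with invented numbers, not SEMoLa.

def simulate(days=365, dt=1.0):
    # state variables (stocks)
    soil_water = 100.0          # mm
    biomass = 0.1               # t/ha
    # parameters
    growth_rate = 0.05          # 1/day
    capacity = 10.0             # t/ha, logistic ceiling
    water_use = 0.02            # mm per (t/ha) per day
    field_capacity = 150.0      # mm, upper bound on stored water
    rainfall = 2.0              # mm/day, treated as a driving variable
    for _ in range(int(days / dt)):
        # rates are computed from the current states and parameters...
        growth = growth_rate * biomass * (1.0 - biomass / capacity)
        uptake = water_use * biomass
        # ...then integrated (Euler step) to update the states
        biomass += growth * dt
        soil_water = min(soil_water + (rainfall - uptake) * dt, field_capacity)
    return biomass, soil_water
```

    Event handling, economic balances, and the other X-Farm modules would add further states and rates on top of this same pattern.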

  12. Knowledge Integration to Make Decisions About Complex Systems: Sustainability of Energy Production from Agriculture

    ScienceCinema

    Danuso, Francesco [University of Udine, Italy]

    2016-07-12

    A major bottleneck for improving the governance of complex systems is our limited ability to integrate different forms of knowledge into a decision support system (DSS). Preliminary aspects are the classification of different types of knowledge (a priori or general, a posteriori or specific, with uncertainty, numerical, textual, algorithmic, complete/incomplete, etc.), the definition of ontologies for knowledge management, and the availability of proper tools such as continuous simulation models, event-driven models, statistical approaches, computational methods (neural networks, evolutionary optimization, rule-based systems, etc.), and procedures for textual documentation. Following these views, a computer language for knowledge integration (SEMoLa, Simple, Easy Modelling Language) has been developed at the University of Udine. SEMoLa can handle models, data, metadata, and textual knowledge; it implements and extends the system dynamics ontology (Forrester, 1968; Jørgensen, 1994), in which systems are modelled by the concepts of material, group, state, rate, parameter, internal and external events, and driving variables. As an example, a SEMoLa model to improve the management and sustainability (economic, energetic, environmental) of agricultural farms is presented. The model (X-Farm) simulates a farm in which cereal and forage yield, oil seeds, milk, calves, and wastes can be sold or reused. X-Farm is composed of integrated modules describing fields (crop and soil), feed and material storage, machinery management, manpower management, animal husbandry, economic and energetic balances, seed oil extraction, manure and waste management, and biogas production from animal wastes and biomasses.

  14. Knowledge Based Text Generation

    DTIC Science & Technology

    1989-08-01

    knowledge base as well as communicate the reasoning behind a particular diagnosis. This is discussed more thoroughly in subsequent sections. On the other... explanation. Weiner proposed that a statement can be justified by offering reasons, supporting examples, and implausible alternatives, except for the statement... These justification techniques are realized in his system by four predicates: statement, reason, example and alternative. Connectives such as and/or

  15. The Knowledge Level

    DTIC Science & Technology

    1981-07-01

    precisely to realizing mental functions in physical systems. In the hands of Daniel Dennett (1978), a philosopher who has concerned himself rather... illustrated repeatedly by Dennett with the gross flow diagrams of AI programs. The intentional stance corresponds to the knowledge level. In... particular, Dennett takes the important step of jettisoning the major result-cum-assumption of the original doctrine, to wit, that the intentional is

  16. Threads of common knowledge.

    PubMed

    Icamina, P

    1993-04-01

    Indigenous knowledge is examined as it is affected by development and scientific exploration. The indigenous culture of shamanism, which originated in northern and southeast Asia, is a "political and religious technique for managing societies through rituals, myths, and world views." There is respect for the natural environment and community life as a social common good. This world view is still practiced by many in Latin America and in Colombia specifically. Colombian shamanism has an environmental accounting system, but the Brazilian government has established its own system of land tenure and political representation which does not adequately represent shamanism. In 1992 a conference was held in the Philippines by the International Institute for Rural Reconstruction and IDRC on sustainable development and indigenous knowledge. The link between the two is necessary. Unfortunately, there are already examples in the Philippines of loss of traditional crop diversity after the introduction of modern farming techniques and new crop varieties. An attempt was made to collect species, but without proper identification. Opposition was expressed to the preservation of wilderness preserves; the desire was to allow indigenous people to maintain their homeland and use their time-tested sustainable resource management strategies. Property rights were also discussed during the conference. Of particular concern was the protection of knowledge rights about biological diversity or pharmaceutical properties of indigenous plant species. The original owners and keepers of the knowledge must retain access and control. The research gaps were identified and found to be expansive. Reference was made to a study of Mexican Indian children who knew 138 plant species while non-Indian children knew only 37. Sometimes there is conflict of interest where foresters prefer timber forests and farmers desire fuelwood supplies and fodder and grazing land, which is provided by shrubland. 

  17. Aeronautical Knowledge (Selected Articles),

    DTIC Science & Technology

    1981-01-14

    FTD-ID(RS)T-1234-80, Foreign Technology Division: AERONAUTICAL KNOWLEDGE (Selected Articles). ...of the spacecraft cabin, went through the structure of the eyes of the astronauts, and caused them to see flashing. The frequency of the flashing was... to tell space travelers of the existence of belts of high radiation and alert them to the danger. Present and future missions must clarify the

  18. [The diffusion of knowledge].

    PubMed

    Ramiro-H, Manuel; Cruz-A, Enrique

    2016-01-01

    Between August 19 and 21, the Feria del Libro de las Ciencias de la Salud (Healthcare Book Fair) took place in the Palacio de Medicina in Mexico City. Archives of Medical Research, Revista Médica del IMSS, and Saber IMSS, three of the main instruments of knowledge diffusion of the Instituto Mexicano del Seguro Social, attended this book fair, which was organized by the Facultad de Medicina of UNAM.

  19. Knowledge Translation in Audiology

    PubMed Central

    Kothari, Anita; Bagatto, Marlene P.; Seewald, Richard; Miller, Linda T.; Scollie, Susan D.

    2011-01-01

    The impetus for evidence-based practice (EBP) has grown out of widespread concern with the quality, effectiveness (including cost-effectiveness), and efficiency of medical care received by the public. Although initially focused on medicine, EBP principles have been adopted by many of the health care professions and are often represented in practice through the development and use of clinical practice guidelines (CPGs). Audiology has been working on incorporating EBP principles into its mandate for professional practice since the mid-1990s. Despite widespread efforts to implement EBP and guidelines into audiology practice, gaps still exist between the best evidence based on research and what is being done in clinical practice. A collaborative dynamic and iterative integrated knowledge translation (KT) framework rather than a researcher-driven hierarchical approach to EBP and the development of CPGs has been shown to reduce the knowledge-to-clinical action gaps. This article provides a brief overview of EBP and CPGs, including a discussion of the barriers to implementing CPGs into clinical practice. It then offers a discussion of how an integrated KT process combined with a community of practice (CoP) might facilitate the development and dissemination of evidence for clinical audiology practice. Finally, a project that uses the knowledge-to-action (KTA) framework for the development of outcome measures in pediatric audiology is introduced. PMID:22194314

  20. Knowledge of contraceptive effectiveness.

    PubMed

    Eisenberg, David L; Secura, Gina M; Madden, Tessa E; Allsworth, Jenifer E; Zhao, Qiuhong; Peipert, Jeffrey F

    2012-06-01

    The purpose of this study was to determine women's knowledge of contraceptive effectiveness. We performed a cross-sectional analysis of a contraceptive knowledge questionnaire that had been completed by 4144 women who were enrolled in the Contraceptive CHOICE Project before they received comprehensive contraceptive counseling and chose their method. For each contraceptive method, women were asked "what percentage would get pregnant in a year: <1%, 1-5%, 6-10%, >10%, don't know." Overall, 86% of subjects knew that the annual risk of pregnancy is >10% if no contraception is used. More than 45% of women overestimate the effectiveness of depo-medroxyprogesterone acetate, pills, the patch, the ring, and condoms. After adjustment for age, education, and contraceptive history, the data showed that women who chose the intrauterine device (adjusted relative risk, 6.9; 95% confidence interval, 5.6-8.5) or implant (adjusted relative risk, 5.9; 95% confidence interval, 4.7-7.3) were significantly more likely to identify the effectiveness of their method accurately compared with women who chose either the pill, patch, or ring. This cohort demonstrated significant knowledge gaps regarding contraceptive effectiveness and over-estimated the effectiveness of pills, the patch, the ring, depo-medroxyprogesterone acetate, and condoms. Copyright © 2012 Mosby, Inc. All rights reserved.

  1. Creating illusions of knowledge: learning errors that contradict prior knowledge.

    PubMed

    Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J

    2013-02-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors.

  2. Western Hemisphere Knowledge Partnerships

    NASA Astrophysics Data System (ADS)

    Malone, T. F.

    2001-05-01

    Society in general, and geophysicists in particular, are challenged by problems and opportunities in the prospects for an additional three billion people on finite planet Earth by 2050 in a global economy four to six times larger than it is at present. A problem was identified by the Pilot Assessment of Global Ecosystems (PAGE): "If we choose to continue our current patterns of use, we face almost certain decline in the ability of ecosystems to yield their broad spectrum of benefits - from clean water to stable climate, fuel wood to food crops, timber to wildlife habitat." This is the issue of environmental sustainability. Another problem is the widening gap in wealth and health between affluent nations and impoverished countries. Every day each of the more than a billion people in the industrial nations produces goods and services worth nearly 60 dollars to meet their basic needs and "wants." This figure increases by about 85 cents annually. Every day each of the 600 million people in the least developed countries produces goods and services worth about 75 cents to meet their basic needs and limited wants. That number grows by less than a penny a day annually. This is the issue of economic prosperity and equity. By harnessing revolutionary technologies in communications to distribute expanding knowledge in the physical, chemical, and geophysical sciences and exploding knowledge in the biological and health sciences, a new vision for world society is brought within reach in The Knowledge Age. It is a society in which all of the basic human needs and an equitable share of human wants can be met while maintaining healthy, attractive, and biologically productive ecosystems. This society is environmentally sustainable, economically prosperous and equitable, and therefore likely to be politically stable. The time has arrived to fashion a strategy to pursue that vision.
A knowledge-based and human-centered strategy will involve the discovery, integration, dissemination

  3. From knowledge presentation to knowledge representation to knowledge construction: Future directions for hypermedia

    NASA Technical Reports Server (NTRS)

    Palumbo, David B.

    1990-01-01

    Relationships between human memory systems and hypermedia systems are discussed, with particular emphasis on the underlying importance of associational memory. The distinctions between knowledge presentation, knowledge representation, and knowledge construction are addressed. Issues involved in actually developing individualizable hypermedia-based knowledge construction tools are presented.

  4. Distinguishing Knowledge-Sharing, Knowledge-Construction, and Knowledge-Creation Discourses

    ERIC Educational Resources Information Center

    van Aalst, Jan

    2009-01-01

    The study reported here sought to obtain the clear articulation of asynchronous computer-mediated discourse needed for Carl Bereiter and Marlene Scardamalia's knowledge-creation model. Distinctions were set up between three modes of discourse: knowledge sharing, knowledge construction, and knowledge creation. These were applied to the asynchronous…

  5. Procedural and Conceptual Knowledge: Exploring the Gap between Knowledge Type and Knowledge Quality

    ERIC Educational Resources Information Center

    Star, Jon R.; Stylianides, Gabriel J.

    2013-01-01

    Following Star (2005, 2007), we continue to problematize the entangling of type and quality in the use of conceptual knowledge and procedural knowledge. Although those whose work is guided by types of knowledge and those whose work is guided by qualities of knowledge seem to be referring to the same phenomena, actually they are not. This lack of…

  7. Knowledge repositories for multiple uses

    NASA Technical Reports Server (NTRS)

    Williamson, Keith; Riddle, Patricia

    1991-01-01

    In the life cycle of a complex physical device or part, for example, the docking bay door of the Space Station, there are many uses for knowledge about the device or part. The same piece of knowledge might serve several uses. Given the quantity and complexity of the knowledge that must be stored, it is critical to maintain the knowledge in one repository, in one form. At the same time, because of the quantity and complexity of knowledge that must be used in life cycle applications such as cost estimation, re-design, and diagnosis, it is critical to automate such knowledge uses. For each specific use, a knowledge base must be available and must be in a form that promotes the efficient performance of that knowledge base. However, without a single-source knowledge repository, the cost of maintaining consistent knowledge between multiple knowledge bases increases dramatically; as facts and descriptions change, they must be updated in each individual knowledge base. A use-neutral representation of a hydraulic system for the F-111 aircraft was developed. The ability to derive portions of four different knowledge bases from this use-neutral representation is demonstrated: one knowledge base is for re-design of the device using a model-based reasoning problem solver; two knowledge bases, at different levels of abstraction, are for diagnosis using a model-based reasoning solver; and one knowledge base is for diagnosis using an associational reasoning problem solver. It was shown how updates issued against the single-source use-neutral knowledge repository can be propagated to the underlying knowledge bases.
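    The single-source idea described above can be sketched in a few lines: one use-neutral store, with use-specific knowledge bases derived as views so that an update made once propagates everywhere. The Python toy below uses invented component names and fields, not the actual F-111 representation.

```python
# Toy sketch of a use-neutral repository from which use-specific
# knowledge bases are derived as views. Component names and fields
# are invented for illustration.

REPOSITORY = {
    "pump": {"type": "hydraulic", "feeds": ["actuator"], "max_pressure_psi": 3000},
    "actuator": {"type": "hydraulic", "feeds": [], "max_pressure_psi": 3000},
}

def diagnosis_view(repo):
    # Diagnosis needs connectivity: which component can affect which.
    return {name: facts["feeds"] for name, facts in repo.items()}

def redesign_view(repo):
    # Re-design needs constraints: performance limits per component.
    return {name: {"max_pressure_psi": facts["max_pressure_psi"]}
            for name, facts in repo.items()}

# An update issued once against the repository is visible in every
# derived view, so the views never drift out of sync.
REPOSITORY["pump"]["max_pressure_psi"] = 3200
assert redesign_view(REPOSITORY)["pump"]["max_pressure_psi"] == 3200
```

    In a real system the views would be materialized knowledge bases regenerated (or incrementally updated) from the repository, but the consistency argument is the same.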

  8. Modeling Expert Control Knowledge.

    DTIC Science & Technology

    1987-12-10

    body of research results characterizes expert domain knowledge and problem-solving mechanisms for a variety of problem domains. By contrast, little is... identification of regions of secondary structure, since these regions are used as solid-level components of the structure to be determined. ABC is an... similar to that of human experts in the field, who themselves do not always agree on the interpretation of data. This was illustrated vividly by three

  9. Capturing design knowledge

    NASA Technical Reports Server (NTRS)

    Babin, Brian A.; Loganantharaj, Rasiah

    1990-01-01

    A scheme is proposed to capture the design knowledge of a complex object including functional, structural, performance, and other constraints. Further, the proposed scheme is also capable of capturing the rationale behind the design of an object as a part of the overall design of the object. With this information, the design of an object can be treated as a case and stored with other designs in a case base. A person can then perform case-based reasoning by examining these designs. Methods of modifying object designs are also discussed. Finally, an overview of an approach to fault diagnosis using case-based reasoning is given.

  10. The Use of a priori Information in ICA-Based Techniques for Real-Time fMRI: An Evaluation of Static/Dynamic and Spatial/Temporal Characteristics

    PubMed Central

    Soldati, Nicola; Calhoun, Vince D.; Bruzzone, Lorenzo; Jovicich, Jorge

    2013-01-01

    Real-time brain functional MRI (rt-fMRI) allows in vivo non-invasive monitoring of neural networks. The use of multivariate data-driven analysis methods such as independent component analysis (ICA) offers an attractive trade-off between data interpretability and information extraction, and can be used during both task-based and rest experiments. The purpose of this study was to assess the effectiveness of different ICA-based procedures to monitor, in real time, a target IC defined from a functional localizer which also used ICA. Four novel methods were implemented to monitor ongoing brain activity in a sliding window approach. The methods differed in the ways in which a priori information, derived from ICA algorithms, was used to monitor a target independent component (IC). We implemented four different algorithms, all based on ICA. One Back-projection method used ICA to derive static spatial information from the functional localizer, off-line, which was then back-projected dynamically during the real-time acquisition. The other three methods used real-time ICA algorithms that dynamically exploited temporal, spatial, or spatial-temporal priors during the real-time acquisition. The methods were evaluated by simulating an rt-fMRI experiment that used real fMRI data. The performance of each method was characterized by the spatial and/or temporal correlation with the target IC component monitored, computation time, and intrinsic stochastic variability of the algorithms. In this study the Back-projection method, which could monitor more than one IC of interest, outperformed the other methods. These results are consistent with a functional task that gives stable target ICs over time. The dynamic adaptation possibilities offered by the other ICA methods proposed may offer better performance than the Back-projection in conditions where the functional activation shows higher spatial and/or temporal variability. PMID:23483841
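    As a rough illustration of the Back-projection scheme (static spatial maps estimated off-line, then fitted to each new sliding-window of data in real time), the numpy sketch below uses random matrices standing in for an ICA decomposition; dimensions and variable names are invented, not those of the study.

```python
# Minimal sketch of back-projecting fixed ICA spatial maps onto a new
# data window to recover the time course of a target IC. Random data
# stands in for a localizer-derived decomposition; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_ics, win_len = 500, 5, 20

S = rng.standard_normal((n_ics, n_voxels))        # static spatial maps (ICs x voxels), from the off-line localizer
true_tc = rng.standard_normal((win_len, n_ics))   # ground-truth time courses in this window
Y = true_tc @ S                                   # simulated window of new data (time x voxels)

# Back-projection: least-squares fit of the fixed maps to the new data.
est_tc = Y @ np.linalg.pinv(S)                    # estimated time courses (time x ICs)

target = 0                                        # index of the monitored target IC
r = np.corrcoef(est_tc[:, target], true_tc[:, target])[0, 1]
```

    Because the spatial maps are held fixed, each window costs only one matrix product, which is what makes the approach attractive for real-time use; the trade-off, as the abstract notes, is that fixed maps assume the target IC is spatially stable over the run.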

  11. Knowledge Management: Usefulness of Knowledge to Organizational Managers

    ERIC Educational Resources Information Center

    Klein, Roy L.

    2010-01-01

    The purpose of this study was to determine the level of knowledge-usefulness to organizational managers. The determination of the level of usefulness provided organizational managers with a reliable measure of their decision-making. Organizational workers' perceptions of knowledge accessibility, quality of knowledge content, timeliness, and user…

  12. Knowledge Growth: Applied Models of General and Individual Knowledge Evolution

    ERIC Educational Resources Information Center

    Silkina, Galina Iu.; Bakanova, Svetlana A.

    2016-01-01

    The article considers the mathematical models of the growth and accumulation of scientific and applied knowledge since it is seen as the main potential and key competence of modern companies. The problem is examined on two levels--the growth and evolution of objective knowledge and knowledge evolution of a particular individual. Both processes are…

  13. Knowledge Management in Higher Education: A Knowledge Repository Approach

    ERIC Educational Resources Information Center

    Wedman, John; Wang, Feng-Kwei

    2005-01-01

    One might expect higher education, where the discovery and dissemination of new and useful knowledge is vital, to be among the first to implement knowledge management practices. Surprisingly, higher education has been slow to implement knowledge management practices (Townley, 2003). This article describes an ongoing research and development effort…

  14. Knowledge Building: Reinventing Education for the Knowledge Age

    ERIC Educational Resources Information Center

    Philip, Donald N.

    2011-01-01

    This paper examines the Knowledge Age and how economic factors are causing educators to rethink and reinvent education. Two key factors in education in the Knowledge Age will be education for an economy of innovation, and the increasing virtualization of education. We present knowledge building pedagogy as a model for education in the Knowledge…

  16. Knowledge Sharing and Global Collaboration on Online Knowledge Exchange Platforms

    ERIC Educational Resources Information Center

    Yu, Yuecheng

    2012-01-01

    This thesis reports on three empirical studies that focus on questions concerning knowledge sharing and construction in communities of practice and global knowledge exchange platforms. The first essay presents an exploratory case study on a particular academic community of practice--AISNET and its central knowledge exchange platform, the ISWorld…

  17. Linking Knowledge Production and Needs of Knowledge Users. II.

    ERIC Educational Resources Information Center

    Wolf, W. C., Jr.

    Arguing that many new ideas, techniques, and products fail to be adopted because they are not properly linked to the needs of knowledge users, this chapter presents an approach to the problem of linking the knowledge user with the knowledge producer that is designed to provide linkage agents with a frame of reference and tools for disciplined…

  18. Pedagogical Content Knowledge and Content Knowledge of Secondary Mathematics Teachers

    ERIC Educational Resources Information Center

    Krauss, Stefan; Brunner, Martin; Kunter, Mareike; Baumert, Jurgen; Neubrand, Michael; Blum, Werner; Jordan, Alexander

    2008-01-01

    Drawing on the work of L. S. Shulman (1986), the authors present a conceptualization of the pedagogical content knowledge and content knowledge of secondary-level mathematics teachers. They describe the theory-based construction of tests to assess these knowledge categories and the implementation of these tests in a sample of German mathematics…

  19. Nursing, knowledge and practice.

    PubMed

    Allen, D

    1997-07-01

    Recent commentators have suggested that academic knowledge is irrelevant to nursing practice and may actually undermine nursing's traditional caring ethos. Furthermore, by making nursing more academic, it is claimed that 'natural' but non-academic carers are prevented from pursuing a career in nursing. Debates about the relationship between nursing, knowledge and practice have a long history and have to be understood in terms of wider political and economic issues relating to nursing, its status within society and the changing role of nurses within the health services division of labour. One crucial issue is nursing's status as women's work. Critics of developments in nurse education draw an ideological equation between nursing work and the traditional female role. From this perspective the qualities that make a good nurse cannot be taught, rather they are founded on 'natural' feminine skills. Irrespective of whether caring is 'natural' or not, it is questionable as to whether, for today's nurses, being caring is sufficient. The shape of nursing jurisdiction is a long way removed from its origins in the Victorian middle-class household. In addition to their traditional caring role, contemporary nurses may also have complex clinical, management and research responsibilities, as well as being crucial coordinators of service provision. It is suggested that these and future developments in health services make the need for an educated nursing workforce even more pressing. In order to adequately prepare nurses for practice, however, it is vital that nurse education reflects the reality of service provision.

  20. Knowledge: Genuine and Bogus

    NASA Astrophysics Data System (ADS)

    Bunge, Mario

    2011-05-01

    Pseudoscience is error, substantive or methodological, parading as science. Obvious examples are parapsychology, "intelligent design," and homeopathy. Psychoanalysis and pop evolutionary psychology are less obvious, yet no less flawed in both method and doctrine. The fact that science can be faked to the point of deceiving science lovers suggests the need for a rigorous sifting device, one capable of revealing the worm in the apple. This device is needed to evaluate research proposals as well as new fashions. Such a device can be designed only with the help of a correct definition of science, one attending not only to methodological aspects, such as testability and predictive power, but also to other features of scientific knowledge, such as intelligibility, corrigibility, and compatibility with the bulk of antecedent knowledge. The aim of this paper is to suggest such a criterion, to illustrate it with a handful of topical examples, and to emphasize the role of philosophy in either promoting or blocking scientific progress. This article is a revised version of a chapter in the author's forthcoming book Matter and Mind (Springer). [The Appendix on inductive logic was written at the request of the editors in order to elaborate claims made in #10 (4).]

  1. Business knowledge in surgeons.

    PubMed

    Satiani, Bhagwan

    2004-07-01

    Surgeons and residents in training receive little, if any, formal education in the economic side of clinical practice during medical school or residency. As medical professionals face shrinking reimbursement, loss of control over health care decisions, and limited resources, surgical specialties must reevaluate the need to teach their members business survival skills. Before designing business-related teaching modules, educators must know the exact gaps in knowledge that exist among surgeons. This article reports a survey of 133 surgeons in the Midwest who were asked to rate their knowledge base in 11 business topics relevant to the practice of medicine. The survey showed that the average surgeon perceives himself or herself to be poorly equipped to understand basic financial accounting principles, financial markets, economics of health care, tools for evaluating purchases, marketing, budgets, antitrust and fraud and abuse regulations, and risk and return on investments. Armed with these data, teaching faculty, health care systems, and medical specialty societies should design business education seminars to better position surgical specialists and trainees to communicate with insurers, hospital administrators, health care organizations, and their own personal financial advisors.

  2. Vision without knowledge.

    PubMed Central

    Milner, A D

    1997-01-01

    A brain-damaged patient (D.F.) with visual form agnosia is described and discussed. D.F. has a profound inability to recognize objects, places and people, in large part because of her inability to make perceptual discriminations of size, shape or orientation, despite having good visual acuity. Yet she is able to perform skilled actions that depend on that very same size, shape and orientation information that is missing from her perceptual awareness. It is suggested that her intact vision can best be understood within the framework of a dual processing model, according to which there are two cortical processing streams operating on different coding principles, for perception and for action, respectively. These may be expected to have different degrees of dependence on top-down information. One possibility is that D.F.'s lack of explicit awareness of the visual cues that guide her behaviour may result from her having to rely on a processing system which is not knowledge-based in a broad sense. Conversely, it may be that the perceptual system can provide conscious awareness of its products in normal individuals by virtue of the fact that it does interact with a stored base of visual knowledge. PMID:9304691

  3. Knowledge, Information and Literacy

    NASA Astrophysics Data System (ADS)

    Roberts, Peter

    2000-09-01

    This paper problematises the notion of the "knowledge society" found in two recent initiatives: the OECD's International Adult Literacy Survey, and the New Zealand Foresight Project. The author supports a broadening of the concept of literacy, as suggested by the OECD reports, but points to some of the limits of "information" as the focus for such a re-definition. The principle of theorising social and economic futures is also endorsed, but the form this takes in the Foresight Project is seen as unnecessarily restrictive. To date, the Foresight Project can be seen as a synthesis of elements of market liberalism and scientific rationalism. Both projects ignore crucial political and ethical questions in their accounts of the "knowledge society" and the process of globalisation, and both are wedded to a technocratic mode of policy development and planning. The author calls for further critical work on changing patterns of literate activity in the information age, and stresses the importance of contemplating futures other than those driven by the imperatives of global capitalism.

  4. 'Ethos' Enabling Organisational Knowledge Creation

    NASA Astrophysics Data System (ADS)

    Matsudaira, Yoshito

    This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company, and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds first-, second-, and third-person viewpoints to the theory of knowledge creation. The embodied knowledge observed in the actions of organisational members who enable knowledge creation is the continued practice of 'ethos' (in Greek), founded in the Nissan Production Way as an ethical basis. Ethos is an (intangible) knowledge asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, the team and the organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research shows the indispensability of ethos - the new concept of knowledge assets, which enables knowledge creation - for future knowledge-based management in the knowledge society.

  5. Strategies for Mentoring Pedagogical Knowledge

    ERIC Educational Resources Information Center

    Hudson, Peter

    2013-01-01

    Fundamental for mentoring a preservice teacher is the mentor's articulation of pedagogical knowledge, which in this research draws upon specific practices, viz.: planning, timetabling lessons, preparation, teaching strategies, content knowledge, problem solving, questioning, classroom management, implementation, assessment and viewpoints for…

  7. The Medawar Lecture 2001 Knowledge for vision: vision for knowledge

    PubMed Central

    Gregory, Richard L

    2005-01-01

    An evolutionary development of perception is suggested—from passive reception to active perception to explicit conception—earlier stages being largely retained and incorporated in later species. A key is innate and then individually learned knowledge, giving meaning to sensory signals. Inappropriate or misapplied knowledge produces rich cognitive phenomena of illusions, revealing normally hidden processes of vision, tentatively classified here in a ‘peeriodic table’. Phenomena of physiology are distinguished from phenomena of general rules and specific object knowledge. It is concluded that vision uses implicit knowledge, and provides knowledge for intelligent behaviour and for explicit conceptual understanding including science. PMID:16147519

  8. Characterization of GM events by insert knowledge adapted re-sequencing approaches

    PubMed Central

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-01-01

    Detection methods and data from molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences employing the developed approaches. This provides an opportunity to identify and characterize also unknown GM events. PMID:24088728

  9. Research on Knowledge-Based Optimization Method of Indoor Location Based on Low Energy Bluetooth

    NASA Astrophysics Data System (ADS)

    Li, C.; Li, G.; Deng, Y.; Wang, T.; Kang, Z.

    2017-09-01

    With the rapid development of LBS (Location-Based Services), demand for commercial indoor location has been increasing, but the technology is not yet mature. Currently, the accuracy of indoor location, the complexity of the algorithms, and the cost of positioning are hard to balance simultaneously, and this still restricts the adoption and application of mainstream positioning technology. Therefore, this paper proposes a knowledge-based optimization method for indoor location based on low-energy Bluetooth. The main steps include: 1) establishment and application of a priori and a posteriori knowledge bases; 2) primary selection of signal sources; 3) elimination of positioning gross errors; 4) accumulation of positioning knowledge. The experimental results show that the proposed algorithm can eliminate outlier signal sources and improve the accuracy of single-point positioning on the simulation data. The proposed scheme is a process of dynamic knowledge accumulation rather than a single positioning step. The scheme uses inexpensive equipment and offers a new approach to the theory and method of indoor positioning. Moreover, the high-accuracy positioning results on the simulation data suggest that the scheme has practical value for commercial deployment.
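
The gross-error elimination step (step 3 above) can be illustrated with a small sketch. This is not the paper's algorithm; it is a minimal illustration assuming a log-distance path-loss model with hypothetical parameters `TX_POWER` and `PATH_LOSS_N`, median-based outlier rejection in place of the paper's knowledge base, and an inverse-distance-weighted centroid for the position estimate:

```python
# Assumed path-loss parameters (hypothetical, not from the paper)
TX_POWER = -59.0   # RSSI in dBm at 1 m from the beacon
PATH_LOSS_N = 2.0  # path-loss exponent

def rssi_to_distance(rssi):
    """Log-distance model: d = 10 ** ((TX_POWER - rssi) / (10 * n))."""
    return 10 ** ((TX_POWER - rssi) / (10 * PATH_LOSS_N))

def reject_gross_errors(beacons, k=2.0):
    """Drop beacons whose implied distance deviates from the median by more
    than k times the median absolute deviation (gross-error elimination)."""
    dists = [rssi_to_distance(r) for (_x, _y, r) in beacons]
    med = sorted(dists)[len(dists) // 2]
    mad = sorted(abs(d - med) for d in dists)[len(dists) // 2]
    return [(b, d) for b, d in zip(beacons, dists)
            if mad == 0 or abs(d - med) <= k * mad]

def weighted_centroid(beacons):
    """Estimate position as an inverse-distance-weighted centroid of the
    beacons that survive gross-error elimination."""
    kept = reject_gross_errors(beacons)
    wsum = sum(1.0 / d for _b, d in kept)
    x = sum(b[0] / d for b, d in kept) / wsum
    y = sum(b[1] / d for b, d in kept) / wsum
    return x, y
```

Each beacon is given as `(x, y, rssi_dBm)`; a beacon whose signal implies an implausible distance relative to the others is discarded before the position estimate is formed.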

  10. Characterization of GM events by insert knowledge adapted re-sequencing approaches.

    PubMed

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-10-03

    Detection methods and data from molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences employing the developed approaches. This provides an opportunity to identify and characterize also unknown GM events.

  11. Ways with Community Knowledge. PEN.

    ERIC Educational Resources Information Center

    Nelson, Greg

    Families and communities have enormous resources of knowledge around them that they use in their daily lives. Teachers may overlook or devalue these "funds of knowledge." Students, however, can benefit when teachers draw on community knowledge. Educationists advocate for the integration of subjects so that school curriculum is more…

  12. Nutrition Knowledge among Navy Recruits

    DTIC Science & Technology

    1987-03-27

    Nutrition Knowledge among Navy Recruits. Terry L. Conway, Linda K. Hervig, and Ross R. Vickers, Jr., Health Psychology Department. Report sections include: Participants; Nutrition Knowledge Questionnaire; Recruits' Nutrition Knowledge; Correlates of Nutrition Knowledge.

  13. Knowledge Navigation for Virtual Vehicles

    NASA Technical Reports Server (NTRS)

    Gomez, Julian E.

    2004-01-01

    A virtual vehicle is a digital model of the knowledge surrounding a potentially real vehicle. Knowledge consists not only of the tangible information, such as CAD, but also what is known about the knowledge - its metadata. This paper is an overview of technologies relevant to building a virtual vehicle, and an assessment of how to bring those technologies together.

  14. Knowledge Acquisition in Observational Astronomy.

    ERIC Educational Resources Information Center

    Vosniadou, Stella

    This paper presents findings from research on knowledge acquisition in observational astronomy to demonstrate the kinds of intuitive models children form and to show how these models influence the acquisition of science knowledge. Sixty children of approximate ages 6, 9, and 12 were given a questionnaire to investigate their knowledge of the size,…

  15. Teacher Knowledge: A Complex Tapestry

    ERIC Educational Resources Information Center

    Adoniou, Misty

    2015-01-01

    Teachers need to know a great deal, in many areas and in multiple ways. Teacher knowledge is a complex tapestry, and teachers must successfully weave the multiple threads. In this article, I present a conceptualisation of teacher knowledge that provides a framework for describing the complexity of teacher knowledge. The framework describes three…

  16. Political Knowledge and American Democracy

    ERIC Educational Resources Information Center

    Melanson, Philip H.

    1974-01-01

    This article examines what makes knowledge political in its derivation and uses, and what potential political functions it may serve. Emphasis is placed upon the idea that the consequences of political knowledge for a democracy are neither uniformly beneficent nor malevolent; they depend upon how the knowledge is being used and by whom. (DE)

  17. [Knowledge management and healthcare organizations].

    PubMed

    Favaretti, Carlo

    2013-10-01

    The present scenario is characterized by high "environmental turbulence". Healthcare professionals and organizations must increase their knowledge, skills and attitudes in order to choose wisely. Healthcare organizations are complex adaptive systems which should use integrated governance systems: knowledge management should be a strategic goal. These organizations should become learning organizations: they should build and renew their knowledge in a systematic, explicit and deliberate way.

  19. Reducing the Knowledge Tracing Space

    ERIC Educational Resources Information Center

    Ritter, Steven; Harris, Thomas K.; Nixon, Tristan; Dickison, Daniel; Murray, R. Charles; Towle, Brendon

    2009-01-01

    In Cognitive Tutors, student skill is represented by estimates of student knowledge on various knowledge components. The estimate for each knowledge component is based on a four-parameter model developed by Corbett and Anderson [Nb]. In this paper, we investigate the nature of the parameter space defined by these four parameters by modeling data…
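
The four-parameter model cited here is Corbett and Anderson's Bayesian Knowledge Tracing, whose parameters are initial knowledge P(L0), learning rate P(T), guess P(G), and slip P(S). A minimal sketch of a single update step, with illustrative parameter values not taken from the paper:

```python
def bkt_update(p_know, correct, p_guess=0.2, p_slip=0.1, p_learn=0.15):
    """One Bayesian Knowledge Tracing step: condition the mastery estimate
    on the observed response, then apply the learning transition P(T)."""
    if correct:
        num = p_know * (1 - p_slip)
        den = num + (1 - p_know) * p_guess
    else:
        num = p_know * p_slip
        den = num + (1 - p_know) * (1 - p_guess)
    posterior = num / den
    return posterior + (1 - posterior) * p_learn

# Mastery estimate after three correct responses, starting from P(L0) = 0.3
p = 0.3
for _ in range(3):
    p = bkt_update(p, correct=True)
```

The parameter space the paper investigates is exactly the four values `p_know` at time zero, `p_learn`, `p_guess`, and `p_slip`.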

  20. Knowledge Construction in Critical Ethnography.

    ERIC Educational Resources Information Center

    Adkins, Amee; Gunzenhauser, Michael G.

    1999-01-01

    Explores the grounding of cultural critique in ethnography as a process of knowledge construction, using concepts from philosophy, anthropology, and sociology of knowledge to identify a theory of knowledge that may inform a postcritical ethnography. The paper proposes a critical ethnography that is more authentic both to its wider social project…

  1. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  2. Challenges in Measuring Teachers' Knowledge

    ERIC Educational Resources Information Center

    Fauskanger, Janne

    2015-01-01

    Mathematical knowledge for teaching (MKT) measures have been widely adopted by researchers. Critics have debated the value of such measures and questioned the type of knowledge that these access. This article reports on a study where the challenges in measuring teachers' knowledge were illuminated through investigating relationships between the…

  3. Knowledge Creation in Nursing Education

    PubMed Central

    Hassanian, Zahra Marzieh; Ahanchian, Mohammad Reza; Ahmadi, Suleiman; Gholizadeh, Rezvan Hossein; Karimi-Moonaghi, Hossein

    2015-01-01

    In today’s society, knowledge is recognized as a valuable social asset and the educational system is in search of a new strategy that allows them to construct their knowledge and experience. The purpose of this study was to explore the process of knowledge creation in nursing education. In the present study, the grounded theory approach was used. This method provides a comprehensive approach to collecting, organizing, and analyzing data. Data were obtained through 17 semi-structured interviews with nursing faculties and nursing students. Purposeful and theoretical sampling was conducted. Based on the method of Strauss and Corbin, the data were analyzed using fragmented, deep, and constant-comparative methods. The main categories included striving for growth and reduction of ambiguity, use of knowledge resources, dynamism of mind and social factors, converting knowledge, and creating knowledge. Knowledge was converted through mind processes, individual and group reflection, praxis and research, and resulted in the creation of nursing knowledge. Discrete nursing knowledge is gained through disconformity research in order to gain more individual advantages. The consequence of this analysis was gaining new knowledge. Knowledge management must be included in the mission and strategic planning of nursing education, and it should be planned through operational planning in order to create applicable knowledge. PMID:25716383

  4. Combining factual and heuristic knowledge in knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Hull, Richard; Karr, Clark; Hosken, Bruce; Verhagen, William

    1992-01-01

    A knowledge acquisition technique that combines heuristic and factual knowledge represented as two hierarchies is described. These ideas were applied to the construction of a knowledge acquisition interface to the Expert System Analyst (OPERA). The goal of OPERA is to improve the operations support of the computer network in the space shuttle launch processing system. The knowledge acquisition bottleneck lies in gathering knowledge from human experts and transferring it to OPERA. OPERA's knowledge acquisition problem is approached as a classification problem-solving task, combining this approach with the use of factual knowledge about the domain. The interface was implemented in a Symbolics workstation making heavy use of windows, pull-down menus, and other user-friendly devices.

  5. Expanding Yeast Knowledge Online

    PubMed Central

    DOLINSKI, KARA; BALL, CATHERINE A.; CHERVITZ, STEPHEN A.; DWIGHT, SELINA S.; HARRIS, MIDORI A.; ROBERTS, SHANNON; ROE, TAIYUN; CHERRY, J. MICHAEL; BOTSTEIN, DAVID

    2011-01-01

    The completion of the Saccharomyces cerevisiae genome sequencing project and the continued development of improved technology for large-scale genome analysis have led to tremendous growth in the amount of new yeast genetics and molecular biology data. Efficient organization, presentation, and dissemination of this information are essential if researchers are to exploit this knowledge. In addition, the development of tools that provide efficient analysis of this information and link it with pertinent information from other systems is becoming increasingly important at a time when the complete genome sequences of other organisms are becoming available. The aim of this review is to familiarize biologists with the type of data resources currently available on the World Wide Web (WWW). PMID:9885151

  6. Robust automated knowledge capture.

    SciTech Connect

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  7. Transdisciplinary approaches enhance the production of translational knowledge.

    PubMed

    Ciesielski, Timothy H; Aldrich, Melinda C; Marsit, Carmen J; Hiatt, Robert A; Williams, Scott M

    2017-04-01

    The primary goal of translational research is to generate and apply knowledge that can improve human health. Although research conducted within the confines of a single discipline has helped us to achieve this goal in many settings, this unidisciplinary approach may not be optimal when disease causation is complex and health decisions are pressing. To address these issues, we suggest that transdisciplinary approaches can facilitate the progress of translational research, and we review publications that demonstrate what these approaches can look like. These examples serve to (1) demonstrate why transdisciplinary research is useful, and (2) stimulate a conversation about how it can be further promoted. While we note that open-minded communication is a prerequisite for germinating any transdisciplinary work and that epidemiologists can play a key role in promoting it, we do not propose a rigid protocol for conducting transdisciplinary research, as one really does not exist. These achievements were developed in settings where typical disciplinary and institutional barriers were surmountable, but they were not accomplished with a single predetermined plan. The benefits of cross-disciplinary communication are hard to predict a priori and a detailed research protocol or process may impede the realization of novel and important insights. Overall, these examples demonstrate that enhanced cross-disciplinary information exchange can serve as a starting point that helps researchers frame better questions, integrate more relevant evidence, and advance translational knowledge more effectively. Specifically, we discuss examples where transdisciplinary approaches are helping us to better explore, assess, and intervene to improve human health.

  8. A knowledge-based imaging informatics approach to managing patients treated with proton beam therapy

    NASA Astrophysics Data System (ADS)

    Liu, B. J.; Huang, H. K.; Law, M.; Le, Anh; Documet, Jorge; Gertych, Arek

    2007-03-01

    Last year we presented work on an imaging informatics approach towards developing quantitative knowledge and tools based on standardized DICOM-RT objects for Image-Guided Radiation Therapy. In this paper, we have extended this methodology to perform knowledge-based medical imaging informatics research on specific clinical scenarios where brain tumor patients are treated with Proton Beam Therapy (PT). PT utilizes energized charged particles, protons, to deliver dose to the target region. Protons are energized to specific velocities which determine where they will deposit maximum energy within the body to destroy cancerous cells. Treatment Planning is similar in workflow to traditional Radiation Therapy methods such as Intensity-Modulated Radiation Therapy (IMRT) which utilizes a priori knowledge to drive the treatment plan in an inverse manner. In March 2006, two new RT Objects were drafted in a DICOM-RT Supplement 102 specifically for Ion Therapy which includes Proton Therapy. The standardization of DICOM-RT-ION objects and the development of a knowledge base as well as decision-support tools that can be add-on features to the ePR DICOM-RT system were researched. We have developed a methodology to perform knowledge-based medical imaging informatics research on specific clinical scenarios. This methodology can be used to extend to Proton Therapy and the development of future clinical decision-making scenarios during the course of the patient's treatment that utilize "inverse treatment planning". In this paper, we present the initial steps toward extending this methodology for PT and lay the foundation for development of future decision-support tools tailored to cancer patients treated with PT. By integrating decision-support knowledge and tools designed to assist in the decision-making process, a new and improved "knowledge-enhanced treatment planning" approach can be realized.

  9. Nutrition knowledge and food intake.

    PubMed

    Wardle, J; Parmenter, K; Waller, J

    2000-06-01

    In many studies, correlations between nutrition knowledge and dietary behaviour have failed to reach statistical significance, leading researchers to question the relevance of nutrition knowledge to food choice, and the value of nutrition education campaigns. This study aimed to investigate the relationship between knowledge and intake of fat, fruit and vegetables using a well-validated measure of nutrition knowledge. The study was a postal survey, using 1040 adult participants selected at random from General Practitioners' lists in England. Nutrition knowledge and food intake followed the expected demographic patterns. Knowledge was significantly associated with healthy eating, and the effect persisted after controlling for demographic variables. Logistic regression showed that respondents in the highest quintile for knowledge were almost 25 times more likely to meet current recommendations for fruit, vegetable and fat intake than those in the lowest quintile. Nutrition knowledge was shown to be a partial mediator of the socio-demographic variation in intake, especially for fruit and vegetables. This demonstrates the value of using more sophisticated statistical techniques to investigate associations between knowledge and food intake and indicates that knowledge is an important factor in explaining variations in food choice. The results support the likely value of including nutrition knowledge as a target for health education campaigns aimed at promoting healthy eating.

  10. Prior Knowledge and Exemplar Frequency

    PubMed Central

    Harris, Harlan D.; Murphy, Gregory L.; Rehder, Bob

    2008-01-01

    New concepts can be learned by statistical associations as well as by relevant existing knowledge. We examined the interaction of these two processes by manipulating exemplar frequency and thematic knowledge and considering their interaction through computational modeling. Exemplar frequency affects category learning, with high frequency items learned faster than low frequency items, and prior knowledge usually speeds category learning. In two experiments that manipulated both of these factors, we found that the effects of frequency are greatly reduced when stimulus features are linked by thematic prior knowledge, and that frequency effects on single stimulus features can actually be reversed by knowledge. We account for these results with the Knowledge Resonance (KRES) model of category learning (Rehder & Murphy, 2003) and conclude that prior knowledge may change representations so that empirical effects such as those caused by frequency manipulations are modulated. PMID:18927047

  11. Knowledge Exchange in the Shrines of Knowledge: The "How's" and "Where's" of Knowledge Sharing Processes

    ERIC Educational Resources Information Center

    Reychav, Iris; Te'eni, Dov

    2009-01-01

    Academic conferences are places of situated learning dedicated to the exchange of knowledge. Knowledge is exchanged between colleagues who are looking to enhance their future research by taking part in several formal and informal settings (lectures, discussions and social events). We studied the processes of knowledge sharing and the influence of…

  12. Knowledge Systems and the Role of Knowledge Synthesis in Linkages for Knowledge Use.

    ERIC Educational Resources Information Center

    Holzner, Burkart; Salmon-Cox, Leslie

    The relationship between the social structure of knowledge systems and knowledge syntheses is explored in order to define the social and cultural requirements for effective linkage. Following an introduction (section 1), analysis is divided into 5 additional sections. Section 2 discusses tools for conceptualizing knowledge systems, including…

  13. How does individual smoking behaviour among hospital staff influence their knowledge of the health consequences of smoking?

    PubMed

    Willaing, Ingrid; Jørgensen, Torben; Iversen, Lars

    2003-01-01

    This study examined associations between individual smoking habits among hospital staff and their knowledge of the health consequences of smoking and passive smoking. The a priori hypothesis was a higher level of knowledge among non-smokers compared with smokers. A survey was undertaken, based on self-administered questionnaires at a Danish hospital (Frederikssund Hospital) in the Copenhagen area. Descriptive statistics, chi-square test and multiple logistic regression analyses were used. A backward stepwise elimination of variables at a 5% level of significance was performed and 95% confidence intervals were calculated. Main outcome measures were knowledge of the health consequences of smoking, passive smoking and other lifestyle factors. A total of 445 of 487 employees (91%) from all professional groups returned the questionnaire. Compared with ex- and never smokers, smokers systematically underestimate the health consequences of smoking and passive smoking independent of profession, department, sex, and age. There is no consistent association between knowledge of the health consequences of smoking and profession and department. There are significant inverse associations between smoking and knowledge of the health effects of excess use of alcohol and lack of physical activity. Individual smoking habits among hospital staff strongly influence smoking-related knowledge. No other variables are of consistent importance. These findings are supported by the literature. The validity of the study is good, but a similar study in a bigger population would strengthen the evidence.
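
    The 95% confidence intervals reported for associations of this kind are conventionally derived from the log odds ratio. A minimal sketch of the standard Woolf approximation, with invented counts (not the Frederikssund data):

```python
import math

# Invented counts (not the Frederikssund data): odds ratio with a
# 95% confidence interval via the standard log-odds (Woolf) method.

def or_with_ci(a, b, c, d, z=1.96):
    """2x2 table: a,b = non-smokers with/without adequate knowledge;
    c,d = smokers with/without. Returns (OR, lower, upper)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

est, lo, hi = or_with_ci(200, 50, 80, 60)
print(f"OR={est:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # OR=3.00, 95% CI (1.90, 4.73)
```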

  14. Digging for knowledge

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Jenkins, Jeffrey; Hsu, Charles; Goehl, Steve; Miao, Liden; Cader, Masud; Benachenhou, Dalila

    2009-04-01

    The "smile of a mother" is always recognized, whenever and wherever. But why is my PC always dumb and unable to recognize me or my needs, whoever or whatever? This paper postulates that such a 6W query and search system needs matching storage. Such a lament will soon be mended with a smarter PC, or a smarter Google engine, a network computer, working in the field of data retrieval, feature extraction, reduction, and knowledge precipitation. Specifically, the strategy of modern information storage and retrieval should work like our brains, which are constantly overwhelmed by 5 pairs of identical tapes taken by eyes, ears, etc. These 5 high-fidelity sensors generate 5 pairs of high-definition tapes which produce the seeing, hearing, etc. in our perception. This amounts to 10 tapes recorded in an unabridged fashion. How can we store and retrieve them when we need to? We must reduce redundancy, enhance the signal-to-noise ratio, and fuse invariant features using a simple set of mathematical operations: write according to the union and read by the intersection in a higher-dimensional vector space. For example, (see paper for equation) where the query must be phrased in terms of the union of an imprecise or partial set of 6W's, denoted by the union of lower-case w's; the upper-case W's are the archival storage of a primer tree. A simplified humanistic representation may be called the 6W space (who, what, where, when, why, how), also referred to as Newspaper geometry. Mapping the 6W to the 3W (World Wide Web) is becoming relatively easy. Knowledge digging may thus become efficient and robust through the set operations of union (writing) and intersection (reading), given a 6W query search engine matched efficiently by 6W vector index databases.
    In fact, Newspaper 6D geometry may be reduced further by PCA (Principal Component Analysis) eigenvector mathematics and mapped into the 2D causality space comprised of
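
    The write-by-union / read-by-intersection scheme sketched in the abstract can be illustrated with ordinary set operations: each record's 6W values are unioned into inverted index sets, and a (possibly partial) query intersects the matching record sets. All records, ids, and field values below are invented:

```python
# Sketch of write-by-union / read-by-intersection over 6W attributes
# (who, what, where, when, why, how). Records, ids, and field values
# are invented for illustration.

from collections import defaultdict

index = defaultdict(set)   # (field, value) -> set of record ids

def write(rec_id, record):
    """Union: file the record's id under each of its 6W values."""
    for field, value in record.items():
        index[(field, value)].add(rec_id)

def read(query):
    """Intersection: ids matching every field of a (possibly partial) query."""
    sets = [index[(f, v)] for f, v in query.items()]
    return set.intersection(*sets) if sets else set()

write(1, {"who": "operator", "what": "repair", "where": "plant"})
write(2, {"who": "operator", "what": "inspect", "where": "plant"})
print(read({"who": "operator", "where": "plant"}))  # {1, 2}
print(read({"what": "repair"}))                     # {1}
```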

  15. Unity of knowledge in the advancement of nursing knowledge.

    PubMed

    Giuliano, Karen K; Tyer-Viola, Lynda; Lopez, Ruth Palan

    2005-07-01

    During the past 20 years, we have witnessed an explosion in nursing knowledge providing the discipline with diverse and multifaceted theoretical frameworks and paradigms. One knowledge theme that pervades the dialogue in the scholarly literature is that of multiple ways of knowing. With the acknowledgement that the fundamental nature of nursing knowledge is grounded in the understanding of human nature and its response to its environment, comes an imperative for a consilience of knowledge. The purpose of this article is to present such a unified worldview by articulating a vision of nursing knowledge, a meaning of unity of knowledge, and a challenge to the discipline to embrace inclusive rather than exclusive ways of knowing.

  16. Mobile robot knowledge base

    NASA Astrophysics Data System (ADS)

    Heath Pastore, Tracy; Barnes, Mitchell; Hallman, Rory

    2005-05-01

    Robot technology is developing at a rapid rate for both commercial and Department of Defense (DOD) applications. As a result, the task of managing both technology and experience information is growing. In the not-too-distant past, tracking development efforts of robot platforms, subsystems and components was not too difficult, expensive, or time consuming. To do the same today is a significant undertaking. The Mobile Robot Knowledge Base (MRKB) provides the robotics community with a web-accessible, centralized resource for sharing information, experience, and technology to more efficiently and effectively meet the needs of the robot system user. The resource includes searchable information on robot components, subsystems, mission payloads, platforms, and DOD robotics programs. In addition, the MRKB website provides a forum for technology and information transfer within the DOD robotics community and an interface for the Robotic Systems Pool (RSP). The RSP manages a collection of small teleoperated and semi-autonomous robotic platforms, available for loan to DOD and other qualified entities. The objective is to put robots in the hands of users and use the test data and fielding experience to improve robot systems.

  17. Subjective measures of unconscious knowledge.

    PubMed

    Dienes, Zoltán

    2008-01-01

    The chapter gives an overview of the use of subjective measures of unconscious knowledge. Unconscious knowledge is knowledge we have, and could very well be using, but we are not aware of. Hence appropriate methods for indicating unconscious knowledge must show that the person (a) has knowledge but (b) does not know that she has it. One way of determining awareness of knowing is by taking confidence ratings after making judgments. If the judgments are above baseline but the person believes they are guessing (guessing criterion) or confidence does not relate to accuracy (zero-correlation criterion) there is evidence of unconscious knowledge. The way these methods can deal with the problem of bias is discussed, as is the use of different types of confidence scales. The guessing and zero-correlation criteria show whether or not the person is aware of knowing the content of the judgment, but not whether the person is aware of what any knowledge was that enabled the judgment. Thus, a distinction is made between judgment and structural knowledge, and it is shown how the conscious status of the latter can also be assessed. Finally, the use of control over the use of knowledge as a subjective measure of judgment knowledge is illustrated. Experiments using artificial grammar learning and a serial reaction time task explore these issues.
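
    The guessing and zero-correlation criteria described above amount to two simple computations over trial data. A sketch on invented trials (binary accuracy plus a confidence rating, with 0 meaning a claimed guess):

```python
# Sketch of the two subjective criteria for unconscious knowledge,
# on invented trial data: (correct?, confidence 0-100), where a
# confidence of 0 means the participant claims to be guessing.

def pearson(xs, ys):
    """Plain Pearson correlation (no external libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

trials = [
    (1, 0), (1, 0), (0, 0), (1, 0),      # claimed guesses
    (1, 40), (0, 50), (1, 60), (0, 30),  # expressed confidence
]

# Guessing criterion: above-baseline accuracy (chance = 0.5) on trials
# where the participant reports guessing suggests unconscious knowledge.
guesses = [c for c, conf in trials if conf == 0]
guess_accuracy = sum(guesses) / len(guesses)
print(guess_accuracy)          # 0.75, above the 0.5 baseline

# Zero-correlation criterion: confidence unrelated to accuracy
# (correlation near zero) despite above-chance overall performance.
rated = [(c, conf) for c, conf in trials if conf > 0]
r = pearson([c for c, _ in rated], [conf for _, conf in rated])
print(round(r, 2))             # 0.45 on this toy data
```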

  18. The folk conception of knowledge.

    PubMed

    Starmans, Christina; Friedman, Ori

    2012-09-01

    How do people decide which claims should be considered mere beliefs and which count as knowledge? Although little is known about how people attribute knowledge to others, philosophical debate about the nature of knowledge may provide a starting point. Traditionally, a belief that is both true and justified was thought to constitute knowledge. However, philosophers now agree that this account is inadequate, due largely to a class of counterexamples (termed "Gettier cases") in which a person's justified belief is true, but only due to luck. We report four experiments examining the effect of truth, justification, and "Gettiering" on people's knowledge attributions. These experiments show that: (1) people attribute knowledge to others only when their beliefs are both true and justified; (2) in contrast to contemporary philosophers, people also attribute knowledge to others in Gettier situations; and (3) knowledge is not attributed in one class of Gettier cases, but only because the agent's belief is based on "apparent" evidence. These findings suggest that the lay concept of knowledge is roughly consistent with the traditional account of knowledge as justified true belief, and also point to a major difference between the epistemic intuitions of laypeople and those of philosophers.

  19. The acquisition of strategic knowledge

    SciTech Connect

    Gruber, T.R.

    1989-01-01

    This research focuses on the problem of acquiring strategic knowledge: knowledge used by an agent to decide what action to perform next. Strategic knowledge is especially difficult to acquire from experts by conventional methods, and it is typically implemented with low-level primitives by a knowledge engineer. This dissertation presents a method for partially automating the acquisition of strategic knowledge from experts. The method consists of a representation for strategic knowledge, a technique for eliciting strategy from experts, and a learning procedure for transforming the information elicited from experts into operational and general form. The knowledge representation is formulated as strategy rules that associate strategic situations with equivalence classes of appropriate actions. The elicitation technique is based on a language of justifications with which the expert explains why a knowledge system should have chosen a particular action in a specific strategic situation. The learning procedure generates strategy rules from expert justifications in training cases, and generalizes newly-formed rules using syntactic induction operators. The knowledge acquisition procedure is embodied in an interactive program called ASK, which actively elicits justifications and new terms from the expert and generates operational strategy rules. ASK has been used by physicians to extend the strategic knowledge for a chest pain diagnosis application and by knowledge engineers to build a general strategy for the task of prospective diagnosis. A major conclusion is that expressive power can be traded for acquirability. By restricting the form of the representation of strategic knowledge, ASK can present a comprehensible knowledge elicitation medium to the expert and employ well-understood syntactic generalization operations to learn from the expert's explanations.

  20. Representing geometrical knowledge.

    PubMed

    Anderson, J A

    1997-08-29

    This paper introduces perspex algebra which is being developed as a common representation of geometrical knowledge. A perspex can currently be interpreted in one of four ways. First, the algebraic perspex is a generalization of matrices; it provides the most general representation for all of the interpretations of a perspex. The algebraic perspex can be used to describe arbitrary sets of coordinates. The remaining three interpretations of the perspex are all related to square matrices and operate in a Euclidean model of projective space-time, called perspex space. Perspex space differs from the usual Euclidean model of projective space in that it contains the point at nullity. It is argued that the point at nullity is necessary for a consistent account of perspective in top-down vision. Second, the geometric perspex is a simplex in perspex space. It can be used as a primitive building block for shapes, or as a way of recording landmarks on shapes. Third, the transformational perspex describes linear transformations in perspex space that provide the affine and perspective transformations in space-time. It can be used to match a prototype shape to an image, even in so-called 'accidental' views where the depth of an object disappears from view, or an object stays in the same place across time. Fourth, the parametric perspex describes the geometric and transformational perspexes in terms of parameters that are related to everyday English descriptions. The parametric perspex can be used to obtain both continuous and categorical perception of objects. The paper ends with a discussion of issues related to using a perspex to describe logic.

  1. Representing geometrical knowledge.

    PubMed Central

    Anderson, J A

    1997-01-01

    This paper introduces perspex algebra which is being developed as a common representation of geometrical knowledge. A perspex can currently be interpreted in one of four ways. First, the algebraic perspex is a generalization of matrices; it provides the most general representation for all of the interpretations of a perspex. The algebraic perspex can be used to describe arbitrary sets of coordinates. The remaining three interpretations of the perspex are all related to square matrices and operate in a Euclidean model of projective space-time, called perspex space. Perspex space differs from the usual Euclidean model of projective space in that it contains the point at nullity. It is argued that the point at nullity is necessary for a consistent account of perspective in top-down vision. Second, the geometric perspex is a simplex in perspex space. It can be used as a primitive building block for shapes, or as a way of recording landmarks on shapes. Third, the transformational perspex describes linear transformations in perspex space that provide the affine and perspective transformations in space-time. It can be used to match a prototype shape to an image, even in so-called 'accidental' views where the depth of an object disappears from view, or an object stays in the same place across time. Fourth, the parametric perspex describes the geometric and transformational perspexes in terms of parameters that are related to everyday English descriptions. The parametric perspex can be used to obtain both continuous and categorical perception of objects. The paper ends with a discussion of issues related to using a perspex to describe logic. PMID:9304680

  2. Increase Productivity Through Knowledge Management

    NASA Astrophysics Data System (ADS)

    Gavrikova, N. A.; Dolgih, I. N.; Dyrina, E. N.

    2016-04-01

    Increase in competition level requires companies to improve the efficiency of workforce use, characterized by labor productivity. Professional knowledge and experience of staff play the key role in it. The results of a working-time analysis for an extrusion line operator are presented in this article. The analysis revealed that the reasons for ineffective use of working time are connected with inadequate information exchange and knowledge management in the company. The authors suggest a way to solve this problem: the main sources of knowledge in an engineering enterprise are defined, and the conditions for success and the stages of knowledge management control are stated.

  3. Knowledge translation of research findings.

    PubMed

    Grimshaw, Jeremy M; Eccles, Martin P; Lavis, John N; Hill, Sophie J; Squires, Janet E

    2012-05-31

    One of the most consistent findings from clinical and health services research is the failure to translate research into practice and policy. As a result of these evidence-practice and policy gaps, patients fail to benefit optimally from advances in healthcare and are exposed to unnecessary risks of iatrogenic harms, and healthcare systems are exposed to unnecessary expenditure resulting in significant opportunity costs. Over the last decade, there has been increasing international policy and research attention on how to reduce the evidence-practice and policy gap. In this paper, we summarise the current concepts and evidence to guide knowledge translation activities, defined as T2 research (the translation of new clinical knowledge into improved health). We structure the article around five key questions: what should be transferred; to whom should research knowledge be transferred; by whom should research knowledge be transferred; how should research knowledge be transferred; and, with what effect should research knowledge be transferred? We suggest that the basic unit of knowledge translation should usually be up-to-date systematic reviews or other syntheses of research findings. Knowledge translators need to identify the key messages for different target audiences and to fashion these in language and knowledge translation products that are easily assimilated by different audiences. The relative importance of knowledge translation to different target audiences will vary by the type of research and appropriate endpoints of knowledge translation may vary across different stakeholder groups. There are a large number of planned knowledge translation models, derived from different disciplinary, contextual (i.e., setting), and target audience viewpoints. Most of these suggest that planned knowledge translation for healthcare professionals and consumers is more likely to be successful if the choice of knowledge translation strategy is informed by an assessment of the

  4. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
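
    Of the alternatives to standard probability the abstract mentions, Dempster-Shafer theory has a concrete combination rule. A minimal sketch of Dempster's rule on invented expert mass assignments (not Los Alamos data):

```python
# Minimal sketch of Dempster's rule of combination, one of the
# non-probabilistic formalisms mentioned above. Mass functions map
# subsets (frozensets) of the frame of discernment to belief mass.
# The two "expert" mass assignments are invented.

from collections import defaultdict

def combine(m1, m2):
    """Dempster's rule: pointwise products of masses on intersecting
    focal sets, renormalized to discard conflicting mass."""
    combined = defaultdict(float)
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] += w1 * w2
            else:
                conflict += w1 * w2
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

OK, FAULTY = frozenset({"ok"}), frozenset({"faulty"})
BOTH = OK | FAULTY                    # total ignorance
m1 = {OK: 0.6, BOTH: 0.4}             # expert 1 leaves 0.4 uncommitted
m2 = {OK: 0.5, FAULTY: 0.3, BOTH: 0.2}
result = combine(m1, m2)
print(round(result[OK], 3))           # combined belief mass on "ok": 0.756
```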

  5. Knowledge translation of research findings

    PubMed Central

    2012-01-01

    Background One of the most consistent findings from clinical and health services research is the failure to translate research into practice and policy. As a result of these evidence-practice and policy gaps, patients fail to benefit optimally from advances in healthcare and are exposed to unnecessary risks of iatrogenic harms, and healthcare systems are exposed to unnecessary expenditure resulting in significant opportunity costs. Over the last decade, there has been increasing international policy and research attention on how to reduce the evidence-practice and policy gap. In this paper, we summarise the current concepts and evidence to guide knowledge translation activities, defined as T2 research (the translation of new clinical knowledge into improved health). We structure the article around five key questions: what should be transferred; to whom should research knowledge be transferred; by whom should research knowledge be transferred; how should research knowledge be transferred; and, with what effect should research knowledge be transferred? Discussion We suggest that the basic unit of knowledge translation should usually be up-to-date systematic reviews or other syntheses of research findings. Knowledge translators need to identify the key messages for different target audiences and to fashion these in language and knowledge translation products that are easily assimilated by different audiences. The relative importance of knowledge translation to different target audiences will vary by the type of research and appropriate endpoints of knowledge translation may vary across different stakeholder groups. There are a large number of planned knowledge translation models, derived from different disciplinary, contextual (i.e., setting), and target audience viewpoints. 
Most of these suggest that planned knowledge translation for healthcare professionals and consumers is more likely to be successful if the choice of knowledge translation strategy is informed by

  6. Knowledge acquisition for autonomous systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry; Heer, Ewald

    1988-01-01

    Knowledge-based capabilities for autonomous aerospace systems, such as the NASA Space Station, must encompass conflict-resolution functions comparable to those of human operators, with all elements of the system working toward system goals in a concurrent, asynchronous-but-coordinated fashion. Knowledge extracted from a design database will support robotic systems by furnishing geometric, structural, and causal descriptions required for repair, disassembly, and assembly. The factual knowledge for these databases will be obtained from a master database through a technical management information system, and it will in many cases have to be augmented by domain-specific heuristic knowledge acquired from domain experts.

  7. Epistemologies of Situated Knowledges: "Troubling" Knowledge in Philosophy of Education

    ERIC Educational Resources Information Center

    Lang, James C.

    2011-01-01

    Epistemologies of situated knowledges, advanced by scholars such as Donna Haraway, Lorraine Code, and Maureen Ford, challenge mainstream epistemology's claim to be the gold standard in determining what counts as knowledge. In this essay, James Lang uses the work of these and other feminist theorists to explicate the notion of situated knowledges…

  8. Knowledge Management in Doctoral Education toward Knowledge Economy

    ERIC Educational Resources Information Center

    Stamou, Adamantia

    2017-01-01

    Purpose: The purpose of this paper is to investigate the role and the scope of knowledge management (KM) in doctoral education, in the emerging knowledge economy (KE) context, and the recommendation of a framework for KM in doctoral education. Design/Methodology/Approach: An extended literature analysis was conducted to elaborate the role and the…

  9. The Knowledge of Teaching--Pedagogical Content Knowledge (PCK)

    ERIC Educational Resources Information Center

    Shing, Chien Lee; Saat, Rohaida Mohd.; Loke, Siow Heng

    2015-01-01

    Pedagogical content knowledge (PCK) was first introduced by Shulman in the 80s. It is defined as the integration or amalgamation of pedagogy and content which basically covers the "what" and "how" of teaching. PCK was considered as the missing paradigm in the study of teaching. This integration of knowledge was long searched by…

  10. Advancing Knowledge in Schools through Consultative Knowledge Linking.

    ERIC Educational Resources Information Center

    Kratochwill, Thomas R.

    Consultation services have been considered an essential and important role for school psychologists throughout the history of the field. Traditionally, consultation has been cast as a problem-solving process; nevertheless, it can be thought of as a knowledge-linking process in which psychologists advance knowledge in schools to various mediators…

  11. Epistemologies of Situated Knowledges: "Troubling" Knowledge in Philosophy of Education

    ERIC Educational Resources Information Center

    Lang, James C.

    2011-01-01

    Epistemologies of situated knowledges, advanced by scholars such as Donna Haraway, Lorraine Code, and Maureen Ford, challenge mainstream epistemology's claim to be the gold standard in determining what counts as knowledge. In this essay, James Lang uses the work of these and other feminist theorists to explicate the notion of situated knowledges…

  12. Knowledge Base Refinement by Monitoring Abstract Control Knowledge.

    ERIC Educational Resources Information Center

    Wilkins, D. C.; And Others

    Arguing that an explicit representation of the problem-solving method of an expert system shell as abstract control knowledge provides a powerful foundation for learning, this paper describes the abstract control knowledge of the Heracles expert system shell for heuristic classification problems, and describes how the Odysseus apprenticeship…

  13. Research on knowledge representation, machine learning, and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Buchanan, Bruce G.

    1987-01-01

    Research in knowledge representation, machine learning, and knowledge acquisition performed at the Knowledge Systems Lab is summarized. The major goal of the research was to develop flexible, effective methods for representing the qualitative knowledge necessary for solving large problems that require symbolic reasoning as well as numerical computation. The research focused on integrating different representation methods to describe different kinds of knowledge more effectively than any one method can alone. In particular, emphasis was placed on representing and using spatial information about three-dimensional objects and constraints on the arrangement of these objects in space. Another major theme is the development of robust machine learning programs that can be integrated with a variety of intelligent systems. To achieve this goal, learning methods were designed, implemented, and experimented with in several different problem-solving environments.

  14. A knowledge base architecture for distributed knowledge agents

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms as in the LINDA model as well as using more well known message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
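
    The LINDA-style tuple space coordination described above can be sketched in a few lines: agents communicate only by writing, reading, and withdrawing tuples matched by pattern. The class and example tuples below are illustrative, not the paper's implementation:

```python
# Toy sketch of LINDA-style tuple space coordination: agents
# communicate by writing (out), reading (rd), and withdrawing (in_)
# tuples matched by pattern, where None acts as a wildcard.

class TupleSpace:
    def __init__(self):
        self.tuples = []

    def out(self, tup):                 # write a tuple into the space
        self.tuples.append(tup)

    def _match(self, tup, pattern):     # None matches any field
        return len(tup) == len(pattern) and all(
            p is None or p == t for t, p in zip(tup, pattern))

    def rd(self, pattern):              # non-destructive read
        return next((t for t in self.tuples if self._match(t, pattern)), None)

    def in_(self, pattern):             # destructive read (withdraw)
        t = self.rd(pattern)
        if t is not None:
            self.tuples.remove(t)
        return t

ts = TupleSpace()
ts.out(("sensor", "bus-A", 28.5))       # one agent publishes a reading
print(ts.rd(("sensor", "bus-A", None))) # another agent reads it
print(ts.in_(("sensor", None, None)))   # ...or withdraws it
print(ts.rd(("sensor", None, None)))    # None: the tuple was consumed
```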

  15. SU-E-T-572: A Plan Quality Metric for Evaluating Knowledge-Based Treatment Plans.

    PubMed

    Chanyavanich, V; Lo, J; Das, S

    2012-06-01

    In prostate IMRT treatment planning, the variation in patient anatomy makes it difficult to estimate a priori the potentially achievable extent of dose reduction possible to the rectum and bladder. We developed a mutual information-based framework to estimate the achievable plan quality for a new patient, prior to any treatment planning or optimization. The knowledge-base consists of 250 retrospective prostate IMRT plans. Using these prior plans, twenty query cases were each matched with five cases from the database. We propose a simple DVH plan quality metric (PQ) based on the weighted sum of the areas under the curve (AUC) of the PTV, rectum and bladder. We evaluated the plan quality of knowledge-based generated plans and established a correlation between plan quality and case similarity. The introduced plan quality metric correlates well (r² = 0.8) with the mutual similarity between cases. A matched case with high anatomical similarity can be used to produce a new high-quality plan. Not surprisingly, a poorly matched case with a low degree of anatomical similarity tends to produce a low-quality plan, since the adapted fluences from a dissimilar case cannot be modified sufficiently to yield acceptable PTV coverage. The plan quality metric is well correlated with the degree of anatomical similarity between a new query case and matched cases. Further work will investigate how to apply this metric to further stratify and select cases for knowledge-based planning. © 2012 American Association of Physicists in Medicine.
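
    A weighted sum of DVH areas under the curve, as the abstract proposes, is straightforward to compute. The sketch below uses invented DVH samples, weights, and sign conventions; the abstract does not give the actual weighting:

```python
# Sketch of a DVH-based plan quality metric: a weighted sum of the
# areas under (dose, volume) curves for the PTV, rectum, and bladder.
# DVH samples, weights, and the sign convention are assumptions.

def auc(dvh):
    """Trapezoidal area under a DVH given as (dose, volume) points."""
    area = 0.0
    for (d0, v0), (d1, v1) in zip(dvh, dvh[1:]):
        area += (d1 - d0) * (v0 + v1) / 2.0
    return area

def plan_quality(ptv, rectum, bladder, w=(1.0, -0.5, -0.5)):
    """Higher PTV coverage raises PQ; organ-at-risk dose lowers it."""
    return w[0] * auc(ptv) + w[1] * auc(rectum) + w[2] * auc(bladder)

ptv     = [(0, 100), (70, 100), (78, 0)]   # near-uniform target coverage
rectum  = [(0, 100), (30, 40), (78, 0)]
bladder = [(0, 100), (25, 30), (78, 0)]
print(round(plan_quality(ptv, rectum, bladder), 1))  # 4660.0 on these toy inputs
```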

  16. Belief, Knowledge and Understanding. How to Deal with the Relations Between Different Cultural Perspectives in Classrooms

    NASA Astrophysics Data System (ADS)

    Moreira-dos-Santos, Frederik; El-Hani, Charbel N.

    2017-05-01

    This article discusses how to deal with the relations between different cultural perspectives in classrooms, based on a proposal for considering understanding and knowledge as goals of science education, inspired by Dewey's naturalistic humanism. It thus combines educational and philosophical interests. In educational terms, our concerns relate to how science teachers position themselves in multicultural classrooms. In philosophical terms, we are interested in discussing the relations between belief, understanding, and knowledge in the light of Dewey's philosophy. We present a synthesis of Dewey's theory of inquiry through his naturalistic humanism and discuss its implications for the concepts of belief, understanding, and knowledge, as well as for the goals of science teaching. In particular, we highlight problems arising in the context of possible conflicts between scientific and religious claims in the school environment that result from totalitarian positions. We characterize an individual's position as totalitarian if he or she takes some way of thinking as the only one capable of expressing the truth about all that exists in the world, lacks open-mindedness to understand different interpretative perspectives, and attempts to impose his or her interpretation of the facts on others, whether by violent means or not. From this stance, any other perspective is taken to be false a priori and, accordingly, as a putative target to be suppressed or adapted to the privileged way of thinking. We argue, instead, for a more fallibilist evaluation of our own beliefs and a more respectful appraisal of the diversity of students' beliefs by both students and teachers.

  17. Chronic lymphocytic leukaemias and non-Hodgkin's lymphomas by histological type in farming-animal breeding workers: a population case-control study based on a priori exposure matrices.

    PubMed Central

    Nanni, O; Amadori, D; Lugaresi, C; Falcini, F; Scarpi, E; Saragoni, A; Buiatti, E

    1996-01-01

    OBJECTIVES: A population based case-control study was conducted in a highly agricultural area in Italy to investigate the association between chronic lymphocytic leukaemias (CLLs) and non-Hodgkin's lymphomas (NHLs), and subtypes, and exposure to pesticides in farming-animal breeding workers. METHODS: 187 cases of CLLs and NHLs and 977 population controls were interviewed on medical, residential, family, and occupational history. Detailed information was collected about cultivated crops and animals bred from subjects who worked in farming and animal breeding. Information on crop diseases and pesticides used (and their quantity and duration) was also obtained. A priori job-exposure matrices were applied when a crop disease was reported, estimating the most probable pesticide and, when possible, an estimate of the cumulative dose. Odds ratios (ORs) were calculated by unconditional logistic analysis with adjustment for relevant confounders in farmers who bred animals and in farmers alone, for the main crops, types of animals, and pesticide categories. Exposure was defined first by recall and then by the matrices, for CLLs and NHLs combined and then separately for CLLs and low grade NHLs. Finally, the dose-response was investigated for those pesticides which had shown some association. RESULTS: No variable under study was associated with work in farming alone. In farming and animal breeding, no crop or animal showed an association with CLLs and NHLs when adjusted by exposure during childhood to farming and animal breeding (an indicator of life in a farming and animal breeding environment before the age of 13, which behaved as an independent risk variable). A non-significant association was found with stannates, arsenates, phosphates, and dichloro-diphenyl-trichloroethane (DDT) based on recall, and for stannates, arsenates, and DDT after the application of the matrices. When CLLs together with low grade NHLs were considered, the association with insecticides in
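    The odds ratios reported in this abstract quantify how much more common an exposure is among cases than among controls. As a minimal sketch (with purely illustrative numbers, not data from the study), the crude OR for a single exposure can be computed from a 2x2 table, with a 95% confidence interval via Woolf's log-scale method; the study's unconditional logistic analysis generalizes this to adjust for confounders:

    ```python
    import math

    # Hypothetical 2x2 table: exposure to one pesticide among cases and controls.
    # (Illustrative counts only -- not data from the Nanni et al. study.)
    exposed_cases, unexposed_cases = 30, 157
    exposed_controls, unexposed_controls = 90, 887

    # Crude odds ratio from the 2x2 table: (a * d) / (b * c)
    odds_ratio = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

    # 95% confidence interval on the log scale (Woolf's method):
    # SE of log(OR) is sqrt of the sum of reciprocals of the four cell counts.
    se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                          + 1 / exposed_controls + 1 / unexposed_controls)
    ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

    print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
    ```

    A confidence interval that includes 1.0 corresponds to the "non-significant association" language used in the abstract.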

  18. Applying the knowledge to action framework to plan a strategy for implementing breast cancer screening guidelines: an interprofessional perspective.

    PubMed

    Munce, Sarah; Kastner, Monika; Cramm, Heidi; Lal, Shalini; Deschêne, Sarah-Maude; Auais, Mohammad; Stacey, Dawn; Brouwers, Melissa

    2013-09-01

    Integrated knowledge translation (IKT) interventions may be one solution to improving the uptake of clinical guidelines. IKT research initiatives are particularly relevant for breast cancer research and initiatives targeting the implementation of clinical guidelines, where collaboration with an interdisciplinary team of practitioners, patients, caregivers, and policy makers is needed to produce optimum patient outcomes. The objective of this paper was to describe the process of developing an IKT strategy that could be used by guideline developers to improve the uptake of their new clinical practice guidelines on breast cancer screening. An interprofessional group of students as well as two faculty members met six times over three days at the KT Canada Summer Institute in 2011. The team used all of the phases of the action cycle in the Knowledge to Action Framework as an organizing framework. While the entire framework was used, the step involving assessing barriers to knowledge use was judged to be particularly relevant in anticipating implementation problems and in informing the specific KT interventions that would be appropriate to mitigate these challenges and to accomplish goals and outcomes. This activity also underscored the importance of group process and teamwork in IKT. We propose that an a priori assessment of barriers to knowledge use (i.e., level and corresponding barriers), along with the other phases of the Knowledge to Action Framework, is a strategic approach for KT strategy development, implementation, and evaluation planning and could be used in the future planning of KT strategies.

  19. The Relationship between Immediate Relevant Basic Science Knowledge and Clinical Knowledge: Physiology Knowledge and Transthoracic Echocardiography Image Interpretation

    ERIC Educational Resources Information Center

    Nielsen, Dorte Guldbrand; Gotzsche, Ole; Sonne, Ole; Eika, Berit

    2012-01-01

    Two major views on the relationship between basic science knowledge and clinical knowledge stand out; the Two-world view seeing basic science and clinical science as two separate knowledge bases and the encapsulated knowledge view stating that basic science knowledge plays an overt role being encapsulated in the clinical knowledge. However, recent…

  1. Knowledge Economy and Research Innovation

    ERIC Educational Resources Information Center

    Bastalich, Wendy

    2010-01-01

    The "knowledge economy" has been received with considerable scepticism by scholars within the fields of political economy, social and political philosophy, and higher education. Key arguments within this literature are reviewed in this article to suggest that, despite policy claims, "knowledge economy" does not describe a "new" mode of economic…

  2. Handbook for Driving Knowledge Testing.

    ERIC Educational Resources Information Center

    Pollock, William T.; McDole, Thomas L.

    Materials intended for driving knowledge test development for use by operational licensing and education agencies are presented. A pool of 1,313 multiple choice test items is included, consisting of sets of specially developed and tested items covering principles of safe driving, legal regulations, and traffic control device knowledge pertinent to…

  3. Research Commentary: Reconceptualizing Procedural Knowledge

    ERIC Educational Resources Information Center

    Star, Jon R.

    2005-01-01

    In this article, I argue for a renewed focus in mathematics education research on procedural knowledge. I make three main points: (1) The development of students' procedural knowledge has not received a great deal of attention in recent research; (2) one possible explanation for this deficiency is that current characterizations of conceptual and…

  4. Differentiating Knowledge, Differentiating (Occupational) Practice

    ERIC Educational Resources Information Center

    Hordern, Jim

    2016-01-01

    This paper extends arguments for differentiating knowledge into conceptualisations of occupational practice. It is argued that specialised forms of knowledge and practice require recognition and differentiation in ways that many contemporary approaches to practice theory deny. Drawing on Hager's interpretation of MacIntyre, it is suggested that…

  5. Pursuing the Depths of Knowledge

    ERIC Educational Resources Information Center

    Boyles, Nancy

    2016-01-01

    Today's state literacy standards and assessments demand deeper levels of knowledge from students. But many teachers ask, "What does depth of knowledge look like on these new, more rigorous assessments? How do we prepare students for this kind of thinking?" In this article, Nancy Boyles uses a sampling of questions from the PARCC and SBAC…

  6. The Importance of Prior Knowledge.

    ERIC Educational Resources Information Center

    Cleary, Linda Miller

    1989-01-01

    Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)

  8. Biomedical Knowledge and Clinical Expertise.

    ERIC Educational Resources Information Center

    Boshuizen, Henny P. A.; Schmidt, Henk G.

    A study examined the application and availability of clinical and biomedical knowledge in the clinical reasoning of physicians as well as possible mechanisms responsible for changes in the organization of clinical and biomedical knowledge in the development from novice to expert. Subjects were 28 students (10 second year, 8 fourth year, and 10…

  9. Geographical Knowledge of University Students.

    ERIC Educational Resources Information Center

    Wood, Robert W.; And Others

    In order to obtain information on the status of geographical knowledge possessed by University of South Dakota (Vermillion) students, a geography survey designed to determine specific knowledge about the locations of bodies of water, countries, and cities was conducted. One map was used for identifying cities, while the second was used for…

  10. Experiencing Collaborative Knowledge Creation Processes

    ERIC Educational Resources Information Center

    Jakubik, Maria

    2008-01-01

    Purpose: How people learn and create knowledge together through interactions in communities of practice (CoPs) is not fully understood. The purpose of this paper is to create and apply a model that could increase participants' consciousness about knowledge creation processes. Design/methodology/approach: This four-month qualitative research was…

  12. Knowledge Infrastructures for Solar Cities

    ERIC Educational Resources Information Center

    Vanderburg, Willem H.

    2006-01-01

    The evolution of contemporary cities into solar cities will be affected by the decisions of countless specialists according to an established intellectual and professional division of labor. These specialists belong to groups responsible for advancing and applying a body of knowledge, and jointly, these bodies of knowledge make up a knowledge…

  14. Emotion Processes in Knowledge Revision

    ERIC Educational Resources Information Center

    Trevors, Gregory J.; Kendeou, Panayiota; Butterfuss, Reese

    2017-01-01

    In recent years, a number of insights have been gained into the cognitive processes that explain how individuals overcome misconceptions and revise their previously acquired incorrect knowledge. The current study complements this line of research by investigating the moment-by-moment emotion processes that occur during knowledge revision using a…

  15. Mobilising Research Knowledge in Education

    ERIC Educational Resources Information Center

    Levin, Ben

    2011-01-01

    The field of knowledge mobilisation (KM) addresses the multiple ways in which stronger connections can be made between research, policy and practice. This paper reviews the current situation around knowledge mobilisation in education. It addresses changing understandings of KM, considers some of the main issues in conducting empirical research in…

  17. Teachers' Knowledge of Education Law

    ERIC Educational Resources Information Center

    Littleton, Mark

    2008-01-01

    The knowledge base of education-related law is growing at a rapid pace. The increase in federal and state statutes is rising commensurate with litigation that directs teachers on curricular, professional, and social matters. At the same time, numerous studies provide significant evidence that teachers lack an adequate level of knowledge of…

  19. How Is Vocational Knowledge Recontextualised?

    ERIC Educational Resources Information Center

    Hordern, Jim

    2014-01-01

    This paper sets out to examine how vocational knowledge is recontextualised in curricula, pedagogy and workplaces, by learners, and to ensure the availability of valuable and relevant knowledge for vocational practice. Starting from Bernstein's notion of recontextualisation, and with reference to literature in the sociology of educational…

  20. Knowledge Management and Academic Libraries.

    ERIC Educational Resources Information Center

    Townley, Charles T.

    2001-01-01

    The emerging field of knowledge management offers academic libraries the opportunity to improve effectiveness, both for themselves and their parent institutions. This article summarizes knowledge management theory. Current applications in academic libraries and higher education are described. Similarities and difficulties between knowledge…