Science.gov

Sample records for a priori knowledge

  1. Level set segmentation for greenbelts by integrating wavelet texture and priori color knowledge

    NASA Astrophysics Data System (ADS)

    Yang, Tie-jun; Song, Zhi-hui; Jiang, Chuan-xian; Huang, Lin

    2013-09-01

    Segmenting greenbelts quickly and accurately in remote sensing images is an economical and effective method for estimating the green coverage rate (GCR). To address the over-reliance on a priori knowledge of the traditional level set segmentation model based on the max-flow/min-cut Graph Cut principle and weighted Total Variation (GCTV), this paper proposes a level set segmentation method combining regional texture features with a priori color knowledge, and applies it to greenbelt segmentation in urban remote sensing images. Because the color of greenbelts alone is not reliable for segmentation, the Gabor wavelet transform is used to extract image texture features. We then integrate the extracted features into the GCTV model, which contains only a priori color knowledge, and use both the prior knowledge and the targets' texture to constrain the evolution of the level set, which solves the problem of over-reliance on a priori knowledge. Meanwhile, the convexity of the corresponding energy functional is ensured by a relaxation and thresholding method, and a primal-dual algorithm with global relabeling is used to accelerate the evolution of the level set. Experiments show that our method effectively reduces the dependence of GCTV on a priori knowledge and yields more accurate greenbelt segmentation results.
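
    The Gabor texture step above can be sketched as a small real-valued filter bank applied to a grayscale image. This is a minimal sketch; the frequencies, orientations, kernel size, and sigma are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(freq, theta, sigma=3.0, size=15):
    """Real-valued Gabor kernel: a plane-wave cosine windowed by a Gaussian."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def gabor_features(image, freqs=(0.1, 0.2), n_orient=4):
    """Stack of |filter response| maps, one per (frequency, orientation) pair."""
    maps = []
    for f in freqs:
        for k in range(n_orient):
            kern = gabor_kernel(f, theta=k * np.pi / n_orient)
            maps.append(np.abs(fftconvolve(image, kern, mode="same")))
    return np.stack(maps, axis=-1)   # H x W x (len(freqs) * n_orient)

# usage: feats = gabor_features(gray); feed feats into the texture term of GCTV
```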

  2. Investigating industrial investigation: examining the impact of a priori knowledge and tunnel vision education.

    PubMed

    Maclean, Carla L; Brimacombe, C A Elizabeth; Lindsay, D Stephen

    2013-12-01

    The current study addressed tunnel vision in industrial incident investigation by experimentally testing how a priori information and a human bias (generated via the fundamental attribution error or correspondence bias) affected participants' investigative behavior as well as the effectiveness of a debiasing intervention. Undergraduates and professional investigators engaged in a simulated industrial investigation exercise. We found that participants' judgments were biased by knowledge about the safety history of either a worker or piece of equipment and that a human bias was evident in participants' decision making. However, bias was successfully reduced with "tunnel vision education." Professional investigators demonstrated a greater sophistication in their investigative decision making compared to undergraduates. The similarities and differences between these two populations are discussed. PMID:24295062

  3. Effective identification of essential proteins based on priori knowledge, network topology and gene expressions.

    PubMed

    Li, Min; Zheng, Ruiqing; Zhang, Hanhui; Wang, Jianxin; Pan, Yi

    2014-06-01

    Identification of essential proteins is very important for understanding the minimal requirements for cellular life and is also necessary for a series of practical applications, such as drug design. With the advances in high-throughput technologies, a large number of protein-protein interactions are available, which makes it possible to detect proteins' essentialities at the network level. Considering that most species already have a number of known essential proteins, we proposed a new a priori knowledge-based scheme to discover new essential proteins from protein interaction networks. Based on the new scheme, two essential protein discovery algorithms, CPPK and CEPPK, were developed. CPPK predicts new essential proteins based on network topology, and CEPPK detects new essential proteins by integrating network topology and gene expressions. The performances of CPPK and CEPPK were validated on the protein interaction network of Saccharomyces cerevisiae. The experimental results showed that the a priori knowledge of known essential proteins was effective for improving prediction precision. The predicted precisions of CPPK and CEPPK clearly exceeded those of 10 previously proposed essential protein discovery methods: Degree Centrality (DC), Betweenness Centrality (BC), Closeness Centrality (CC), Subgraph Centrality (SC), Eigenvector Centrality (EC), Information Centrality (IC), Bottle Neck (BN), Density of Maximum Neighborhood Component (DMNC), Local Average Connectivity-based method (LAC), and Network Centrality (NC). In particular, CPPK achieved a 40% improvement in precision over BC, CC, SC, EC, and BN, and CEPPK performed even better. CEPPK was also compared to four other methods (EPC, ORFL, PeC, and CoEWC) which are not node centralities, and was shown to achieve the best results. PMID:24565748
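
    A toy version of the scheme's core idea, mixing known-essential a priori knowledge with topology; the mixing weight alpha and the linear combination are hypothetical stand-ins, not the published CPPK/CEPPK formulas.

```python
import networkx as nx

def essentiality_scores(ppi: nx.Graph, known_essential: set, alpha=0.5):
    """Rank unannotated proteins by combining degree centrality with the
    fraction of neighbors already known to be essential (the a priori part).
    alpha is an assumed weight, not a value from the paper."""
    deg = nx.degree_centrality(ppi)
    scores = {}
    for p in ppi.nodes:
        if p in known_essential:
            continue
        nbrs = list(ppi.neighbors(p))
        prior = sum(n in known_essential for n in nbrs) / len(nbrs) if nbrs else 0.0
        scores[p] = alpha * prior + (1 - alpha) * deg[p]
    return sorted(scores.items(), key=lambda kv: -kv[1])   # best candidates first
```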

  4. Converting local spectral and spatial information from a priori classifiers into contextual knowledge for impervious surface classification

    NASA Astrophysics Data System (ADS)

    Luo, Li; Mountrakis, Giorgos

    2011-09-01

    A classification model is demonstrated that exploits spectral and spatial contextual information from previously classified neighbors to improve classification of the remaining unclassified pixels. The classification is composed of two major steps, an a priori and an a posteriori classification. The a priori algorithm classifies the less difficult portion of the image. The a posteriori classifier operates on the more challenging parts and strives to enhance accuracy by converting classified information from the a priori process into specific knowledge. The novelty of this work lies in the substitution of image-wide information with local spectral representations and spatial correlations, in essence classifying each pixel using exclusively neighboring behavior. Furthermore, the a posteriori classifier is a simple and intuitive algorithm, adjusted to perform in a localized setting for the task requirements. A 2001 and a 2006 Landsat scene from Central New York were used to assess performance on an impervious-surface classification task. The proposed method was compared with a back-propagation neural network. Kappa statistic values in the corresponding applicable datasets increased from 18.67 to 24.05 for the 2006 scene, and from 22.92 to 35.76 for the 2001 scene, mostly by correcting misclassifications between impervious and soil pixels. This finding suggests that simple classifiers can surpass complex classifiers through incorporation of partial results and an elegant multi-process framework.
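
    A sketch of the two-pass idea under stated assumptions: the first pass labels only the easy pixels (returning -1 where it abstains), and the second pass labels each remaining pixel by majority vote among its spectrally nearest already-labeled neighbors. This illustrates the neighborhood-only principle, not the authors' exact classifier.

```python
import numpy as np

def two_pass_classify(pixels, easy_classifier, k=5, radius=3):
    """pixels: H x W x B array; easy_classifier returns an H x W int array
    with -1 marking pixels the a priori pass leaves unclassified."""
    h, w, b = pixels.shape
    labels = easy_classifier(pixels)
    for i, j in zip(*np.where(labels < 0)):
        i0, i1 = max(0, i - radius), min(h, i + radius + 1)
        j0, j1 = max(0, j - radius), min(w, j + radius + 1)
        nbr_lab = labels[i0:i1, j0:j1].ravel()
        nbr_spec = pixels[i0:i1, j0:j1].reshape(-1, b)
        done = nbr_lab >= 0
        if not done.any():
            continue   # no classified neighbors yet; leave for a later sweep
        d = np.linalg.norm(nbr_spec[done] - pixels[i, j], axis=1)
        near = nbr_lab[done][np.argsort(d)[:k]]
        labels[i, j] = np.bincount(near).argmax()   # local majority vote
    return labels
```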

  5. Solving the interior problem of computed tomography using a priori knowledge

    PubMed Central

    Courdurier, M.; Noo, F.; Defrise, M.; Kudo, H.

    2008-01-01

    The case of incomplete tomographic data for a compactly supported attenuation function is studied. When the attenuation function is a priori known in a subregion, we show that a reduced set of measurements is enough to uniquely determine the attenuation function over all the space. Furthermore, we found stability estimates showing that reconstruction can be stable near the region where the attenuation is known. These estimates also suggest that reconstruction stability collapses quickly when approaching the set of points that are viewed under less than 180 degrees. This paper may be seen as a continuation of the work “Truncated Hilbert transform and Image reconstruction from limited tomographic data” that was published in Inverse Problems in 2006. This continuation tackles new cases of incomplete data that could be of interest in applications of computed tomography. PMID:20613970

  6. A Computationally Efficient, Exploratory Approach to Brain Connectivity Incorporating False Discovery Rate Control, A Priori Knowledge, and Group Inference

    PubMed Central

    Liu, Aiping; Li, Junning; Wang, Z. Jane; McKeown, Martin J.

    2012-01-01

    Graphical models appear well suited for inferring brain connectivity from fMRI data, as they can distinguish between direct and indirect brain connectivity. Nevertheless, biological interpretation requires not only that the multivariate time series are adequately modeled, but also that there is accurate error control of the inferred edges. The PCfdr algorithm, developed by Li and Wang, provides a computationally efficient means to asymptotically control the false discovery rate (FDR) of the computed edges. The original PCfdr algorithm was unable to accommodate a priori information about connectivity and was designed to infer connectivity from a single subject rather than a group of subjects. Here we extend the original PCfdr algorithm and propose a multisubject, error-rate-controlled brain connectivity modeling approach that allows incorporation of prior knowledge of connectivity. In simulations, we show that the two proposed extensions still control the FDR around or below a specified threshold. When the proposed approach is applied to fMRI data from a Parkinson's disease study, we find robust group evidence of disease-related changes, compensatory changes, and the normalizing effect of L-dopa medication. The proposed method provides a robust, accurate, and practical method for the assessment of brain connectivity patterns from functional neuroimaging data. PMID:23251232
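
    The error-control ingredient can be illustrated with the generic Benjamini-Hochberg step-up procedure applied to edge p-values. The actual PCfdr algorithm embeds FDR control inside a PC-style conditional-independence search, so this conveys only the flavor of the guarantee, not the algorithm itself.

```python
def bh_edges(pvals, q=0.05):
    """Benjamini-Hochberg step-up over candidate edges.
    pvals: dict mapping edge -> p-value; q: target FDR level."""
    items = sorted(pvals.items(), key=lambda kv: kv[1])
    m = len(items)
    keep = 0
    for rank, (_, p) in enumerate(items, start=1):
        if p <= q * rank / m:
            keep = rank                      # largest rank passing the bound
    return {edge for edge, _ in items[:keep]}

# usage: edges = bh_edges({("A", "B"): 0.001, ("B", "C"): 0.20, ("A", "C"): 0.01})
```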

  7. Elimination of Systematic Mass Measurement Errors in Liquid Chromatography-Mass Spectrometry Based Proteomics using Regression Models and a priori Partial Knowledge of the Sample Content

    SciTech Connect

    Petyuk, Vladislav A.; Jaitly, Navdeep; Moore, Ronald J.; Ding, Jie; Metz, Thomas O.; Tang, Keqi; Monroe, Matthew E.; Tolmachev, Aleksey V.; Adkins, Joshua N.; Belov, Mikhail E.; Dabney, Alan R.; Qian, Weijun; Camp, David G.; Smith, Richard D.

    2008-02-01

    The high mass measurement accuracy and precision available with recently developed mass spectrometers is increasingly used in proteomics analyses to confidently identify tryptic peptides from complex mixtures of proteins, as well as post-translational modifications and peptides from non-annotated proteins. To take full advantage of high mass measurement accuracy instruments it is necessary to limit systematic mass measurement errors. It is well known that errors in the measurement of m/z can be affected by experimental parameters including, e.g., outdated calibration coefficients, ion intensity, and temperature changes during the measurement. Traditionally, these variations have been corrected through the use of internal calibrants (well-characterized standards introduced with the sample being analyzed). In this paper we describe an alternative approach in which the calibration is provided through the use of a priori knowledge of the sample being analyzed. Such an approach has previously been demonstrated based on the dependence of systematic error on m/z alone. To incorporate additional explanatory variables, we employed multidimensional, nonparametric regression models, which were evaluated using several commercially available instruments. The approach is shown to remove any noticeable biases from the overall mass measurement errors and decreases the overall standard deviation of the mass measurement error distribution by 1.2- to 2-fold, depending on instrument type. Subsequent reduction of the random errors based on multiple measurements over consecutive spectra further improves accuracy and results in an overall decrease of the standard deviation by 1.8- to 3.7-fold. This new procedure will decrease the false discovery rates for peptide identifications using high accuracy mass measurements.
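
    A simple stand-in for the regression-calibration idea, assuming numpy arrays of observed m/z, log ion intensity, and mass error in ppm for confidently matched peptides: the systematic error is modeled as a binned-median surface over the two explanatory variables and then subtracted.

```python
import numpy as np

def mass_error_surface(mz, log_inten, err_ppm, n_bins=20):
    """Binned-median model of systematic mass error vs. (m/z, log intensity);
    a crude nonparametric regression in the spirit of the abstract."""
    mz_edges = np.quantile(mz, np.linspace(0, 1, n_bins + 1))
    it_edges = np.quantile(log_inten, np.linspace(0, 1, n_bins + 1))
    surf = np.zeros((n_bins, n_bins))
    for i in range(n_bins):
        for j in range(n_bins):
            sel = ((mz >= mz_edges[i]) & (mz < mz_edges[i + 1]) &
                   (log_inten >= it_edges[j]) & (log_inten < it_edges[j + 1]))
            surf[i, j] = np.median(err_ppm[sel]) if sel.any() else 0.0
    return mz_edges, it_edges, surf

def correct_mz(mz_obs, log_inten, model):
    """Subtract the modeled ppm bias from new observations."""
    mz_edges, it_edges, surf = model
    i = np.clip(np.searchsorted(mz_edges, mz_obs) - 1, 0, surf.shape[0] - 1)
    j = np.clip(np.searchsorted(it_edges, log_inten) - 1, 0, surf.shape[1] - 1)
    return mz_obs * (1.0 - surf[i, j] * 1e-6)
```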

  8. Flood delineation from synthetic aperture radar data with the help of a priori knowledge from historical acquisitions and digital elevation models in support of near-real-time flood mapping

    NASA Astrophysics Data System (ADS)

    Schlaffer, Stefan; Hollaus, Markus; Wagner, Wolfgang; Matgen, Patrick

    2012-10-01

    The monitoring of flood events with synthetic aperture radar (SAR) sensors has attracted a considerable amount of attention during the last decade, owing to the growing interest in using spaceborne data for near-real-time flood management. Most existing methods for classifying flood extent from SAR data rely on pure image processing techniques. In this paper, we propose a method involving a priori knowledge about an area, taken from a multitemporal time series and a digital elevation model (DEM). A time series of ENVISAT ASAR acquisitions was geocoded and coregistered. Then, a harmonic model was fitted to each pixel time series. The standardised residuals of the model were classified as flooded when exceeding a certain threshold value. Additionally, the classified flood extent was limited to flood-prone areas derived from a freely available DEM using the height above nearest drainage (HAND) index. Comparison with two different reference datasets for two different flood events showed that the approach yielded realistic results but underestimated the inundation extent. Among the possible reasons for this are the rather coarse resolution of 150 m and the sparse data coverage for a substantial part of the time series. Nevertheless, the study shows the potential for producing rapid overviews in near real time in support of early response to flood crises.
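
    The per-pixel harmonic step might look like the sketch below, for a single pixel with acquisition days t and backscatter values sigma0. The annual-harmonic design matrix and the sign of the threshold (flooding lowering backscatter over land) are assumptions layered on the abstract.

```python
import numpy as np

def harmonic_residual_flags(t, sigma0, k=2.0):
    """Fit sigma0(t) ~ a0 + a1*sin(wt) + a2*cos(wt) with w = 2*pi/365.25 and
    flag acquisitions whose standardised residual falls below -k."""
    omega = 2 * np.pi / 365.25
    A = np.column_stack([np.ones_like(t), np.sin(omega * t), np.cos(omega * t)])
    coef, *_ = np.linalg.lstsq(A, sigma0, rcond=None)
    resid = sigma0 - A @ coef
    z = resid / resid.std(ddof=A.shape[1])
    return z < -k    # candidate flood dates; then intersect with the HAND mask
```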

  9. An Approach for the Long-Term 30-m Land Surface Snow-Free Albedo Retrieval from Historic Landsat Surface Reflectance and MODIS-based A Priori Anisotropy Knowledge

    NASA Technical Reports Server (NTRS)

    Shuai, Yanmin; Masek, Jeffrey G.; Gao, Feng; Schaaf, Crystal B.; He, Tao

    2014-01-01

    Land surface albedo has been recognized by the Global Terrestrial Observing System (GTOS) as an essential climate variable crucial for accurate modeling and monitoring of the Earth's radiative budget. While global climate studies can leverage albedo datasets from MODIS, VIIRS, and other coarse-resolution sensors, many applications in heterogeneous environments can benefit from higher-resolution albedo products derived from Landsat. We previously developed a "MODIS-concurrent" approach for 30-meter albedo estimation which relied on combining post-2000 Landsat data with MODIS Bidirectional Reflectance Distribution Function (BRDF) information. Here we present a "pre-MODIS era" approach to extend 30-m surface albedo generation back in time to the 1980s, through an a priori anisotropy Look-Up Table (LUT) built from high quality MCD43A BRDF estimates over representative homogeneous regions. Each entry in the LUT reflects a unique combination of land cover, seasonality, terrain information, disturbance age and type, and Landsat optical spectral bands. An initial conceptual LUT was created for the Pacific Northwest (PNW) of the United States and provides BRDF shapes estimated from MODIS observations for undisturbed and disturbed surface types (including recovery trajectories of burned areas and non-fire disturbances). By accepting the assumption of a generally invariant BRDF shape for similar land surface structures as a priori information, spectral white-sky and black-sky albedos are derived through albedo-to-nadir reflectance ratios as a bridge between the Landsat and MODIS scales. A further narrow-to-broadband conversion based on radiative transfer simulations is adopted to produce broadband albedos in the visible, near-infrared, and shortwave regimes. We evaluate the accuracy of the resultant Landsat albedo using available field measurements at forested AmeriFlux stations in the PNW region, and examine the consistency of the surface albedo generated by this approach
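
    The albedo-to-nadir ratio bridge might be expressed as below; the LUT key and the per-band dictionaries are hypothetical placeholders for the product's actual schema.

```python
def landsat_albedo(nadir_refl, brdf_lut, key):
    """Scale Landsat nadir (directional) reflectance to spectral albedo with
    white-sky / black-sky albedo-to-nadir ratios looked up by surface class.
    nadir_refl: {band: reflectance}; brdf_lut[key] -> (ratio_ws, ratio_bs)."""
    ratio_ws, ratio_bs = brdf_lut[key]
    white_sky = {band: ratio_ws[band] * r for band, r in nadir_refl.items()}
    black_sky = {band: ratio_bs[band] * r for band, r in nadir_refl.items()}
    return white_sky, black_sky   # broadband follows via narrow-to-broadband fit
```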

  10. Measurement, coordination, and the relativized a priori

    NASA Astrophysics Data System (ADS)

    Padovani, Flavia

    2015-11-01

    The problem of measurement is a central issue in the epistemology and methodology of the physical sciences. In recent literature on scientific representation, large emphasis has been put on the "constitutive role" played by measurement procedures as forms of representation. Despite its importance, this issue hardly finds any mention in writings on constitutive principles, viz. in Michael Friedman's account of relativized a priori principles. This issue, instead, was at the heart of Reichenbach's analysis of coordinating principles that has inspired Friedman's interpretation. This paper suggests that these procedures should have a part in an account of constitutive principles of science, and that they could be interpreted following the intuition originally present (but ultimately not fully developed) in Reichenbach's early work.

  11. Predictive a priori pressure-dependent kinetics.

    PubMed

    Jasper, Ahren W; Pelzer, Kenley M; Miller, James A; Kamarchik, Eugene; Harding, Lawrence B; Klippenstein, Stephen J

    2014-12-01

    The ability to predict the pressure dependence of chemical reaction rates would be a great boon to kinetic modeling of processes such as combustion and atmospheric chemistry. This pressure dependence is intimately related to the rate of collision-induced transitions in energy E and angular momentum J. We present a scheme for predicting this pressure dependence based on coupling trajectory-based determinations of moments of the E,J-resolved collisional transfer rates with the two-dimensional master equation. This completely a priori procedure provides a means for proceeding beyond the empiricism of prior work. The requisite microcanonical dissociation rates are obtained from ab initio transition state theory. Predictions for the CH4 = CH3 + H and C2H3 = C2H2 + H reaction systems are in excellent agreement with experiment. PMID:25477457

  12. Conventional Principles in Science: On the foundations and development of the relativized a priori

    NASA Astrophysics Data System (ADS)

    Ivanova, Milena; Farr, Matt

    2015-11-01

    The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the

  13. Comparison of a priori calibration models for respiratory inductance plethysmography during running.

    PubMed

    Leutheuser, Heike; Heyde, Christian; Gollhofer, Albert; Eskofier, Bjoern M

    2014-01-01

    Respiratory inductive plethysmography (RIP) has been introduced as an alternative for measuring ventilation by means of body surface displacement (diameter changes of the rib cage and abdomen). Using a posteriori calibration, it has been shown that RIP may provide accurate measurements of ventilatory tidal volume under exercise conditions. Methods for a priori calibration would facilitate the application of RIP. Currently, to the best knowledge of the authors, none of the existing ambulant procedures for RIP calibration can be used a priori for valid subsequent measurements of ventilatory volume under exercise conditions. The purpose of this study is to develop and validate a priori calibration algorithms for ambulant application of RIP data recorded during running exercise. We calculated volume motion coefficients (VMCs) using seven different models on resting data and compared the root mean squared error (RMSE) of each model applied to running data. Least squares approximation (LSQ) without offset of a two-degree-of-freedom model achieved the lowest RMSE. In this work, we showed that a priori calibration of RIP exercise data is possible using VMCs calculated from a 5 min resting phase in which RIP and flowmeter measurements were performed simultaneously. The results demonstrate that RIP has the potential for use in ambulant applications. PMID:25571459
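
    The favored model above (least squares, two degrees of freedom, no offset) is only a few lines, assuming resting-phase rib cage and abdomen excursion arrays plus a simultaneous flowmeter volume signal:

```python
import numpy as np

def fit_vmc(rc, ab, volume):
    """Solve volume ~ m_rc * rc + m_ab * ab (no offset) by least squares;
    returns the two volume-motion coefficients."""
    X = np.column_stack([rc, ab])
    vmc, *_ = np.linalg.lstsq(X, volume, rcond=None)
    return vmc

def predict_volume(rc, ab, vmc):
    return np.column_stack([rc, ab]) @ vmc

# usage: vmc = fit_vmc(rc_rest, ab_rest, flow_rest)
#        vt_running = predict_volume(rc_run, ab_run, vmc)
```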

  14. Knowledge.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on knowledge includes annotated listings of Web sites, CD-ROMs and computer software, videos, books, and additional resources that deal with knowledge and differences between how animals and humans learn. Sidebars discuss animal intelligence, learning proper behavior, and getting news from the Internet. (LRW)

  15. "A Priori" Assessment of Language Learning Tasks by Practitioners

    ERIC Educational Resources Information Center

    Westhoff, Gerard J.

    2009-01-01

    Teachers' competence to estimate the effectiveness of learning materials is important and often neglected in programmes for teacher education. In this lecture I will try to explore the possibilities of designing scaffolding instruments for an "a priori" assessment of language learning tasks, based on insights from SLA and cognitive psychology, more…

  16. The Influence of "a priori" Ideas on the Experimental Approach.

    ERIC Educational Resources Information Center

    Cauzinille-Marmeche, Evelyne; And Others

    1985-01-01

    Investigated the role of "a priori" ideas in planning experiments and data processing leading to inferences. Thirty-one students (ages 11-13) observed a "combustion/candle in a closed container" experiment and were asked to interpret sets of measurements. Findings, among others, show that children preferentially experiment on factors about which…

  17. Ex Priori: Exposure-based Prioritization across Chemical Space

    EPA Science Inventory

    EPA's Exposure Prioritization (Ex Priori) is a simplified, quantitative visual dashboard that makes use of data from various inputs to provide a rank-ordered internalized dose metric. This complements other high throughput screening by viewing exposures within all chemical space si...

  18. First-arrival traveltime sound speed inversion with a priori information

    PubMed Central

    Hooi, Fong Ming; Carson, Paul L.

    2014-01-01

    Purpose: A first-arrival travel-time sound speed algorithm presented by Tarantola [Inverse Problem Theory and Methods for Model Parameter Estimation (SIAM, Philadelphia, PA, 2005)] is adapted to the medical ultrasonics setting. Through specification of a covariance matrix for the object model, the algorithm allows for natural inclusion of physical a priori information of the object. The algorithm's ability to accurately and robustly reconstruct a complex sound speed distribution is demonstrated on simulation and experimental data using a limited aperture. Methods: The algorithm is first demonstrated generally in simulation with a numerical breast phantom imaged in different geometries. As this work is motivated by the authors' limited aperture dual sided ultrasound breast imaging system, experimental data are acquired with a Verasonics system with dual, 128 element, linear L7-4 arrays. The transducers are automatically calibrated for usage in the eikonal forward model. A priori information such as knowledge of correlated regions within the object is obtained via segmentation of B-mode images generated from synthetic aperture imaging. Results: As one illustration of the algorithm's facility for inclusion of a priori information, physically grounded regularization is demonstrated in simulation. The algorithm's practicality is then demonstrated through experimental realization in limited aperture cases. Reconstructions of sound speed distributions of various complexity are improved through inclusion of a priori information. The sound speed maps are generally reconstructed with accuracy within a few m/s. Conclusions: This paper demonstrates the ability to form sound speed images using two opposed commercial linear arrays to mimic ultrasound image acquisition in the compressed mammographic geometry. The ability to create reasonably good speed of sound images in the compressed mammographic geometry allows images to be readily coregistered to tomosynthesis image volumes for
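
    The covariance-based formulation cited above (Tarantola, 2005) has a compact linear-Gaussian form. A sketch, assuming a linearized traveltime operator G, data d, prior model m0, and data/model covariances Cd and Cm; the segmentation-derived correlated regions would enter through the off-diagonal structure of Cm.

```python
import numpy as np

def tarantola_update(G, d, m0, Cd, Cm):
    """MAP model estimate for a linear(ized) Gaussian inverse problem:
    m = m0 + (G^T Cd^-1 G + Cm^-1)^-1 G^T Cd^-1 (d - G m0)."""
    A = G.T @ np.linalg.solve(Cd, G) + np.linalg.inv(Cm)
    b = G.T @ np.linalg.solve(Cd, d - G @ m0)
    return m0 + np.linalg.solve(A, b)
```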

  19. Bioluminescence tomography with structural and functional a priori information

    NASA Astrophysics Data System (ADS)

    Yan, Han; Unlu, Mehmet B.; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-02-01

    Multispectral bioluminescence tomography (BLT) is one of the more promising approaches to recover 3D tomographic images of the bioluminescence source distribution in vivo. In bioluminescence tomography, an internal light source, such as luciferase, is activated within a volume, and multiple-wavelength emission data from the internal bioluminescence sources are acquired for reconstruction. The underlying non-uniqueness problem associated with non-spectrally-resolved, intensity-based bioluminescence tomography was demonstrated by Dehghani et al., and it was also shown that, using a spectrally resolved technique, an accurate solution for the source distribution can be calculated from the measured data if both functional and anatomical a priori information are at hand. It is therefore highly desirable to develop an imaging system capable of simultaneously acquiring both the optical and structural a priori information as well as the bioluminescence data. In this paper we present our first combined optical tomography and CT system, which consists of a cooled CCD camera (PerkinElmer "cold blue"), laser launching units, and an X-ray CT (Dxray prototype). It is capable of acquiring non-contact diffuse optical tomography (DOT) data, which provide the functional a priori information; X-ray CT images, which yield the structural information; and BLT images. Physical phantom experiments are designed to verify the system's accuracy, repeatability, and resolution. These studies show the feasibility of such an imaging system and its potential.

  20. A priori estimates for the Hill and Dirac operators

    NASA Astrophysics Data System (ADS)

    Korotyaev, E.

    2008-09-01

    The Hill operator Ty = -y″ + q′(t)y is considered in L²(ℝ), where q ∈ L²(0, 1) is a periodic real potential. The spectrum of T is absolutely continuous and consists of bands separated by gaps. We obtain a priori estimates of gap lengths, effective masses, and action variables for the KdV equation. In the proof of these results, the analysis of a conformal mapping corresponding to the quasimomentum of the Hill operator is used. Similar estimates for the Dirac operator are obtained.

  21. Fluorescence molecular-tomography reconstruction with a priori anatomical information

    NASA Astrophysics Data System (ADS)

    Zhou, Lu; Yazici, Birsen; Ntziachristos, Vasilis

    2008-02-01

    In this study, we combine a generalized Tikhonov regularization method with a priori anatomical information to reconstruct the concentration of fluorophores in mice with chronic obstructive pulmonary disease (COPD) from in vivo optical and magnetic resonance (MR) measurements. Generalized Tikhonov regularization incorporates a penalty term in the optimization formulation of the fluorescence molecular tomography (FMT) inverse problem. Our design involves two penalty terms that make use of a priori anatomical structural information from segmented MR images. The choice of penalty terms guides the reconstructed fluorophore concentration toward the region where it is expected to be, assures a smooth fluorophore distribution within tissue of the same type, and enhances the discontinuities between different tissue types. We compare our results with traditional Tikhonov regularization techniques in extensive simulations and demonstrate the performance of our approach on in vivo mouse data. The results show that the increased fluorophore concentration in the mouse lungs is consistent with the increased inflammatory response expected from the corresponding animal disease model.
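
    A sketch of a two-penalty generalized Tikhonov solve with hypothetical operator names; in the paper the penalty operators are built from the segmented MR anatomy (smoothness within a tissue class, preserved discontinuities across classes).

```python
import numpy as np

def fmt_two_penalty(J, y, L_smooth, L_edge, a1=1e-2, a2=1e-2):
    """Minimize ||J x - y||^2 + a1 ||L_smooth x||^2 + a2 ||L_edge x||^2.
    J: sensitivity matrix, y: measurements; a1, a2 are assumed weights."""
    A = J.T @ J + a1 * (L_smooth.T @ L_smooth) + a2 * (L_edge.T @ L_edge)
    return np.linalg.solve(A, J.T @ y)
```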

  22. A priori discretization quality metrics for distributed hydrologic modeling applications

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Tolson, Bryan; Craig, James; Shafii, Mahyar; Basu, Nandita

    2016-04-01

    In distributed hydrologic modelling, a watershed is treated as a set of small homogeneous units that address the spatial heterogeneity of the watershed being simulated. The ability of models to reproduce observed spatial patterns depends first on the spatial discretization, which is the process of defining homogeneous units in the form of grid cells, subwatersheds, hydrologic response units, etc. It is common for hydrologic modelling studies to simply adopt a nominal or default discretization strategy without formally assessing alternative discretization levels. This approach lacks formal justification and is thus problematic. More formalized discretization strategies are either a priori or a posteriori with respect to building and running a hydrologic simulation model. A posteriori approaches tend to be ad hoc and compare model calibration and/or validation performance under various watershed discretizations. The construction and calibration of multiple versions of a distributed model can become a severely limiting computational burden. Current a priori approaches are more formalized and compare overall heterogeneity statistics of dominant variables between candidate discretization schemes and input data or reference zones. While a priori approaches are efficient and do not require running a hydrologic model, they do not fully investigate the internal spatial pattern changes of variables of interest. Furthermore, the existing a priori approaches focus on landscape and soil data and do not assess the impacts of discretization on stream channel definition, even though its significance has been noted by numerous studies. The primary goals of this study are to (1) introduce new a priori discretization quality metrics that consider the spatial pattern changes of model input data; and (2) introduce a two-step discretization decision-making approach to compress extreme errors and meet user-specified discretization expectations through non-uniform discretization threshold

  23. Using a priori information for regularization in breast microwave image reconstruction.

    PubMed

    Ashtari, Ali; Noghanian, Sima; Sabouni, Abas; Aronsson, Jonatan; Thomas, Gabriel; Pistorius, Stephen

    2010-09-01

    Regularization methods are used in microwave image reconstruction problems, which are ill-posed. Traditional regularization methods are usually problem-independent and do not take advantage of a priori information specific to any particular imaging application. In this paper, a novel problem-dependent regularization approach is introduced for the application of breast imaging. A real genetic algorithm (RGA) minimizes a cost function that is the error between the recorded and the simulated data. At each iteration of the RGA, a priori information about the shape of breast profiles is used by a neural network classifier to reject solutions that cannot be a map of the dielectric properties of a breast profile. The algorithm was tested against four realistic numerical breast phantoms: a mostly fatty, a scattered fibroglandular, a heterogeneously dense, and a very dense sample. The tests were also repeated with a 4 mm × 4 mm tumor inserted in the fibroglandular tissue of each of the four breast types. The results show the effectiveness of the proposed approach, which to the best of our knowledge has the highest resolution amongst the evolutionary algorithms used for the inversion of realistic numerical breast phantoms. PMID:20562033
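
    The coupling of the genetic search with the learned shape prior might look like this sketch; the blend crossover, Gaussian mutation, and population handling are generic RGA choices, not the authors' specific operators, and the classifier is assumed to accept a reasonable fraction of candidates.

```python
import random

def rga_with_prior(fitness, classifier, init_pop, n_gen=100, mut=0.05):
    """classifier(x) -> True only for candidate dielectric maps that look like
    breast profiles; implausible solutions are rejected before the expensive
    forward-model fitness is ever evaluated."""
    pop = [p for p in init_pop if classifier(p)]
    for _ in range(n_gen):
        pop.sort(key=fitness)                       # lower error is better
        parents = pop[: max(2, len(pop) // 2)]
        children = []
        while len(children) < len(pop):
            a, b = random.sample(parents, 2)
            w = random.random()
            child = [w * xa + (1 - w) * xb for xa, xb in zip(a, b)]
            child = [x + random.gauss(0, mut) for x in child]
            if classifier(child):                   # a priori shape rejection
                children.append(child)
        pop = children
    return min(pop, key=fitness)
```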

  24. A priori physicalism, lonely ghosts and Cartesian doubt.

    PubMed

    Goff, Philip

    2012-06-01

    A zombie is a physical duplicate of a human being which lacks consciousness. A ghost is a phenomenal duplicate of a human being whose nature is exhausted by consciousness. Discussion of zombie arguments, that is, anti-physicalist arguments which appeal to the conceivability of zombies, is familiar in the philosophy of mind literature, whilst ghostly arguments, that is, anti-physicalist arguments which appeal to the conceivability of ghosts, are somewhat neglected. In this paper I argue that ghostly arguments have a number of dialectical advantages over zombie arguments. I go on to explain how the conceivability of ghosts is inconsistent with two kinds of a priori physicalism: analytic functionalism and the Australian physicalism of Armstrong and Lewis. PMID:21459620

  25. Deconvolution in line scanners using a priori information

    NASA Astrophysics Data System (ADS)

    Wirnitzer, Bernhard; Spraggon-Hernandez, Tadeo

    2002-12-01

    In a digital camera the MTF of the optical system must include a low-pass filter in order to avoid aliasing. The MTF of incoherent imaging is, usually and in principle, far from an ideal low-pass. Theoretically, a digital ARMA filter can be used to compensate for this drawback. In practice, such deconvolution filters suffer from instability because of time-variant noise and the space-variance of the MTF. In addition, in a line scanner the MTF in the scan direction differs slightly in each scanned image. Therefore inverse filtering will not operate satisfactorily in an unknown environment. A new concept is presented which solves both problems using a priori information about an object, e.g., that parts of it are known to be binary. This information is enough to achieve a stable space- and time-variant ARMA deconvolution filter. The best results are achieved using nonlinear filtering and pattern feedback. The new method was used to improve the bit error rate (BER) of a high-density matrix-code scanner by more than one order of magnitude. An audio scanner is demonstrated which reads 12 seconds of music in CD quality from an audio-coded image of 18 mm × 55 mm.

  26. Perfusion from angiogram and a priori (PAP) with temporal regularization

    NASA Astrophysics Data System (ADS)

    Taguchi, Katsuyuki; Geschwind, Jean-Francois H.

    2009-02-01

    Perfusion imaging is often used for diagnosis and for assessment of the response to treatment. If perfusion could be measured during interventional procedures, it could lead to quantitative, more efficient, and more accurate treatment; however, imaging modalities that allow continuous dynamic scanning are not available in most procedure rooms. Thus, we developed a method to measure the perfusion time-attenuation curves (TACs) of regions of interest (ROIs) using an x-ray C-arm angiography system with no gantry rotation, but with a priori information. A previous study revealed a problem of large oscillations in the estimated TACs and lacked a comparison with CT-based approaches. Thus the purposes of this study were (1) to reduce the variance of the TACs and (2) to compare the performance of the improved PAP method with that of the CT-based perfusion method. Our computer simulation study showed that the standard deviation of the PAP method was decreased by 10.7-59.0% and that it outperformed CT methods acquired at 20 or 200 times higher dose in terms of accuracy, variance, and temporal resolution.

  27. Precise regional baseline estimation using a priori orbital information

    NASA Technical Reports Server (NTRS)

    Lindqwister, Ulf J.; Lichten, Stephen M.; Blewitt, Geoffrey

    1990-01-01

    A solution using GPS measurements acquired during the CASA Uno campaign has resulted in 3-4 mm horizontal daily baseline repeatability and 13 mm vertical repeatability for a 729 km baseline located in North America. The agreement with VLBI is at the level of 10-20 mm for all components. The results were obtained with the GIPSY orbit determination and baseline estimation software and are based on five single-day data arcs spanning January 20, 21, 25, 26, and 27, 1988. The estimation strategy included resolving the carrier-phase integer ambiguities, utilizing an optimal set of fixed reference stations, and constraining GPS orbit parameters by applying a priori information. A multiday GPS orbit and baseline solution has yielded similar 2-4 mm horizontal daily repeatabilities for the same baseline, consistent with the constrained single-day arc solutions. The application of weak constraints to the orbital state for single-day data arcs produces solutions which approach the precise orbits obtained with unconstrained multiday arc solutions.

  28. A no a priori knowledge estimation of the impulse response for satellite image noise reduction

    NASA Astrophysics Data System (ADS)

    Benbouzid, A. B.; Taleb, N.

    2015-04-01

    Due to launch vibrations and the harsh space environment, high resolution remote sensing satellite imaging systems require permanent assessment and control of image quality, which may vary between ground pre-launch measurements, after launch, and over the satellite's lifetime. In order to mitigate noise, remove artifacts, and enhance image interpretability, the Point Spread Function (PSF) of the imaging system is estimated. Image deconvolution can be performed through characterization of the actual Modulation Transfer Function (MTF) of the imaging system. In this work we focus on adapting and applying a no-reference method to characterize in-orbit high resolution satellite images in terms of geometrical performance. Moreover, we use natural details contained in images, such as edge transitions, to estimate the impulse response via assessment of the MTF image. The obtained results are encouraging and promising.

  29. A priori precision estimation for neutron triples counting

    SciTech Connect

    Croft, S.; Swinhoe, M. T.; Henzl, V.

    2011-07-01

    The nondestructive assay of plutonium-bearing items for criticality, safety, security, safeguards, inventory balance, process control, waste management, and compliance is often undertaken using correlated neutron counting. In particular, Multiplicity Shift Register analysis allows one to extract autocorrelation parameters from the pulse train which can, within the framework of a simple interpretational model, be related to the effective ²⁴⁰Pu spontaneous fission mass present. The effective ²⁴⁰Pu mass is a weighted sum of the ²³⁸Pu, ²⁴⁰Pu, and ²⁴²Pu masses, so if the relative isotopic composition of the Pu can be established, from the measured effective ²⁴⁰Pu mass one can estimate the total Pu mass and also the masses of the individual isotopes, for example the fissile species ²³⁹Pu and ²⁴¹Pu. In multiplicity counting, three counting rates are obtained. These are the Singles, Doubles, and Triples rates. The Singles rate is just the gross, totals, or trigger rate. The Doubles and Triples rates are calculated from factorial moments of the observed signal-triggered neutron multiplicity distributions following spontaneous fission in the item, and can be thought of as the rates of observed coincident pairs and coincident triplets on the pulse train. Coincident events come about because the spontaneous fission and induced fission chains taking place in the item result in bursts of neutrons. These remain time-correlated during the detection process and so retain information, through the burst size distribution, about the Pu content. In designing and assessing the performance of a detector system to meet a given goal it is necessary to make a priori estimates of the counting precision for all three kinds of rates. This is non-trivial because the counting does not obey the familiar rules of a Poissonian counting experiment: the pulse train has time-correlated events on it, and the train is sampled by event-triggered gates that may

  30. The A Priori Ideological Orientation of Schools in Kibbutzim in Israel.

    ERIC Educational Resources Information Center

    Gross, Zehavit

    This paper examines the a priori ideological orientation of pupils in two different types of schools in the kibbutzim in Israel: the movement schools (Hatakam or Hashomer Hatzair) and the mixed schools. The paper attempts to show how different educational circumstances and environments develop a distinct a priori ideological orientation in…

  31. Validating Affordances as an Instrument for Design and a Priori Analysis of Didactical Situations in Mathematics

    ERIC Educational Resources Information Center

    Sollervall, Håkan; Stadler, Erika

    2015-01-01

    The aim of the presented case study is to investigate how coherent analytical instruments may guide the a priori and a posteriori analyses of a didactical situation. In the a priori analysis we draw on the notion of affordances, as artefact-mediated opportunities for action, to construct hypothetical trajectories of goal-oriented actions that have…

  32. Use of a priori statistics to minimize acquisition time for RFI immune spread spectrum systems

    NASA Technical Reports Server (NTRS)

    Holmes, J. K.; Woo, K. T.

    1978-01-01

    The optimum acquisition sweep strategy was determined for a PN code despreader when the a priori probability density function is not uniform. A pseudo-noise spread spectrum system was considered which could be utilized in the DSN to combat radio frequency interference. In a sample case, where the a priori probability density function was Gaussian, the acquisition time was reduced by about 41% compared to a uniform sweep approach.
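
    A toy model of why a prior-ordered sweep wins, assuming one dwell per cell and no missed detections: visiting code-phase cells in descending a priori probability minimizes the expected number of dwells before the true cell is hit.

```python
import numpy as np

def expected_dwells(prior):
    """Expected dwell count for (a) a sweep ordered by descending prior and
    (b) a plain left-to-right sweep, under the toy assumptions above."""
    order = np.argsort(prior)[::-1]
    ranks = np.empty(len(prior), dtype=int)
    ranks[order] = np.arange(1, len(prior) + 1)   # visit position of each cell
    uniform = np.arange(1, len(prior) + 1)
    return float(prior @ ranks), float(prior @ uniform)

cells = np.arange(1000)
prior = np.exp(-0.5 * ((cells - 500) / 60.0) ** 2)   # Gaussian-shaped prior
prior /= prior.sum()
print(expected_dwells(prior))   # prioritized sweep finishes far sooner on average
```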

  33. Lost in space: Onboard star identification using CCD star tracker data without an a priori attitude

    NASA Technical Reports Server (NTRS)

    Ketchum, Eleanor A.; Tolson, Robert H.

    1993-01-01

    There are many algorithms in use today which determine spacecraft attitude by identifying stars in the field of view of a star tracker. Some methods, which date from the early 1960s, compare the angular separation between observed stars with a small catalog. In the last 10 years, several methods have been developed which speed up the process and reduce the amount of memory needed, a key element for onboard attitude determination. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Although the Sun and magnetic field generally provide the necessary coarse attitude information, there are occasions when a spacecraft could become lost when it is not prudent to wait for sunlight. Also, the possibility of efficient attitude determination using only the highly accurate CCD star tracker could lead to fully autonomous spacecraft attitude determination. The need for redundant coarse sensors could thus be eliminated at substantial cost reduction. Some groups have extended their algorithms to implement a computation-intensive full sky scan. Some require large databases. Both storage and speed are concerns for autonomous onboard systems. Neural network technology is even being explored by some as a possible solution, but because of the limited number of patterns that can be stored and the large overhead, nothing concrete has resulted from these efforts. This paper presents an algorithm which, by discretizing the sky and filtering by the visual magnitude of the brightest observed star, speeds up the lost-in-space star identification process while reducing the amount of onboard computer storage required compared to existing techniques.
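
    A simplified stand-in for a discretized, magnitude-filtered pair catalog; the 8-degree field-of-view limit, 0.5-magnitude bin, and 0.01-degree separation quantum are assumed values, not those of the paper.

```python
import numpy as np
from collections import defaultdict

def build_index(catalog, fov=8.0, mag_step=0.5, sep_step=0.01):
    """catalog: list of (unit_vector, visual_magnitude). Index star pairs by
    (magnitude bin of the brighter star, quantized angular separation)."""
    index = defaultdict(list)
    for i, (v1, m1) in enumerate(catalog):
        for j in range(i + 1, len(catalog)):
            v2, m2 = catalog[j]
            sep = np.degrees(np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0)))
            if sep < fov:
                key = (round(min(m1, m2) / mag_step), round(sep / sep_step))
                index[key].append((i, j))
    return index

def candidates(index, obs1, obs2, mag_step=0.5, sep_step=0.01):
    """Look up catalog pairs consistent with one observed star pair."""
    (v1, m1), (v2, m2) = obs1, obs2
    sep = np.degrees(np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0)))
    return index.get((round(min(m1, m2) / mag_step), round(sep / sep_step)), [])
```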

  34. Efficient a priori identification of drug resistant mutations using Dead-End Elimination and MM-PBSA.

    PubMed

    Safi, Maria; Lilien, Ryan H

    2012-06-25

    Active site mutations that disrupt drug binding are an important mechanism of drug resistance. Computational methods capable of predicting resistance a priori are poised to become extremely useful tools in the fields of drug discovery and treatment design. In this paper, we describe an approach to predicting drug resistance on the basis of Dead-End Elimination and MM-PBSA that requires no prior knowledge of resistance. Our method utilizes a two-pass search to identify mutations that impair drug binding while maintaining affinity for the native substrate. We use our method to probe resistance in four drug-target systems: isoniazid-enoyl-ACP reductase (tuberculosis), ritonavir-HIV protease (HIV), methotrexate-dihydrofolate reductase (breast cancer and leukemia), and gleevec-ABL kinase (leukemia). We validate our model using clinically known resistance mutations for all four test systems. In all cases, the model correctly predicts the majority of known resistance mutations. PMID:22651699

  35. Impact of A Priori Gradients on VLBI-Derived Terrestrial Reference Frames

    NASA Astrophysics Data System (ADS)

    Böhm, J.; Spicakova, H.; Urquhart, L.; Steigenberger, P.; Schuh, H.

    2011-07-01

    Tropospheric gradients are usually estimated in the analysis of space geodetic observations to account for the azimuthal asymmetry of troposphere delays. Whereas some analysis centres use a priori gradients for the analysis of Very Long Baseline Interferometry (VLBI) observations, no a priori information is generally applied in the analysis of Global Navigation Satellite Systems (GNSS) observations. We introduce a spherical harmonic expansion of total gradients derived from climatology data of the European Centre for Medium-Range Weather Forecasts (ECMWF), and we compare it to the gradients which have been determined for selected VLBI sites from data of the Data Assimilation Office (DAO) at Goddard Space Flight Center. The latter are usually applied in VLBI analysis. We show the effect of using both types of a priori gradients on the terrestrial and celestial reference frames which are determined from GNSS and VLBI analysis.

  36. Bayesian classification of polarimetric SAR images using adaptive a priori probabilities

    NASA Technical Reports Server (NTRS)

    Van Zyl, J. J.; Burnette, C. F.

    1992-01-01

    The problem of classifying Earth terrain from observed polarimetric scattering properties is tackled with an iterative Bayesian scheme that uses a priori probabilities adaptively. The first classification is based on fixed, and not necessarily equal, a priori probabilities, and successive iterations update the a priori probabilities adaptively. The approach is applied to an SAR image in which a single water body covers 10 percent of the image area. The classification accuracies for ocean, urban, vegetated, and total area increase, and the percentage of reclassified pixels decreases greatly, as the iteration number increases. The iterative scheme improves the a posteriori classification accuracy of maximum likelihood classifiers by iteratively exploiting the local homogeneity in polarimetric SAR images. A few iterations can improve the classification accuracy significantly without sacrificing key high-frequency detail or edges in the image.
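
    The adaptive-prior iteration can be sketched as follows, given a per-pixel, per-class likelihood array; the neighborhood radius, the uniform box filter, and the small floor on the priors are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_prior_classify(likelihood, n_iter=3, radius=2, eps=1e-6):
    """likelihood: H x W x C array of class likelihoods. Start from flat
    priors, relabel, then set each pixel's priors to the class frequencies
    of its neighborhood and repeat."""
    h, w, c = likelihood.shape
    prior = np.full((h, w, c), 1.0 / c)
    for _ in range(n_iter):
        labels = (prior * likelihood).argmax(axis=2)
        onehot = np.eye(c)[labels]
        freq = uniform_filter(onehot, size=(2 * radius + 1, 2 * radius + 1, 1),
                              mode="nearest") + eps
        prior = freq / freq.sum(axis=2, keepdims=True)
    return labels
```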

  37. Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes

    PubMed Central

    D’Alessandro, Annunziata; De Pergola, Giovanni

    2015-01-01

    The aim of this paper is to analyze the a priori dietary indexes used in studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but they show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. The a priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and therefore it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently. PMID:26389950

  38. Unequal a priori probability multiple hypothesis testing in space domain awareness with the space surveillance telescope.

    PubMed

    Hardy, Tyler; Cain, Stephen; Blake, Travis

    2016-05-20

    This paper investigates the ability to improve Space Domain Awareness (SDA) by increasing the number of Resident Space Objects (RSOs) detectable by space surveillance sensors. In matched-filter-based techniques, the expected impulse response, or Point Spread Function (PSF), is compared against the received data. When the images are spatially undersampled, the modeled PSF may not match the received data if the RSO does not fall in the center of a pixel. This aliasing can be accounted for with a Multiple Hypothesis Test (MHT). Previously proposed MHTs have assumed equal a priori probabilities. This paper investigates an MHT with unequal a priori probabilities. To determine accurate a priori probabilities, three metrics are computed: correlation, physical distance, and an empirical metric. Using the calculated a priori probabilities, a new algorithm is developed, and images from the Space Surveillance Telescope (SST) are analyzed. The numbers of objects detected under equal and unequal prior probabilities are compared while keeping the false alarm rate constant. Any additional detected objects help improve SDA capabilities. PMID:27411129
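
    A matched-filter MHT with unequal priors reduces to picking the hypothesis with the largest posterior score; the sketch below assumes additive Gaussian noise and a list of sub-pixel-shifted PSF templates, and is not the SST pipeline itself.

```python
import numpy as np

def mht_detect(patch, psf_shifts, priors, noise_var, threshold):
    """patch: observed pixel block; psf_shifts: one template per sub-pixel
    shift hypothesis; priors: unequal a priori probabilities per hypothesis."""
    scores = []
    for tmpl, pri in zip(psf_shifts, priors):
        # Gaussian log-likelihood of "object with this shift", up to a constant
        ll = (patch * tmpl).sum() / noise_var - (tmpl * tmpl).sum() / (2 * noise_var)
        scores.append(ll + np.log(pri))
    best = int(np.argmax(scores))
    return best if scores[best] > threshold else None   # None: no detection
```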

  39. A priori L∞ estimates for solutions of a class of reaction-diffusion systems.

    PubMed

    Du, Zengji; Peng, Rui

    2016-05-01

    In this short paper, we establish a priori L∞-norm estimates for solutions of a class of reaction-diffusion systems which can be used to model the spread of infectious disease. The developed technique may find applications in other reaction-diffusion systems. PMID:26141826

  40. FORTRAN IV Program for Analysis of Covariance with A Priori or A Posteriori Mean Comparisons

    ERIC Educational Resources Information Center

    Fordyce, Michael W.

    1977-01-01

    A flexible Fortran program for computing a complete analysis of covariance is described. Requiring minimal core space, the program provides all group and overall summary statistics for the analysis, a test of homogeneity of regression, and all posttest mean comparisons for a priori or a posteriori testing. (Author/JKS)

  41. A priori estimates for the free boundary problem of incompressible neo-Hookean elastodynamics

    NASA Astrophysics Data System (ADS)

    Hao, Chengchun; Wang, Dehua

    2016-07-01

    A free boundary problem for the incompressible neo-Hookean elastodynamics is studied in two and three spatial dimensions. The a priori estimates in Sobolev norms of solutions with the physical vacuum condition are established from the geometrical point of view of Christodoulou and Lindblad (2000) [3]. Some estimates on the second fundamental form and velocity of the free surface are also obtained.

  42. Acquisition of priori tissue optical structure based on non-rigid image registration

    NASA Astrophysics Data System (ADS)

    Wan, Wenbo; Li, Jiao; Liu, Lingling; Wang, Yihan; Zhang, Yan; Gao, Feng

    2015-03-01

    Shape-parameterized diffuse optical tomography (DOT), which relies on the a priori assumption that the optical properties are uniformly distributed within each region, has proven effective for reconstructing the optical heterogeneities of complex biological tissue. The a priori tissue optical structure can be acquired with the assistance of anatomical imaging methods such as X-ray computed tomography (XCT), which, however, suffers from low contrast for soft tissues comprising regions of different optical characteristics. For the mouse model, a feasible strategy for acquiring the a priori tissue optical structure is proposed, based on a non-rigid image registration algorithm. During registration, a mapping matrix is calculated to elastically align the XCT image of the reference mouse to the XCT image of the target mouse. Applying the matrix to the reference atlas, which is a detailed mesh of the organs/tissues in the reference mouse, a registered atlas can be obtained as the anatomical structure of the target mouse. By assigning published optical parameters of each organ to the corresponding anatomical structure, the optical structure of the target organism can be obtained as a priori information for the DOT reconstruction algorithm. By applying the non-rigid image registration algorithm to a target mouse which is transformed from the reference mouse, the results show that the minimum correlation coefficient can be improved from 0.2781 (before registration) to 0.9032 (after fine registration), and the maximum average Euclidean distance can be decreased from 12.80 mm (before registration) to 1.02 mm (after fine registration), which verifies the effectiveness of the algorithm.

  43. APhoRISM FP7 project: the A Priori information for Earthquake damage mapping method

    NASA Astrophysics Data System (ADS)

    Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno

    2014-05-01

    The APhoRISM (Advanced PRocedure for volcanIc and Seismic Monitoring) project is an FP7-funded project which aims at developing and testing two new methods that combine Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new, improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes and concerns the generation of maps for detecting and estimating the damage caused by an earthquake. The method is named APE: A Priori information for Earthquake damage mapping. The use of satellite data to investigate earthquake damage is not in itself new. Indeed, a wide body of literature and several projects have addressed this issue, but the proposed approaches are usually based only on change detection techniques and/or classification algorithms. The novelty of APhoRISM-APE lies in the exploitation of a priori information derived from: - InSAR time series measuring surface movements - shakemaps obtained from seismological data - vulnerability information. This a priori information is then integrated with a change detection map from Earth Observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and to limit false alarms.

  44. A priori data-driven multi-clustered reservoir generation algorithm for echo state network.

    PubMed

    Li, Xiumin; Zhong, Ling; Xue, Fangzheng; Zhang, Anguo

    2015-01-01

    Echo state networks (ESNs) with a multi-clustered reservoir topology perform better in reservoir computing and are more robust than those with a random reservoir topology. However, these ESNs have a complex reservoir topology, which leads to difficulties in reservoir generation. This study focuses on the reservoir generation problem when an ESN is used in environments with sufficient a priori data available. Accordingly, an a priori data-driven multi-cluster reservoir generation algorithm is proposed. The a priori data in the proposed algorithm are used to evaluate reservoirs by calculating the precision and standard deviation of the ESNs. The reservoirs are produced using a clustering method; a new reservoir takes the place of the previous one only if it has a better evaluation performance. The final reservoir is obtained when its evaluation score reaches the preset requirement. Prediction experiments using the Mackey-Glass chaotic time series show that the proposed reservoir generation algorithm provides ESNs with extra prediction precision and increases the structural complexity of the network. Further experiments also reveal the appropriate values of the number of clusters and the time window size for optimal performance. The information entropy of the reservoir reaches its maximum when the ESN attains its greatest precision. PMID:25875296
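
    The accept-if-better generation loop reads directly as code; here make_clustered_W and evaluate stand for the paper's clustered-topology generator and its a priori data evaluation (precision/standard deviation of the trained ESN), both assumed to be supplied by the caller.

```python
import numpy as np

def generate_reservoir(make_clustered_W, evaluate, target_err, max_tries=200):
    """Propose random multi-clustered reservoirs; a proposal replaces the
    incumbent only if it evaluates better; stop when the preset requirement
    (target_err) is met or the budget runs out."""
    best_W, best_err = None, np.inf
    for _ in range(max_tries):
        W = make_clustered_W()
        err = evaluate(W)
        if err < best_err:
            best_W, best_err = W, err
        if best_err <= target_err:
            break
    return best_W, best_err
```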

  5. Examples of use of a-priori information to the inversion of AEM data

    NASA Astrophysics Data System (ADS)

    Viezzoli, A.; Munday, T. J.; Sapia, V.

    2012-12-01

    There is a growing focus in the international near-surface geophysical community on merging information (loosely termed "data") from different sources when modelling the subsurface. The use of a-priori data as extra input to the inversion of airborne electromagnetic (AEM) data is one illustrative example of this trend. It provides more robust results, for a number of reasons. The first is the capability to cross-check the geophysically derived model against ancillary information in a more quantitative and objective way than can be done a-posteriori. The second is that it mitigates the inherent non-uniqueness of the results of inverting geophysical data, which stems from the problem usually being ill posed. The third is the ever higher level of accuracy of the derived output sought by end users who, rightly so, demand results (either direct or derived) they can use directly for management. Last, but not least, is the drive to incorporate different physical parameters originating from different sources into one inversion problem, in order to derive directly, e.g., geological or hydrogeological models that fit all data sets at once. In this paper we present examples obtained by adding information from geophysics (i.e., seismic, surface and borehole geoelectric) and from geology (e.g., lithology) to the inversion of airborne EM data from different systems (e.g., VTEM, AeroTEM, SkyTEM, Resolve). Case studies come from several areas of the world with varied geological settings. In our formulation, the a-priori information is treated as nothing but an extra data set, carrying location, values, uncertainty, and expected lateral variability. The information it contains is spread to the locations of the neighbouring AEM soundings using the Spatially Constrained Inversion approach. Constraints and uncertainties usually differ depending on data type and geology. The case studies show the effect of the a-priori information on the inversion results.

  6. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    SciTech Connect

    Bielecki, J.; Scholz, M.; Drozdowicz, K.; Giacomelli, L.; Kiptily, V.; Kempenaars, M.; Conroy, S.; Craciunescu, T.; Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB

    2015-09-15

    A method for tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, only 19 lines of sight) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. This work aims to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high-resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction on JET.
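
    A minimal sketch of Phillips-Tikhonov-style regularization toward a prior profile shape, with non-negativity imposed during the solve rather than as post-processing. The stacked-system form and the NNLS solver are our assumptions, not necessarily the JET implementation.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def reconstruct(A, b, x_prior, lam):
        """min ||A x - b||^2 + lam * ||x - x_prior||^2  subject to  x >= 0.

        A       - (19, n_pixels) line-of-sight geometry matrix (KN3 camera)
        b       - measured line-integrated neutron signals
        x_prior - emissivity shape prior, e.g. the normalized electron
                  density profile scaled to the measured signal level
        lam     - regularization parameter (e.g. scanned over an L-curve)
        """
        n = A.shape[1]
        A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
        b_aug = np.concatenate([b, np.sqrt(lam) * x_prior])
        x, _ = nnls(A_aug, b_aug)   # non-negativity enforced inside the solver
        return x
    ```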

  7. SAC-SMA a priori parameter differences and their impact on distributed hydrologic model simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Ziya; Koren, Victor; Reed, Seann; Smith, Michael; Zhang, Yu; Moreda, Fekadu; Cosgrove, Brian

    2012-02-01

    Deriving a priori gridded parameters is an important step in the development and deployment of an operational distributed hydrologic model. Accurate a priori parameters can reduce the manual calibration effort and/or speed up the automatic calibration process, reduce calibration uncertainty, and provide valuable information at ungauged locations. Underpinned by reasonable parameter data sets, distributed hydrologic modeling can help improve water resource and flood and flash flood forecasting capabilities. Initial efforts at the National Weather Service Office of Hydrologic Development (NWS OHD) to derive a priori gridded Sacramento Soil Moisture Accounting (SAC-SMA) model parameters for the conterminous United States (CONUS) were based on a relatively coarse-resolution soils property database, the State Soil Geographic Database (STATSGO) (Soil Survey Staff, 2011), and on the assumption of uniform land use and land cover. In an effort to improve the parameters, subsequent work was performed to fully incorporate spatially variable land cover information into the parameter derivation process. Following that, finer-scale soils data (the county-level Soil Survey Geographic Database, SSURGO) (Soil Survey Staff, 2011a,b), together with the use of variable land cover data, were used to derive a third set of CONUS a priori gridded parameters. It is anticipated that the second and third parameter sets, which incorporate more physical data, will be more realistic and consistent. Here, we evaluate whether this is actually the case by intercomparing these three sets of a priori parameters along with their associated hydrologic simulations, which were generated by applying the National Weather Service Hydrology Laboratory's Research Distributed Hydrologic Model (HL-RDHM) (Koren et al., 2004) in a continuous fashion with an hourly time step. This model adopts a well-tested conceptual water balance model, SAC-SMA, applied on a regular spatial grid, and links to physically

  8. Lagrangian formulation and a priori estimates for relativistic fluid flows with vacuum

    NASA Astrophysics Data System (ADS)

    Jang, Juhi; LeFloch, Philippe G.; Masmoudi, Nader

    2016-03-01

    We study the evolution of a compressible fluid surrounded by vacuum and introduce a new symmetrization in Lagrangian coordinates that allows us to encompass both relativistic and non-relativistic fluid flows. The problem under consideration is a free boundary problem of central interest in compressible fluid dynamics and, from the mathematical standpoint, the main challenge to be overcome lies in the loss of regularity in the fluid variables near the free boundary. Based on our Lagrangian formulation, we establish the necessary a priori estimates in weighted Sobolev spaces which are adapted to this loss of regularity.

  9. Proportionality of Components, Liouville Theorems and a Priori Estimates for Noncooperative Elliptic Systems

    NASA Astrophysics Data System (ADS)

    Montaru, Alexandre; Sirakov, Boyan; Souplet, Philippe

    2014-07-01

    We study qualitative properties of positive solutions of noncooperative, possibly nonvariational, elliptic systems. We obtain new classification and Liouville type theorems in the whole Euclidean space, as well as in half-spaces, and deduce a priori estimates and the existence of positive solutions for related Dirichlet problems. We significantly improve the known results for a large class of systems involving a balance between repulsive and attractive terms. This class contains systems arising in biological models of Lotka-Volterra type, in physical models of Bose-Einstein condensates and in models of chemical reactions.

  10. A priori stability estimates and unconditionally stable product formula algorithms for nonlinear coupled thermoplasticity

    NASA Astrophysics Data System (ADS)

    Armero, F.; Simo, J. C.

    This article describes new a priori stability estimates for the full nonlinear system of coupled thermoplasticity at finite strains and presents a fractional step method leading to a new class of unconditionally stable staggered algorithms. These results are shown to hold for general models of multiplicative plasticity that include, as a particular case, the single-crystal model. The proposed product formula algorithm is designed via an entropy based operator split that yields one of the first known staggered algorithms that retains the property of nonlinear unconditional stability. The scheme employs an isentropic step, in which the total entropy is held constant, followed by a heat conduction step (with nonlinear source) at fixed configuration. The nonlinear stability analysis shows that the proposed staggered scheme inherits the a priori energy estimate for the continuum problem, regardless of the size of the time-step. In sharp contrast with these results, it is shown that widely used staggered methods employing an isothermal step followed by a heat conduction problem can be at most only conditionally stable. The excellent performance of the methodology is illustrated in representative numerical simulations.

  11. Determining the depth of certain gravity sources without a priori specification of their structural index

    NASA Astrophysics Data System (ADS)

    Zhou, Shuai; Huang, Danian

    2015-11-01

    We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori. We develop the new method by generalizing the Tilt-depth method for depth estimation for different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward-continuation heights, which can effectively reduce the influence of noise. Theoretical simulations of the gravity source model with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method to the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to previous drilling and seismic interpretation results.

  12. Source Reconstruction for Spectrally-resolved Bioluminescence Tomography with Sparse A priori Information

    PubMed Central

    Lu, Yujie; Zhang, Xiaoqun; Douraghy, Ali; Stout, David; Tian, Jie; Chan, Tony F.; Chatziioannou, Arion F.

    2009-01-01

    Through restoration of the light source information in small animals in vivo, optical molecular imaging, such as fluorescence molecular tomography (FMT) and bioluminescence tomography (BLT), can depict biological and physiological changes observed using molecular probes. A priori information plays an indispensable role in tomographic reconstruction. As a type of a priori information, the sparsity characteristic of the light source has not been sufficiently considered to date. In this paper, we introduce a compressed sensing method to develop a new tomographic algorithm for spectrally-resolved bioluminescence tomography. This method uses the sparse nature of the source to improve the reconstruction quality with a regularization implementation. Based on verification against the inverse crime, the proposed algorithm is validated with Monte Carlo-based synthetic data and compared with the popular Tikhonov regularization method. Testing with different noise levels and single/multiple source settings at different depths demonstrates the improved performance of this algorithm. Experimental reconstruction with a mouse-shaped phantom further shows the potential of the proposed algorithm. PMID:19434138
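
    The sparsity-promoting reconstruction can be illustrated with a basic iterative soft-thresholding (ISTA) solver for the l1-regularized problem; this is a generic compressed-sensing sketch, not the specific algorithm of the paper.

    ```python
    import numpy as np

    def ista(A, b, lam, n_iter=500):
        """Solve min 0.5 * ||A x - b||^2 + lam * ||x||_1 by soft thresholding.

        A   - system matrix mapping source voxels to surface measurements
        b   - spectrally-resolved boundary measurements
        lam - sparsity weight; larger values yield fewer active source voxels
        """
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            z = x - A.T @ (A @ x - b) / L      # gradient step on the data term
            x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
        return x
    ```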

  13. GNSS Precise Kinematic Positioning for Multiple Kinematic Stations Based on A Priori Distance Constraints.

    PubMed

    He, Kaifei; Xu, Tianhe; Förste, Christoph; Petrovic, Svetozar; Barthelmes, Franz; Jiang, Nan; Flechtner, Frank

    2016-01-01

    When applying the Global Navigation Satellite System (GNSS) for precise kinematic positioning in airborne and shipborne gravimetry, multiple GNSS receivers are often fix-mounted on the kinematic platform carrying the gravimetry instrumentation. Thus, the distances among these GNSS antennas are known and invariant. This information can be used to improve the accuracy and reliability of the state estimates. For this purpose, the known distances between the antennas are applied as a priori constraints within the state parameter adjustment. These constraints are introduced in such a way that their accuracy is taken into account. To test this approach, GNSS data from a Baltic Sea shipborne gravimetric campaign have been used. The results of our study show that applying distance constraints improves the accuracy of the GNSS kinematic positioning, for example, by about 4 mm for the radial component. PMID:27043580
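
    One common way to introduce such a constraint, which may differ in detail from the authors' adjustment, is to linearize the known inter-antenna distance and append it as a weighted pseudo-observation:

    ```python
    import numpy as np

    def distance_constraint_row(p1, p2, d_known):
        """Linearize ||p1 - p2|| = d_known around the current antenna positions.

        Returns the Jacobian row with respect to [dp1, dp2] (6 parameters) and
        the misclosure; both are appended to the epoch's observation equations
        with weight 1 / sigma_d**2.
        """
        dp = p1 - p2
        d0 = np.linalg.norm(dp)
        u = dp / d0                       # unit vector along the baseline
        jac = np.concatenate([u, -u])     # d(distance)/d(p1) and d(distance)/d(p2)
        return jac, d_known - d0

    # Example with made-up positions and a 5 mm a priori distance accuracy
    p1 = np.array([3.0e6, 1.0e6, 5.0e6])
    p2 = p1 + np.array([5.0, 2.0, 0.5])
    jac, misclosure = distance_constraint_row(p1, p2, d_known=5.43)
    weight = 1.0 / 0.005 ** 2
    ```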

  14. A Priori Estimates for Free Boundary Problem of Incompressible Inviscid Magnetohydrodynamic Flows

    NASA Astrophysics Data System (ADS)

    Hao, Chengchun; Luo, Tao

    2014-06-01

    In the present paper, we prove the a priori estimates of Sobolev norms for a free boundary problem of the incompressible inviscid magnetohydrodynamics equations in all physical spatial dimensions n = 2 and 3 by adopting a geometrical point of view used in Christodoulou and Lindblad (Commun Pure Appl Math 53:1536-1602, 2000), and estimating quantities such as the second fundamental form and the velocity of the free surface. We identify the well-posedness condition that the outer normal derivative of the total pressure including the fluid and magnetic pressures is negative on the free boundary, which is similar to the physical condition (Taylor sign condition) for the incompressible Euler equations of fluids.

  15. Microwave Radar Imaging of Heterogeneous Breast Tissue Integrating A Priori Information

    PubMed Central

    Kelly, Thomas N.; Sarafianou, Mantalena; Craddock, Ian J.

    2014-01-01

    Conventional radar-based image reconstruction techniques fail when they are applied to heterogeneous breast tissue, since the underlying in-breast relative permittivity is unknown or assumed to be constant. This results in a systematic error during the process of image formation. A recent trend in microwave biomedical imaging is to extract the relative permittivity from the object under test to improve the image reconstruction quality and thereby to enhance the diagnostic assessment. In this paper, we present a novel radar-based methodology for microwave breast cancer detection in heterogeneous breast tissue integrating a 3D map of relative permittivity as a priori information. This leads to a novel image reconstruction formulation where the delay-and-sum focusing takes place in time rather than range domain. Results are shown for a heterogeneous dense (class-4) and a scattered fibroglandular (class-2) numerical breast phantom using Bristol's 31-element array configuration. PMID:25435861
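
    The time-domain delay-and-sum formulation can be sketched as below, with per-voxel delays computed from the a priori permittivity map. The straight-ray, path-averaged-permittivity and monostatic simplifications are ours; the paper's full multistatic formulation is more involved.

    ```python
    import numpy as np

    C0 = 299792458.0  # free-space speed of light [m/s]

    def das_image(signals, fs, antennas, voxels, path_eps):
        """Time-domain delay-and-sum using an a priori permittivity map.

        signals  - (n_ant, n_samples) recorded time traces
        fs       - sampling rate [Hz]
        antennas - (n_ant, 3) antenna positions [m]
        voxels   - (n_vox, 3) focal points [m]
        path_eps - callable(p, q) -> mean relative permittivity along the
                   straight ray from p to q, read from the a priori 3D map
        """
        n_ant, n_samp = signals.shape
        image = np.zeros(len(voxels))
        for i, v in enumerate(voxels):
            acc = 0.0
            for a in range(n_ant):
                dist = np.linalg.norm(v - antennas[a])
                eps = path_eps(antennas[a], v)
                delay = 2.0 * dist * np.sqrt(eps) / C0   # two-way travel time
                idx = int(round(delay * fs))
                if idx < n_samp:
                    acc += signals[a, idx]               # coherent sum at the delay
            image[i] = acc ** 2                          # focal-point energy
        return image
    ```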

  16. A priori mesh quality metrics for three-dimensional hybrid grids

    SciTech Connect

    Kallinderis, Y.; Fotia, S.

    2015-01-01

    Use of general hybrid grids to attain complex-geometry field simulations poses a challenge for the estimation of their quality. Apart from the typical problems of non-uniformity and non-orthogonality, the change in element topology is an extra issue to address. The present work derives and evaluates an a priori mesh quality indicator for structured, unstructured, as well as hybrid grids consisting of hexahedra, prisms, tetrahedra, and pyramids. Emphasis is placed on deriving a direct relation between the quality measure and mesh distortion. The work is based on the use of the Finite Volume discretization for the evaluation of first-order spatial derivatives. The analytic form of the truncation error is derived and applied to elementary types of mesh distortion, including typical hybrid grid interfaces. The corresponding analytic expressions provide relations between the truncation error and the degree of stretching, skewness, shearing, torsion, expansion, as well as the type of grid interface.

  17. A priori analysis: an application to the estimate of the uncertainty in course grades

    NASA Astrophysics Data System (ADS)

    Lippi, G. L.

    2014-07-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual but striking application is presented while teaching the data-treatment section of a laboratory course, to illustrate the characteristics of the APA and its potential for widespread use beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out, and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper.

  1. A Priori Bound on the Velocity in Axially Symmetric Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Lei, Zhen; Navas, Esteban A.; Zhang, Qi S.

    2016-01-01

    Let v be the velocity of Leray-Hopf solutions to the axially symmetric three-dimensional Navier-Stokes equations. Under suitable conditions on the initial values, we prove the a priori bound |v(x, t)| ≤ C |ln r|^{1/2} / r^2 for 0 < r ≤ 1/2, where r is the distance from x to the z axis and C is a constant depending only on the initial value. This provides a pointwise upper bound (worst-case scenario) for possible singularities, while the recent papers (Chen et al., Commun PDE 34(1-3):203-232, 2009; Koch et al., Acta Math 203(1):83-105, 2009) gave a lower bound. The gap is polynomial of order 1, modulo a half log term.

  2. Rapid multi-wavelength optical assessment of circulating blood volume without a priori data

    NASA Astrophysics Data System (ADS)

    Loginova, Ekaterina V.; Zhidkova, Tatyana V.; Proskurnin, Mikhail A.; Zharov, Vladimir P.

    2016-03-01

    The measurement of circulating blood volume (CBV) is crucial in various medical conditions including surgery, iatrogenic problems, rapid fluid administration, transfusion of red blood cells, or trauma with extensive blood loss, including battlefield injuries and other emergencies. Currently available commercial techniques are invasive and time-consuming for trauma situations. Recently, we have proposed high-speed multi-wavelength photoacoustic/photothermal (PA/PT) flow cytometry for in vivo CBV assessment with multiple dyes as PA contrast agents (labels). As a first step, we have characterized the capability of this technique to monitor the clearance of three dyes (indocyanine green, methylene blue, and trypan blue) in an animal model. However, there are strong demands for improvements in PA/PT flow cytometry. As additional verification of our proof of concept of this technique, we performed optical photometric CBV measurements in vitro. Three label dyes (methylene blue, crystal violet and, partially, brilliant green) were selected for simultaneous photometric determination of the components of their two-dye mixtures in circulating blood in vitro without any extra data (such as hemoglobin absorption) known a priori. Tests of single dyes and their mixtures in a flow system simulating a blood transfusion system showed a negligible difference between the sensitivities of the determination of these dyes under batch and flow conditions. For the individual dyes, limits of detection of 3×10⁻⁶ M in blood were achieved, which provided their continuous determination at a level of 10⁻⁵ M for the CBV assessment without a priori data on the matrix. CBV assessments with errors no higher than 4% were obtained, and the possibility of applying the developed procedure to optical photometry (flow cytometry) with laser sources was shown.

  3. A priori-defined Diet Quality Indexes and Risk of Type 2 diabetes: The Multiethnic Cohort

    PubMed Central

    Jacobs, Simone; Harmon, Brook E.; Boushey, Carol J.; Morimoto, Yukiko; Wilkens, Lynne R.; Le Marchand, Loic; Kröger, Janine; Schulze, Matthias B.; Kolonel, Laurence N.; Maskarinec, Gertraud

    2014-01-01

    Aim: Dietary patterns have been associated with type 2 diabetes incidence, but little is known about the impact of ethnicity on this relation. This study evaluated the association of four a priori dietary quality indexes with type 2 diabetes risk among whites, Japanese Americans, and Native Hawaiians in the Hawaii component of the Multiethnic Cohort (MEC). Methods: After excluding participants with prevalent diabetes and missing values, the analysis included 89,185 participants (11,217 cases). Dietary intake was assessed at baseline with a quantitative food frequency questionnaire designed for use in the relevant ethnic populations. Sex- and ethnicity-specific hazard ratios were calculated for the Healthy Eating Index-2010 (HEI-2010), the alternative HEI-2010 (AHEI-2010), the alternate Mediterranean diet score (aMED), and the Dietary Approaches to Stop Hypertension (DASH). Results: We observed significant inverse associations between higher scores of the DASH index and type 2 diabetes risk in white men and women, as well as in Japanese American women and Native Hawaiian men, with respective risk reductions of 37, 31, 19 and 21% (highest compared to lowest index category). A higher adherence to the AHEI-2010 and aMED diet was related to a 13-28% lower type 2 diabetes risk in white participants but not in the other ethnic groups. No significant associations with type 2 diabetes risk were observed for the HEI-2010 index. Conclusions: The small ethnic differences in type 2 diabetes risk associated with scores of a priori-defined dietary patterns may be due to different consumption patterns of food components and the fact that the original indexes were not based on Asians and Pacific Islanders. PMID:25319012

  4. Knowledge based SAR images exploitations

    NASA Astrophysics Data System (ADS)

    Wang, David L.

    1987-01-01

    One of the basic functions of a SAR image exploitation system is the detection of man-made objects. The performance of object detection is strongly limited by the performance of the segmentation modules. This paper presents a detection paradigm composed of an adaptive segmentation algorithm based on a priori knowledge of the objects, followed by a top-down hierarchical detection process that generates and evaluates object hypotheses. Shadow information and inter-object relationships can be added to the knowledge base to improve performance over that of a statistical detector based only on the attributes of individual objects.

  5. A-Priori Rupture Models for Northern California Type-A Faults

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J., II; Field, Edward H.

    2008-01-01

    This appendix describes how a-priori rupture models were developed for the northern California Type-A faults. As described in the main body of this report, and in Appendix G, 'a-priori' models represent an initial estimate of the rate of single and multi-segment surface ruptures on each fault. Whether or not a given model is moment balanced (i.e., satisfies section slip-rate data) depends on assumptions made regarding the average slip on each segment in each rupture (which in turn depends on the chosen magnitude-area relationship). Therefore, for a given set of assumptions, or branch on the logic tree, the methodology of the present Working Group (WGCEP-2007) is to find a final model that is as close as possible to the a-priori model, in the least-squares sense, but that also satisfies slip-rate and perhaps other data. This is analogous to the WGCEP-2002 approach of effectively voting on the relative rate of each possible rupture, and then finding the closest moment-balanced model (under a more limiting set of assumptions than adopted by the present WGCEP, as described in detail in Appendix G). The 2002 Working Group Report (WGCEP, 2003; referred to here as WGCEP-2002) created segmented earthquake rupture forecast models for all faults in the region, including some that had been designated as Type B faults in the NSHMP, 1996, and one that had not previously been considered. The 2002 National Seismic Hazard Maps used the values from WGCEP-2002 for all the faults in the region, essentially treating all the listed faults as Type A faults. As discussed in Appendix A, the current WGCEP found that there are a number of faults with little or no data on slip-per-event or dates of previous earthquakes. As a result, the WGCEP recommends that the faults with minimal available earthquake recurrence data (the Greenville, Mount Diablo, San Gregorio, Monte Vista-Shannon and Concord-Green Valley) be modeled as Type B faults, to be consistent with similarly poorly-known faults statewide

  6. Plasma Ascorbic Acid, A Priori Diet Quality Score, and Incident Hypertension: A Prospective Cohort Study

    PubMed Central

    Buijsse, Brian; Jacobs, David R.; Steffen, Lyn M.; Kromhout, Daan; Gross, Myron D.

    2015-01-01

    Vitamin C may reduce risk of hypertension, either in itself or by marking a healthy diet pattern. We assessed whether plasma ascorbic acid and the a priori diet quality score relate to incident hypertension and whether they explain each other's predictive abilities. Data were from 2884 black and white adults (43% black, mean age 35 years) initially hypertension-free in the Coronary Artery Risk Development in Young Adults Study (study year 10, 1995-1996). Plasma ascorbic acid was assessed at year 10 and the diet quality score at year 7. Eight hundred forty cases of hypertension were documented between years 10 and 25. After multiple adjustments, each 12-point (1 SD) higher diet quality score at year 7 related to a mean 3.7 μmol/L (95% CI 2.9 to 4.6) higher plasma ascorbic acid at year 10. In separate multiple-adjusted Cox regression models, the hazard ratio of hypertension per 19.6-μmol/L (1 SD) higher ascorbic acid was 0.85 (95% CI 0.79-0.92) and per 12-point higher diet score 0.86 (95% CI 0.79-0.94). These hazard ratios changed little with mutual adjustment of ascorbic acid and diet quality score for each other, or when adjusted for anthropometric variables, diabetes, and systolic blood pressure at year 10. Intake of dietary vitamin C and several food groups high in vitamin C content were inversely related to hypertension, whereas supplemental vitamin C was not. In conclusion, plasma ascorbic acid and the a priori diet quality score independently predict hypertension. This suggests that hypertension risk is reduced by improving overall diet quality and/or vitamin C status. The inverse association seen for dietary but not for supplemental vitamin C suggests that vitamin C status is preferably improved by eating foods rich in vitamin C, in addition to not smoking and other dietary habits that prevent depletion of ascorbic acid. PMID:26683190

  7. Development of a combined multifrequency MRI-DOT system for human breast imaging using a priori information

    NASA Astrophysics Data System (ADS)

    Thayer, David; Liu, Ning; Unlu, Burcin; Chen, Jeon-Hor; Su, Min-Ying; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-02-01

    Breast cancer is a significant cause of mortality and morbidity among women, with early diagnosis being vital to successful treatment. Diffuse Optical Tomography (DOT) is an emerging medical imaging modality that provides information complementary to current screening modalities such as MRI and mammography, and may improve the specificity in determining cancer malignancy. Using high-resolution anatomic images as a priori information improves the accuracy of DOT. Measurements are presented characterizing the performance of our system. Preliminary data are also shown illustrating the use of a priori MRI data in phantom studies.

  8. On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori

    SciTech Connect

    Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer

    2006-01-30

    Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in biased predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for Nevada and the Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling, by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of the DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC), which are used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties.
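
    The weighting-and-averaging step can be sketched with the standard information-criterion approximation to posterior model probabilities; the numeric values below are made up for illustration, not the study's results.

    ```python
    import numpy as np

    def posterior_model_probs(ic_values, prior_probs):
        """Posterior model probabilities from an information criterion (here KIC).

        Uses the standard approximation p(M_k | D) ~ prior_k * exp(-0.5 * dIC_k),
        with dIC_k = IC_k - min(IC), then normalizes.
        """
        d = np.asarray(ic_values, float) - np.min(ic_values)
        w = np.asarray(prior_probs, float) * np.exp(-0.5 * d)
        return w / w.sum()

    def averaged_prediction(weights, means, variances):
        """BMA mean and variance (law of total expectation and variance)."""
        means, variances = np.asarray(means), np.asarray(variances)
        mu = np.sum(weights * means)
        var = np.sum(weights * (variances + (means - mu) ** 2))
        return mu, var

    # Five recharge models: elicited priors and calibrated KIC values (made up)
    w = posterior_model_probs([210.4, 215.1, 209.8, 230.0, 212.3],
                              [0.30, 0.20, 0.20, 0.10, 0.20])
    ```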

  9. A Priori Attitudes Predict Amniocentesis Uptake in Women of Advanced Maternal Age: A Pilot Study.

    PubMed

    Grinshpun-Cohen, Julia; Miron-Shatz, Talya; Rhee-Morris, Laila; Briscoe, Barbara; Pras, Elon; Towner, Dena

    2015-01-01

    Amniocentesis is an invasive procedure performed during pregnancy to determine, among other things, whether the fetus has Down syndrome. It is often preceded by screening, which gives a probabilistic risk assessment. Thus, ample information is conveyed to women with the goal of informing their decisions. This study examined the factors that predict amniocentesis uptake among pregnant women of advanced maternal age (older than 35 years at the time of childbirth). Participants filled out a questionnaire regarding risk estimates, demographics, and attitudes on screening and pregnancy termination before their first genetic counseling appointment, and were followed up to 24 weeks of gestation. Findings show that women's decisions are not always informed by screening results or by having a medical indication. Psychological factors measured at the beginning of pregnancy (amniocentesis risk tolerance, pregnancy termination tolerance, and age risk perception) affected amniocentesis uptake. Although most women thought that screening for Down syndrome risk would inform their decision, they later stated other reasons for screening, such as preparing for the possibility of a child with special needs. The findings suggest that women's decisions regarding amniocentesis are driven not only by medical factors but also by a priori attitudes. The authors believe that these should be addressed in the dialogue on women's informed use of prenatal tests. PMID:26065331

  10. The application of a priori structural information based regularization in image reconstruction in magnetic induction tomography

    NASA Astrophysics Data System (ADS)

    Dekdouk, B.; Ktistis, C.; Yin, W.; Armitage, D. W.; Peyton, A. J.

    2010-04-01

    Magnetic induction tomography (MIT) is a non-invasive, contactless modality that could be capable of imaging the conductivity distribution of biological tissues. In this paper we consider the possibility of using absolute MIT voltage measurements for monitoring the progress of a peripheral hemorrhagic stroke in a human brain. The pathology is modelled as a local blood accumulation in the white matter. The solution of the MIT inverse problem is nonlinear and ill-posed and hence requires the use of a regularization method. In this paper, we describe the construction, and present the performance, of a regularization matrix based on a priori structural information about the head tissues obtained from a very recent MRI scan. The method takes the MRI scan as an initial state of the stroke and constructs a learning set containing the possible conductivity distributions of the current state of the stroke. These data are used to calculate an approximation of the covariance matrix, and a subspace is then constructed using principal component analysis (PCA). Simulations in a simplified model of the head show that the method produces a more representative reconstruction of a stroke than smoothing Tikhonov regularization.
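
    The PCA step and its use in a subspace-restricted inversion can be sketched as follows; the Gauss-Newton-type solve and the parameter names are our simplifications, not the paper's exact scheme.

    ```python
    import numpy as np

    def pca_subspace(learning_set, n_components):
        """Build a conductivity subspace from a learning set of stroke states.

        learning_set - (n_samples, n_voxels) plausible conductivity
                       distributions generated from the MRI-derived state.
        Returns the mean image and the leading principal directions.
        """
        mean = learning_set.mean(axis=0)
        X = learning_set - mean
        _, _, Vt = np.linalg.svd(X, full_matrices=False)  # rows of Vt: PCs
        return mean, Vt[:n_components]

    def solve_in_subspace(J, dv, mean, V, lam=1e-3):
        """Regularized step restricted to sigma = mean + V.T @ c.

        J  - (n_meas, n_voxels) sensitivity (Jacobian) of the MIT forward model
        dv - residual between measured and modeled voltages
        """
        Jc = J @ V.T                                  # Jacobian w.r.t. coefficients
        c = np.linalg.solve(Jc.T @ Jc + lam * np.eye(V.shape[0]), Jc.T @ dv)
        return mean + V.T @ c
    ```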

  11. DNS/LES of turbulent flow in a square duct: A priori evaluation of subgrid models

    NASA Astrophysics Data System (ADS)

    O'Sullivan, Peter L.; Biringen, Sedat; Huser, Asmund

    We have performed a priori tests of two dynamic subgrid-scale (SGS) turbulence models using a highly resolved direct numerical simulation (DNS) database for the case of incompressible flow in a straight duct of square cross-section. The model testing is applied only to the homogeneous flow direction, where grid filtering can be applied without the introduction of commutation errors. The first model is the dynamic (Smagorinsky/eddy viscosity) SGS model (DSM) developed by Germano et al. [5], while the second is the dynamic two-parameter (mixed) model (DTM) developed by Salvetti and Banerjee [2]. As found in prior studies of this sort, there is a very poor correlation between the modelled and exact subgrid-scale dissipation in the case of the DSM. The DSM over-predicts subgrid-scale dissipation on average; instantaneously, the model provides an inaccurate representation of subgrid-scale dissipation, in general underestimating the magnitude by approximately one order of magnitude. On the other hand, the DTM shows excellent agreement with the exact SGS dissipation over most of the duct cross-section, with a correlation coefficient of approximately 0.9.
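
    The core of such an a priori test, computing the exact SGS stress from filtered DNS fields and correlating it with a model prediction, can be sketched as below; the top-hat filter and the restriction to one stress component are our simplifications.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def box_filter(f, width):
        """Top-hat grid filter of the given width (cells), periodic boundaries."""
        return uniform_filter(f, size=width, mode="wrap")

    def exact_sgs_stress_12(u, v, width):
        """tau_12 = bar(u v) - bar(u) bar(v), computed from the DNS velocity."""
        return box_filter(u * v, width) - box_filter(u, width) * box_filter(v, width)

    def correlation(exact, model):
        """Field-wide correlation coefficient C(E, M) of exact vs. modeled stress."""
        e = exact - exact.mean()
        m = model - model.mean()
        return float((e * m).sum() / np.sqrt((e ** 2).sum() * (m ** 2).sum()))
    ```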

  12. A Priori Analyses of Three Subgrid-Scale Models for One-Parameter Families of Filters

    NASA Technical Reports Server (NTRS)

    Pruett, C. David; Adams, Nikolaus A.

    1998-01-01

    The decay of isotropic turbulence in a compressible flow is examined by direct numerical simulation (DNS). A priori analyses of the DNS data are then performed to evaluate three subgrid-scale (SGS) models for large-eddy simulation (LES): a generalized Smagorinsky model (M1), a stress-similarity model (M2), and a gradient model (M3). The models exploit one-parameter second- or fourth-order filters of Pade type, which permit the cutoff wavenumber k_c to be tuned independently of the grid increment Δx. The modeled (M) and exact (E) SGS stresses are compared component-wise by correlation coefficients of the form C(E,M) computed over the entire three-dimensional fields. In general, M1 correlates poorly against the exact stresses (C < 0.2), M3 correlates moderately well (C ≈ 0.6), and M2 correlates remarkably well (0.8 < C < 1.0). Specifically, correlations C(E,M2) are high provided the grid and test filters are of the same order. Moreover, the highest correlations (C ≈ 1.0) result whenever the grid and test filters are identical (in both order and cutoff). Finally, the present results reveal the exact SGS stresses obtained by grid filters of differing orders to be only moderately well correlated. Thus, in LES the model should not be specified independently of the filter.

  13. Optical diffraction tomography in fluid velocimetry: the use of a priori information

    NASA Astrophysics Data System (ADS)

    Lobera, J.; Coupland, J. M.

    2008-07-01

    Holographic particle image velocimetry (HPIV) has been used successfully to make three-dimensional, three-component flow measurements from holographic recordings of seeded fluid. It is clear that measurements can only be made in regions that contain particles, but simply adding more seeding results in poor-quality images due to the effects of multiple scattering. In this paper, we describe optical diffraction tomography (ODT) techniques and consider their use as a means to overcome the problems of multiple scattering in HPIV. We consider several approaches to tomographic reconstruction that are essentially based on linear and nonlinear combinations of holographic reconstructions of the scattered fields observed under varied illuminating conditions. We show that linear reconstruction provides images of highest fidelity, but none of these methods properly accounts for the effects of multiple scattering. We go on to consider nonlinear optimization methods in ODT that attempt to minimize the error between the scattered field computed from an estimate of the particle distribution and that measured in practice. We describe an optimization procedure based on the conjugate gradient method (CGM) that makes use of a priori information (the size and refractive index of the seeding particles) to effectively reduce the problem to that of finding the set of particle locations. 2D numerical experiments are presented and promising results are shown.

  14. A Second Order Expansion of the Separatrix Map for Trigonometric Perturbations of a Priori Unstable Systems

    NASA Astrophysics Data System (ADS)

    Guardia, M.; Kaloshin, V.; Zhang, J.

    2016-07-01

    In this paper we study the so-called separatrix map introduced by Zaslavskii-Filonenko (Sov Phys JETP 27:851-857, 1968) and studied by Treschev (Physica D 116(1-2):21-43, 1998; J Nonlinear Sci 12(1):27-58, 2002), Piftankin (Nonlinearity 19:2617-2644, 2006), and Piftankin and Treshchëv (Uspekhi Mat Nauk 62(2(374)):3-108, 2007). We derive a second order expansion of this map for trigonometric perturbations. In Castejon et al. (Random iteration of maps of a cylinder and diffusive behavior. Preprint available at arXiv:1501.03319, 2015), Guardia and Kaloshin (Stochastic diffusive behavior through big gaps in a priori unstable systems (in preparation), 2015), and Kaloshin et al. (Normally Hyperbolic Invariant Laminations and diffusive behavior for the generalized Arnold example away from resonances. Preprint available at http://www.terpconnect.umd.edu/vkaloshi/, 2015), applying the results of the present paper, we describe a class of nearly integrable deterministic systems with stochastic diffusive behavior.

  15. A Priori Analysis of Flamelet-Based Modeling for a Dual-Mode Scramjet Combustor

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; McDaniel, James C.; Drozda, Tomasz G.; Lacaze, Guilhem; Oefelein, Joseph

    2014-01-01

    An a priori investigation of the applicability of flamelet-based combustion models to dual-mode scramjet combustion was performed utilizing Reynolds-averaged simulations (RAS). For this purpose, the HIFiRE Direct Connect Rig (HDCR) flowpath, fueled with a JP-7 fuel surrogate and operating in dual- and scram-mode, was considered. The chemistry of the JP-7 fuel surrogate was modeled using a 22-species, 18-step chemical reaction mechanism. Simulation results were compared to experimentally obtained, time-averaged wall pressure measurements to validate the RAS solutions. The analysis of the dual-mode operation of this flowpath showed regions of predominately non-premixed, high-Damkohler-number combustion. Regions of premixed combustion were also present but associated with only a small fraction of the total heat release in the flow. This is in contrast to the scram-mode operation, where a comparable amount of heat is released from the non-premixed and premixed combustion modes. Representative flamelet boundary conditions were estimated by analyzing probability density functions of temperature and pressure for pure fuel and oxidizer conditions. The results of the present study reveal the potential for a flamelet model to accurately model the combustion processes in the HDCR and likely other high-speed flowpaths of engineering interest.

  16. A quantum question order model supported by empirical tests of an a priori and precise prediction.

    PubMed

    Wang, Zheng; Busemeyer, Jerome R

    2013-10-01

    Question order effects are commonly observed in self-report measures of judgment and attitude. This article develops a quantum question order model (the QQ model) to account for four types of question order effects observed in literature. First, the postulates of the QQ model are presented. Second, an a priori, parameter-free, and precise prediction, called the QQ equality, is derived from these mathematical principles, and six empirical data sets are used to test the prediction. Third, a new index is derived from the model to measure similarity between questions. Fourth, we show that in contrast to the QQ model, Bayesian and Markov models do not generally satisfy the QQ equality and thus cannot account for the reported empirical data that support this equality. Finally, we describe the conditions under which order effects are predicted to occur, and we review a broader range of findings that are encompassed by these very same quantum principles. We conclude that quantum probability theory, initially invented to explain order effects on measurements in physics, appears to be a powerful natural explanation for order effects of self-report measures in social and behavioral sciences, too. PMID:24027203
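
    For reference, the QQ equality tested in the paper can be stated compactly (notation ours): in the A-then-B order, the total probability of the two "mixed" answer sequences must equal the corresponding total in the B-then-A order.

    ```latex
    % QQ equality: p_{AB}(i,j) is the probability of answering i to the first
    % question (A) and j to the second (B); p_{BA} is defined analogously.
    \[
      p_{AB}(\text{yes},\text{no}) + p_{AB}(\text{no},\text{yes})
      = p_{BA}(\text{yes},\text{no}) + p_{BA}(\text{no},\text{yes})
    \]
    ```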

  17. Double-difference traveltime tomography with edge-preserving regularization and a priori interfaces

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Syracuse, Ellen M.; Maceira, Monica; Zhang, Haijiang; Larmat, Carene

    2015-05-01

    Conventional traveltime seismic tomography methods with Tikhonov regularization (L2 norm) typically produce smooth models, but these models may be inappropriate when the subsurface structure contains discontinuous features, such as faults or fractures, indicating that tomographic models should contain sharp boundaries. For this reason, we develop a double-difference (DD) traveltime tomography method that uses a modified total-variation regularization scheme, incorporating a priori information on interfaces, to preserve sharp property contrasts and obtain accurate inversion results. In order to solve the inversion problem, we employ an alternating minimization method to decouple the original DD tomography problem into two separate subproblems: a conventional DD tomography with Tikhonov regularization and an L2 total-variation inversion. We use the LSQR linear solver to solve the Tikhonov inversion and the split-Bregman iterative method to solve the total-variation inversion. Through our numerical examples, we show that our new DD tomography method yields more accurate results than the conventional DD tomography method at almost the same computational cost.
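
    The overall alternating structure might look like the sketch below, where an off-the-shelf TV denoising step stands in for the paper's split-Bregman solve and a damped LSQR step for the Tikhonov subproblem; the parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.sparse.linalg import lsqr
    from skimage.restoration import denoise_tv_chambolle

    def alternating_dd_inversion(G, d, shape, damp=1.0, tv_weight=0.05, n_outer=10):
        """Alternate a damped-LSQR (Tikhonov) solve with a TV denoising step.

        G     - (n_data, n_cells) double-difference traveltime sensitivity matrix
        d     - double-difference traveltime residuals
        shape - (ny, nx) slowness-model grid; an a priori interface could be
                honored by lowering tv_weight across the known boundary
        """
        m = np.zeros(G.shape[1])
        for _ in range(n_outer):
            dm = lsqr(G, d - G @ m, damp=damp)[0]   # Tikhonov-regularized update
            m = m + dm
            # edge-preserving step: TV denoising of the current model
            m = denoise_tv_chambolle(m.reshape(shape), weight=tv_weight).ravel()
        return m.reshape(shape)
    ```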

  18. FORTRAN IV Program for One-Way Analysis of Variance with A Priori or A Posteriori Mean Comparisons

    ERIC Educational Resources Information Center

    Fordyce, Michael W.

    1977-01-01

    A flexible Fortran program for computing one way analysis of variance is described. Requiring minimal core space, the program provides a variety of useful group statistics, all summary statistics for the analysis, and all mean comparisons for a priori or a posteriori testing. (Author/JKS)

  19. Musical Probabilities, Abductive Reasoning, and Brain Mechanisms: Extended Perspective of "A Priori" Listening to Music within the Creative Cognition Approach

    ERIC Educational Resources Information Center

    Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis

    2013-01-01

    A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…

  20. A priori noise and regularization in least squares collocation of gravity anomalies

    NASA Astrophysics Data System (ADS)

    Jarmołowski, Wojciech

    2013-12-01

    The paper describes the estimation of covariance parameters in least squares collocation (LSC) by the cross-validation (CV) technique called leave-one-out (LOO). Two parameters of the Gauss-Markov third-order model (GM3) are estimated together with the a priori noise standard deviation, which contributes significantly to the covariance matrix composed of signal and noise. Numerical tests are performed using a large set of Bouguer gravity anomalies located in the central part of the U.S.; around 103,000 gravity stations are available in the selected area. This dataset, together with regular grids generated from the EGM2008 geopotential model, gives an opportunity to work with various spatial resolutions of the data and heterogeneous variances of the signal and noise. This plays a crucial role in the numerical investigations, because the spatial resolution of the gravity data determines the number of gravity details that we may observe and model. This establishes a relation between the spatial resolution of the data and the resolution of the gravity field model. This relation is inspected in the article and compared to the regularization problem occurring frequently in data modeling.
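
    A minimal sketch of the LOO idea for tuning the covariance parameters is given below; the GM3 function is written in its commonly used third-order Gauss-Markov form, and the parameterization is our assumption rather than the paper's exact notation.

    ```python
    import numpy as np

    def gm3_covariance(r, c0, alpha):
        """Third-order Gauss-Markov model, in the commonly used form
        C(r) = c0 * (1 + a*r + (a*r)^2 / 3) * exp(-a*r)."""
        ar = alpha * r
        return c0 * (1.0 + ar + ar ** 2 / 3.0) * np.exp(-ar)

    def loo_rms(points, values, c0, alpha, noise_std):
        """Leave-one-out RMS of LSC prediction for one (c0, alpha, noise) triple;
        the triple minimizing this score is selected."""
        n = len(values)
        r = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        C = gm3_covariance(r, c0, alpha)           # signal covariance
        Cn = C + noise_std ** 2 * np.eye(n)        # signal + noise covariance
        errors = []
        for k in range(n):
            idx = np.delete(np.arange(n), k)
            w = np.linalg.solve(Cn[np.ix_(idx, idx)], values[idx])
            errors.append(values[k] - C[k, idx] @ w)   # LSC prediction error at k
        return np.sqrt(np.mean(np.square(errors)))
    ```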

  1. An a priori DNS study of the shadow-position mixing model

    DOE PAGES Beta

    Zhao, Xin -Yu; Bhagatwala, Ankit; Chen, Jacqueline H.; Haworth, Daniel C.; Pope, Stephen B.

    2016-01-15

    The modeling of mixing by molecular diffusion is a central aspect of transported probability density function (tPDF) methods. In this paper, the newly-proposed shadow position mixing model (SPMM) is examined, using a DNS database for a temporally evolving di-methyl ether slot jet flame. Two methods that invoke different levels of approximation are proposed to extract the shadow displacement (equivalent to shadow position) from the DNS database. An approach for a priori analysis of the mixing-model performance is developed. The shadow displacement is highly correlated with both mixture fraction and velocity, and the peak correlation coefficient of the shadow displacement and mixture fraction is higher than that of the shadow displacement and velocity. This suggests that the composition-space localness is reasonably well enforced by the model, with appropriate choices of model constants. The conditional diffusion of mixture fraction and major species from DNS and from SPMM are then compared, using mixing rates derived by matching the mixture fraction scalar dissipation rates. Good qualitative agreement is found for the prediction of the locations of zero and maximum/minimum conditional diffusion for mixture fraction and individual species. Similar comparisons are performed for DNS and the IECM (interaction by exchange with the conditional mean) model. The agreement between SPMM and DNS is better than that between IECM and DNS, in terms of conditional diffusion iso-contour similarities and global normalized residual levels. It is found that a suitable value for the model constant c that controls the mixing frequency can be derived using the local normalized scalar variance, and that the model constant a controls the localness of the model. A higher-Reynolds-number test case is anticipated to be more appropriate for evaluating the mixing models, and stand-alone transported PDF simulations are required to more fully enforce

  2. Quantifying the sensitivity of aerosol optical depths retrieved from MSG SEVIRI to a priori data

    NASA Astrophysics Data System (ADS)

    Bulgin, C. E.; Palmer, P. I.; Merchant, C. J.; Siddans, R.; Poulsen, C.; Grainger, R. G.; Thomas, G.; Carboni, E.; McConnell, C.; Highwood, E.

    2009-12-01

    Radiative forcing contributions from aerosol direct and indirect effects remain one of the most uncertain components of the climate system. Satellite observations of aerosol optical properties offer important constraints on atmospheric aerosols but their sensitivity to prior assumptions must be better characterized before they are used effectively to reduce uncertainty in aerosol radiative forcing. We assess the sensitivity of the Oxford-RAL Aerosol and Cloud (ORAC) optimal estimation retrieval of aerosol optical depth (AOD) from the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) to a priori aerosol data. SEVIRI is a geostationary satellite instrument centred over Africa and the neighbouring Atlantic Ocean, routinely sampling desert dust and biomass burning outflow from Africa. We quantify the uncertainty in SEVIRI AOD retrievals in the presence of desert dust by comparing retrievals that use prior information from the Optical Properties of Aerosol and Cloud (OPAC) database, with those that use measured aerosol properties during the Dust Outflow and Deposition to the Ocean (DODO) aircraft campaign (August, 2006). We also assess the sensitivity of retrieved AODs to changes in solar zenith angle, and the vertical profile of aerosol effective radius and extinction coefficient input into the retrieval forward model. Currently the ORAC retrieval scheme retrieves AODs for five aerosol types (desert dust, biomass burning, maritime, urban and continental) and chooses the most appropriate AOD based on the cost functions. We generate an improved prior aerosol speciation database for SEVIRI based on a statistical analysis of a Saharan Dust Index (SDI) determined using variances of different brightness temperatures, and organic and black carbon tracers from the GEOS-Chem chemistry transport model. This database is described as a function of season and time of day. We quantify the difference in AODs between those chosen based on prior information from the SDI and GEOS

  3. Assessing the benefit of 3D a priori models for earthquake location

    NASA Astrophysics Data System (ADS)

    Tilmann, F. J.; Manzanares, A.; Peters, K.; Kahle, R. L.; Lange, D.; Saul, J.; Nooshiri, N.

    2014-12-01

    Earthquake location in 1D Earth models is a routine procedure. Particularly in environments such as subduction zones, where the network geometry is biased and lateral velocity variations are large, the use of a 1D model can lead to strongly biased solutions. This is well known, and it is therefore usually preferred to use three-dimensional models, e.g. from local earthquake tomography. Efficient codes for earthquake location in 3D models, for example NonLinLoc, are available for routine use. However, tomographic studies are time-consuming to carry out, and a sufficient number of data might not always be available. In many cases, though, information about the three-dimensional velocity structure is available in the form of refraction surveys or other constraints such as gravity-based or receiver-function-based models. Failing that, global- or regional-scale crustal models could be employed. It is not obvious, however, that models derived using different types of data lead to better location results than an optimised 1D velocity model. On the other hand, correct interpretation of seismicity patterns often requires comparison and exact positioning within pre-existing velocity models. In this presentation we draw on examples from the Chilean and Sumatran margins as well as mid-ocean ridge environments, using both data and synthetic examples to investigate under what conditions the use of a priori 3D models is expected to result in improved locations and modified interpretations. Furthermore, we introduce MATLAB tools that facilitate the creation of three-dimensional models suitable for earthquake location from refraction profiles, CRUST1 and SLAB1.0 and other model types.

  4. Inversion of EM38 data measured in different heights using a-priori information for stabilization

    NASA Astrophysics Data System (ADS)

    Wunderlich, Tina; Rabbel, Wolfgang

    2010-05-01

    Within the frame of the iSOIL project, apparent conductivity measurements using the EM38DD (Geonics) have been conducted on different soil types. The EM38DD is mounted at different heights on a metal-free sled and pulled behind a tractor with an inline sampling distance of 20 cm and a profile offset of 1 m. The apparent conductivities have been inverted into real conductivities over the whole measured area. In order to improve the equation system and to avoid singular matrices, four measurements (vertical and horizontal mode at two different heights) at one location are used to determine the conductivities of two layers and the depth of the interface between the layers. The inversion is stabilized by weighted a priori information for both conductivities and depth, and by the inclusion of neighboring points. Depth information is gained from GPR measurements over the same area, acquired in one survey together with the EM38DD measurements. The inversion results are compared to results of 1D and 2D electrical resistivity imaging using optimized and Schlumberger configurations. Principal Component Analysis is used to compare modeled and measured data, and correlation coefficients between them are calculated to evaluate the reliability of the inversion. Acknowledgement: iSOIL - Interactions between soil related sciences - Linking geophysics, soil science and digital soil mapping is a Collaborative Project (Grant Agreement number 211386) co-funded by the Research DG of the European Commission within the RTD activities of the FP7 Thematic Priority Environment.
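
    As a hedged illustration of the stabilization idea described above, the sketch below solves a linearized two-layer problem by Tikhonov-style least squares, pulling the solution toward a priori values such as a GPR-derived interface depth. The forward matrix, readings, and weights are invented for illustration and are not the EM38DD sensitivity functions.

      import numpy as np

      # Minimal sketch of an a-priori-stabilized inversion step, assuming a
      # linearized forward operator G relating the model m = [sigma1, sigma2, z]
      # to four apparent-conductivity readings d. All numbers are illustrative.
      def stabilized_inversion(G, d, m_prior, W_prior):
          """Solve min ||G m - d||^2 + ||W_prior (m - m_prior)||^2 in closed form."""
          A = G.T @ G + W_prior.T @ W_prior
          b = G.T @ d + W_prior.T @ W_prior @ m_prior
          return np.linalg.solve(A, b)

      G = np.array([[1.0, 0.2, 0.05],   # vertical dipole, height 1
                    [0.8, 0.4, 0.10],   # vertical dipole, height 2
                    [0.6, 0.5, 0.20],   # horizontal dipole, height 1
                    [0.4, 0.7, 0.30]])  # horizontal dipole, height 2
      d = np.array([22.0, 25.0, 30.0, 33.0])  # apparent conductivities (mS/m)
      m_prior = np.array([20.0, 40.0, 1.5])   # prior sigma1, sigma2, depth (GPR)
      W_prior = np.diag([0.1, 0.1, 2.0])      # strong weight on the GPR depth
      print(stabilized_inversion(G, d, m_prior, W_prior))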

  5. LLNL's 3-D A Priori Model Constraints and Uncertainties for Improving Seismic Location

    SciTech Connect

    Flanagan, M P; Myers, S C; Schultz, C A; Pasyanos, M E; Bhattacharyya, J

    2000-07-14

    Accurate seismic event location is key to monitoring the Comprehensive Nuclear-Test-Ban Treaty (CTBT) and is largely dependent on our understanding of the crust and mantle velocity structure. This is particularly challenging in aseismic regions, devoid of calibration data, which leads us to rely on a priori constraints on the velocities. We investigate our ability to improve seismic event location in the Middle East, North Africa, and the Former Soviet Union (ME/NA/FSU) by using a priori three-dimensional (3-D) velocity models in lieu of more commonly used one-dimensional (1-D) models. Event locations based on 1-D models are often biased, as they do not account for significant travel-time variations that result from heterogeneous crust and mantle; it follows that 3-D velocity models have the potential to reduce this bias. Here, we develop a composite 3-D model for the ME/NA/FSU regions. This fully 3-D model is an amalgamation of studies ranging from seismic reflection to geophysical analogy. Our a priori model specifies geographic boundaries and velocity structures based on geology, tectonics, and seismicity and information taken from published literature, namely a global sediment thickness map of 1° resolution (Laske and Masters, 1997), a regionalized crustal model based on geology and tectonics (Sweeney and Walter, 1998; Bhattacharyya et al., 2000; Walter et al., 2000), and regionalized upper mantle (RUM) models developed from teleseismic travel times (Gudmundsson and Sambridge, 1998). The components of this model were chosen for the complementary structures they provide. The 1° sediment map and regionalized crustal model provide detailed structures and boundaries not available in the coarser 5° models used for global-scale studies. The RUM models offer improved resolution over global tomography, most notably above depths of 300 km where heterogeneity is greatest; however, we plan to test other published upper mantle models of both P- and S

  6. The origin of anomalous transport in porous media - is it possible to make a priori predictions?

    NASA Astrophysics Data System (ADS)

    Bijeljic, Branko; Blunt, Martin

    2013-04-01

    at approximately the average flow speed; in the carbonate with the widest velocity distribution the stagnant concentration peak is persistent, while the emergence of a smaller secondary mobile peak is observed, leading to a highly anomalous behavior. This defines the different generic nature of non-Fickian transport in the three media and quantifies the effect of pore structure on transport. Moreover, the propagators obtained by the model are in very good agreement with the propagators measured on beadpack, Bentheimer sandstone and Portland carbonate cores in nuclear magnetic resonance experiments. These findings demonstrate that it is possible to make a priori predictions of anomalous transport in porous media. The importance of these findings for transport in complex carbonate rock micro-CT images is discussed, classifying them in terms of the degree of anomalous transport, which can have an impact at the field scale. Extensions to reactive transport will be discussed.

  7. A priori error estimates for an hp-version of the discontinuous Galerkin method for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Bey, Kim S.; Oden, J. Tinsley

    1993-01-01

    A priori error estimates are derived for hp-versions of the finite element method for discontinuous Galerkin approximations of a model class of linear, scalar, first-order hyperbolic conservation laws. These estimates are derived in a mesh-dependent norm in which the coefficients depend upon both the local mesh size h_K and a number p_K which can be identified with the spectral order of the local approximations over each element.
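
    For orientation, hp a priori bounds of this kind typically balance the local mesh size against the local polynomial order. The display below is a generic, textbook-style form under a smoothness assumption u in H^s, not the paper's exact theorem; C denotes a constant independent of h_K and p_K.

      \[
        \| u - u_{hp} \|_{h}
        \;\le\; C \sum_{K} \frac{h_K^{\min(p_K + 1,\, s) - 1/2}}{p_K^{\, s - 1/2}}\,
        \| u \|_{H^s(K)}
      \]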

  8. LandScape: a simple method to aggregate p-values and other stochastic variables without a priori grouping.

    PubMed

    Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob

    2016-08-01

    In many areas of science it is customary to perform many, potentially millions, of tests simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or by sliding windows. However, it is not straightforward to choose grouping criteria, and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures relying on evaluation of test statistics individually. Moreover, by being agnostic and not relying on predefined selected regions, it might be a practical alternative to conventionally used methods of aggregation of p-values over regions. The method is implemented in Python and freely available online (through GitHub, see the Supplementary information). PMID:27269897
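
    A minimal sketch of the aggregation idea (not the authors' exact LandScape algorithm) follows: p-values are turned into scores that are positive when p falls below a reference level p0, and each maximal positive excursion of the cumulative score is collapsed into a single aggregated value, with no predefined windows. Significance of the aggregated values would then be assessed by resampling.

      import numpy as np

      # Aggregate a p-value sequence into one score per positive excursion.
      # The reference level p0 is an illustrative choice.
      def aggregate(pvals, p0=0.05):
          scores = np.log(p0 / np.asarray(pvals))  # positive when p < p0
          peaks, running, peak = [], 0.0, 0.0
          for s in scores:
              running = max(0.0, running + s)      # cumulative excursion
              peak = max(peak, running)
              if running == 0.0 and peak > 0.0:    # excursion ended: keep its peak
                  peaks.append(peak)
                  peak = 0.0
          if peak > 0.0:
              peaks.append(peak)
          return peaks

      print(aggregate([0.5, 0.01, 0.003, 0.2, 0.9, 0.04, 0.02, 0.6]))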

  9. Using models to guide field experiments: a priori predictions for the CO2 response of a nutrient- and water-limited native Eucalypt woodland.

    PubMed

    Medlyn, Belinda E; De Kauwe, Martin G; Zaehle, Sönke; Walker, Anthony P; Duursma, Remko A; Luus, Kristina; Mishurov, Mikhail; Pak, Bernard; Smith, Benjamin; Wang, Ying-Ping; Yang, Xiaojuan; Crous, Kristine Y; Drake, John E; Gimeno, Teresa E; Macdonald, Catriona A; Norby, Richard J; Power, Sally A; Tjoelker, Mark G; Ellsworth, David S

    2016-08-01

    The response of terrestrial ecosystems to rising atmospheric CO2 concentration (Ca), particularly under nutrient-limited conditions, is a major uncertainty in Earth System models. The Eucalyptus Free-Air CO2 Enrichment (EucFACE) experiment, recently established in a nutrient- and water-limited woodland, presents a unique opportunity to address this uncertainty, but can best do so if key model uncertainties have been identified in advance. We applied seven vegetation models, which have previously been comprehensively assessed against earlier forest FACE experiments, to simulate a priori possible outcomes from EucFACE. Our goals were to provide quantitative projections against which to evaluate data as they are collected, and to identify key measurements that should be made in the experiment to allow discrimination among alternative model assumptions in a postexperiment model intercomparison. Simulated responses of annual net primary productivity (NPP) to elevated Ca ranged from 0.5 to 25% across models. The simulated reduction of NPP during a low-rainfall year also varied widely, from 24 to 70%. Key processes where assumptions caused disagreement among models included nutrient limitations to growth; feedbacks to nutrient uptake; autotrophic respiration; and the impact of low soil moisture availability on plant processes. Knowledge of the causes of variation among models is now guiding data collection in the experiment, with the expectation that the experimental data can optimally inform future model improvements. PMID:26946185

  10. Testing the hypothesis of neurodegeneracy in respiratory network function with a priori transected arterially perfused brain stem preparation of rat.

    PubMed

    Jones, Sarah E; Dutschmann, Mathias

    2016-05-01

    Degeneracy of respiratory network function would imply that anatomically discrete aspects of the brain stem are capable of producing respiratory rhythm. To test this theory we a priori transected brain stem preparations before reperfusion and reoxygenation at 4 rostrocaudal levels: 1.5 mm caudal to obex (n = 5), at obex (n = 5), and 1.5 (n = 7) and 3 mm (n = 6) rostral to obex. The respiratory activity of these preparations was assessed via recordings of phrenic and vagal nerves and lumbar spinal expiratory motor output. Preparations with a priori transection at level of the caudal brain stem did not produce stable rhythmic respiratory bursting, even when the arterial chemoreceptors were stimulated with sodium cyanide (NaCN). Reperfusion of brain stems that preserved the pre-Bötzinger complex (pre-BötC) showed spontaneous and sustained rhythmic respiratory bursting at low phrenic nerve activity (PNA) amplitude that occurred simultaneously in all respiratory motor outputs. We refer to this rhythm as the pre-BötC burstlet-type rhythm. Conserving circuitry up to the pontomedullary junction consistently produced robust high-amplitude PNA at lower burst rates, whereas sequential motor patterning across the respiratory motor outputs remained absent. Some of the rostrally transected preparations expressed both burstlet-type and regular PNA amplitude rhythms. Further analysis showed that the burstlet-type rhythm and high-amplitude PNA had 1:2 quantal relation, with burstlets appearing to trigger high-amplitude bursts. We conclude that no degenerate rhythmogenic circuits are located in the caudal medulla oblongata and confirm the pre-BötC as the primary rhythmogenic kernel. The absence of sequential motor patterning in a priori transected preparations suggests that pontine circuits govern respiratory pattern formation. PMID:26888109

  11. PICARA, an Analytical Pipeline Providing Probabilistic Inference about A Priori Candidates Genes Underlying Genome-Wide Association QTL in Plants

    PubMed Central

    Chen, Charles; DeClerck, Genevieve; Tian, Feng; Spooner, William; McCouch, Susan; Buckler, Edward

    2012-01-01

    PICARA is an analytical pipeline designed to systematically summarize observed SNP/trait associations identified by genome wide association studies (GWAS) and to identify candidate genes involved in the regulation of complex trait variation. The pipeline provides probabilistic inference about a priori candidate genes using integrated information derived from genome-wide association signals, gene homology, and curated gene sets embedded in pathway descriptions. In this paper, we demonstrate the performance of PICARA using data for flowering time variation in maize – a key trait for geographical and seasonal adaptation of plants. Among 406 curated flowering time-related genes from Arabidopsis, we identify 61 orthologs in maize that are significantly enriched for GWAS SNP signals, including key regulators such as FT (Flowering Locus T) and GI (GIGANTEA), and genes centered in the Arabidopsis circadian pathway, including TOC1 (Timing of CAB Expression 1) and LHY (Late Elongated Hypocotyl). In addition, we discover a regulatory feature that is characteristic of these a priori flowering time candidates in maize. This new probabilistic analytical pipeline helps researchers infer the functional significance of candidate genes associated with complex traits and helps guide future experiments by providing statistical support for gene candidates based on the integration of heterogeneous biological information. PMID:23144785
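
    The enrichment component can be illustrated with a simple hypergeometric test; the sketch below uses invented counts, not the maize data, and stands in for whatever test statistic PICARA actually employs.

      from scipy.stats import hypergeom

      # Of N genes tested, K are a priori candidates; n genes carry strong GWAS
      # signals, k of them candidates. The survival function gives the
      # probability of observing k or more candidates by chance.
      N, K, n, k = 30000, 400, 1000, 40          # illustrative counts only
      p = hypergeom.sf(k - 1, N, K, n)
      print(f"P(>= {k} candidates among hits) = {p:.2e}")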

  12. A Priori Analysis of a Compressible Flamelet Model using RANS Data for a Dual-Mode Scramjet Combustor

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Drozda, Tomasz G.; McDaniel, James C.; Lacaze, Guilhem; Oefelein, Joseph

    2015-01-01

    In an effort to make large eddy simulation of hydrocarbon-fueled scramjet combustors more computationally accessible using realistic chemical reaction mechanisms, a compressible flamelet/progress variable (FPV) model was proposed that extends current FPV model formulations to high-speed, compressible flows. Development of this model relied on observations garnered from an a priori analysis of the Reynolds-Averaged Navier-Stokes (RANS) data obtained for the Hypersonic International Flight Research and Experimentation (HIFiRE) dual-mode scramjet combustor. The RANS data were obtained using a reduced chemical mechanism for the combustion of a JP-7 surrogate and were validated using available experimental data. These RANS data were then post-processed to obtain, in an a priori fashion, the scalar fields corresponding to an FPV-based modeling approach. In the current work, in addition to the proposed compressible flamelet model, a standard incompressible FPV model was also considered. Several candidate progress variables were investigated for their ability to recover static temperature and major and minor product species. The effects of pressure and temperature on the tabulated progress variable source term were characterized, and model coupling terms embedded in the Reynolds-averaged Navier-Stokes equations were studied. Finally, results for the novel compressible flamelet/progress variable model were presented to demonstrate the improvement attained by modeling the effects of pressure and flamelet boundary conditions on the combustion.

  13. Use of a priori spectral information in the measurement of x-ray flux with filtered diode arrays

    NASA Astrophysics Data System (ADS)

    Marrs, R. E.; Widmann, K.; Brown, G. V.; Heeter, R. F.; MacLaren, S. A.; May, M. J.; Moore, A. S.; Schneider, M. B.

    2015-10-01

    Filtered x-ray diode (XRD) arrays are often used to measure x-ray spectra vs. time from spectrally continuous x-ray sources such as hohlraums. A priori models of the incident x-ray spectrum enable a more accurate unfolding of the x-ray flux as compared to the standard technique of modifying a thermal Planckian with spectral peaks or dips at the response energy of each filtered XRD channel. A model x-ray spectrum consisting of a thermal Planckian, a Gaussian at higher energy, and (in some cases) a high energy background provides an excellent fit to XRD-array measurements of x-ray emission from laser heated hohlraums. If high-resolution measurements of part of the x-ray emission spectrum are available, that information can be included in the a priori model. In cases where the x-ray emission spectrum is not Planckian, candidate x-ray spectra can be allowed or excluded by fitting them to measured XRD voltages. Examples are presented from the filtered XRD arrays, named Dante, at the National Ignition Facility and the Laboratory for Laser Energetics.
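
    A hedged sketch of the unfold follows: a model spectrum (Planckian plus Gaussian) is fit by least squares to the channel signals produced by known filter responses. The responses, energy grid, and parameter values are mock stand-ins, not Dante calibrations.

      import numpy as np
      from scipy.optimize import least_squares

      E = np.linspace(0.05, 5.0, 500)                   # photon energy (keV)
      dE = E[1] - E[0]

      def model_spectrum(params, E):
          A, T, B, E0, w = params
          planck = A * E**3 / np.expm1(E / T)           # thermal Planckian
          gauss = B * np.exp(-0.5 * ((E - E0) / w)**2)  # high-energy feature
          return planck + gauss

      def channel_signals(params, R, E):
          return (R * model_spectrum(params, E)).sum(axis=1) * dE

      centers = np.linspace(0.2, 4.0, 10)               # mock channel responses
      R = np.exp(-0.5 * ((E - centers[:, None]) / 0.15)**2)

      true = [1.0, 0.3, 0.2, 2.5, 0.3]
      V_meas = channel_signals(true, R, E)              # synthetic XRD signals
      fit = least_squares(lambda p: channel_signals(p, R, E) - V_meas,
                          x0=[0.5, 0.2, 0.1, 2.0, 0.5])
      print(fit.x)                                      # recovers the parameters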

  14. A priori comparison of RANS scalar flux models using DNS data of a Mach 5 boundary layer

    NASA Astrophysics Data System (ADS)

    Braman, Kalen; Raman, Venkatramanan

    2009-11-01

    In order to investigate the applicability of Reynolds-averaged scalar flux models (SFM) to scalar dispersion in high speed turbulent flows, a priori comparisons have been performed utilizing the results of direct numerical simulations (DNS) of a Mach 5 boundary layer. At a small patch on the solid surface boundary, a scalar was introduced into the flow at a rate depending upon the local surface temperature. This configuration mimics surface ablation in hypersonic flows. In different simulations, the scalar injection rate was varied, and the scalar was treated as both passive, not affecting the flow field, and active, affecting the flow field due to having different molecular properties than the bulk flow and having an injection velocity. Statistics of the simulated scalar fields have been calculated and compared a priori with terms from SFMs. Comparisons from the passive scalar case show that the scalar flux terms in the standard gradient diffusion model fail to predict even the trend of the DNS values. The generalized gradient diffusion models, while an improvement for the streamwise component of scalar flux, nevertheless fail to predict the wall normal and spanwise fluxes. Additionally, production and dissipation models for the scalar variance equation are evaluated.
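
    As a minimal illustration of this kind of comparison, the sketch below evaluates a gradient-diffusion scalar-flux closure against a mock "exact" flux profile and reports their correlation, the sort of a priori metric used in the study. The profiles and turbulent diffusivity are invented.

      import numpy as np

      # Gradient-diffusion model: modeled flux = -D_t * dC/dy.
      def gradient_diffusion_flux(C_mean, y, D_t):
          return -D_t * np.gradient(C_mean, y)

      y = np.linspace(0.0, 1.0, 200)
      C_mean = np.exp(-5.0 * y)                  # mock mean scalar profile
      flux_dns = 0.02 * y * np.exp(-4.0 * y)     # mock <v'c'> from DNS
      flux_mod = gradient_diffusion_flux(C_mean, y, D_t=0.004)

      r = np.corrcoef(flux_dns, flux_mod)[0, 1]  # a priori agreement metric
      print(f"correlation with DNS flux: {r:.2f}")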

  15. A priori evaluation of two-stage cluster sampling for accuracy assessment of large-area land-cover maps

    USGS Publications Warehouse

    Wickham, J.D.; Stehman, S.V.; Smith, J.H.; Wade, T.G.; Yang, L.

    2004-01-01

    Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, within-cluster correlation may reduce the precision of the accuracy estimates. The detailed population information needed to quantify a priori the effect of within-cluster correlation on precision is typically unavailable. Consequently, a convenient, practical approach to evaluate the likely performance of a two-stage cluster sample is needed. We describe such an a priori evaluation protocol focusing on the spatial distribution of the sample by land-cover class across different cluster sizes and the costs of different sampling options, including options not imposing clustering. This protocol also assesses the two-stage design's adequacy for estimating the precision of accuracy estimates for rare land-cover classes. We illustrate the approach using two large-area, regional accuracy assessments from the National Land-Cover Data (NLCD), and describe how the a priori evaluation was used as a decision-making tool when implementing the NLCD design.
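
    A minimal sketch of the two-stage design follows: clusters are drawn first, then pixels within each cluster, and the per-class sample counts indicate how well rare classes would be represented. Map size, class count, and sample sizes are illustrative.

      import numpy as np

      rng = np.random.default_rng(1)
      land_cover = rng.integers(0, 5, size=(1000, 1000))  # mock classified map

      n_clusters, cluster_size, n_per_cluster = 20, 50, 10
      rows = rng.integers(0, 1000 - cluster_size, n_clusters)  # stage 1: clusters
      cols = rng.integers(0, 1000 - cluster_size, n_clusters)

      sample = []
      for r0, c0 in zip(rows, cols):                     # stage 2: pixels
          rr = rng.integers(r0, r0 + cluster_size, n_per_cluster)
          cc = rng.integers(c0, c0 + cluster_size, n_per_cluster)
          sample.extend(land_cover[rr, cc])

      print(np.bincount(sample, minlength=5))            # per-class counts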

  16. Unequal Knowledge.

    ERIC Educational Resources Information Center

    Tilly, Charles

    2003-01-01

    Discusses how the persistence of knowledge inequalities influences higher education. Explores how the control of and access to knowledge affects human well being (i.e., control over production of knowledge, control over its distribution, and access to knowledge by people whose well being it will or could affect). (EV)

  17. Knowledge Management.

    ERIC Educational Resources Information Center

    1999

    The first of the four papers in this symposium, "Knowledge Management and Knowledge Dissemination" (Wim J. Nijhof), presents two case studies exploring the strategies companies use in sharing and disseminating knowledge and expertise among employees. "A Theory of Knowledge Management" (Richard J. Torraco), develops a conceptual framework for…

  18. Knowledge Management

    NASA Technical Reports Server (NTRS)

    Shariq, Syed Z.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The emergence of rapidly expanding technologies for the distribution and dissemination of information and knowledge has brought into focus the opportunities for developing knowledge-based networks, knowledge dissemination and knowledge management technologies, and their potential applications for enhancing the productivity of knowledge work. The challenging and complex problems of the future can best be addressed by developing knowledge management as a new discipline based on an integrative synthesis of the hard and soft sciences. A knowledge management professional society can provide a framework for catalyzing the development of the proposed synthesis as well as serve as a focal point for coordinating professional activities in the strategic areas of education, research and technology development. Preliminary concepts for the development of the knowledge management discipline and the professional society are explored. Within this context of the knowledge management discipline and the professional society, potential opportunities can be explored for applying information technologies to deliver or transfer information and knowledge more effectively (i.e., resulting from NASA's Mission to Planet Earth) for the development of policy options in critical areas of national and global importance (i.e., policy decisions in economic and environmental areas), particularly for those policy areas where a global collaborative knowledge network is likely to be critical to the acceptance of the policies.

  19. Priori mask guided image reconstruction (p-MGIR) for ultra-low dose cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Park, Justin C.; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Kahler, Darren L.; Liu, Chihray; Lu, Bo

    2015-11-01

    Recently, the compressed sensing (CS) based iterative reconstruction method has received attention because of its ability to reconstruct cone beam computed tomography (CBCT) images with good quality using sparsely sampled or noisy projections, thus enabling dose reduction. However, some challenges remain. In particular, there is always a tradeoff between image resolution and noise/streak artifact reduction based on the amount of regularization weighting that is applied uniformly across the CBCT volume. The purpose of this study is to develop a novel low-dose CBCT reconstruction algorithm framework called priori mask guided image reconstruction (p-MGIR) that allows reconstruction of high-quality low-dose CBCT images while preserving the image resolution. In p-MGIR, the unknown CBCT volume was mathematically modeled as a combination of two regions: (1) where anatomical structures are complex, and (2) where intensities are relatively uniform. The priori mask, which is the key concept of the p-MGIR algorithm, was defined as the matrix that distinguishes between the two separate CBCT regions where the resolution needs to be preserved and where streaks or noise need to be suppressed. We then alternately updated each part of the image by solving two sub-minimization problems iteratively, where one minimization was focused on preserving the edge information of the first part while the other concentrated on the removal of noise/artifacts from the latter part. To evaluate the performance of the p-MGIR algorithm, a numerical head-and-neck phantom, a Catphan 600 physical phantom, and a clinical head-and-neck cancer case were used for analysis. The results were compared with the standard Feldkamp-Davis-Kress algorithm as well as conventional CS-based algorithms. Examination of the p-MGIR algorithm showed that high-quality low-dose CBCT images can be reconstructed without compromising the image resolution. For both the phantom and the patient cases, the p-MGIR is able to achieve a clinically
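
    The region-dependent regularization at the heart of p-MGIR can be sketched in one dimension: a binary priori mask selects a weak smoothness penalty where anatomy is complex (preserving edges) and a strong one where intensities are uniform (suppressing noise). This is a schematic gradient-descent stand-in, not the authors' alternating solver.

      import numpy as np

      # One gradient step of 0.5*||Ax - b||^2 + 0.5*lam(M)*||grad x||^2, with the
      # penalty weight lam chosen per voxel from the priori mask M.
      def masked_step(x, A, b, M, lam_edge=0.01, lam_flat=1.0, step=0.1):
          lam = np.where(M > 0, lam_edge, lam_flat)
          grad_data = A.T @ (A @ x - b)                    # data-fidelity gradient
          lap = np.roll(x, 1) + np.roll(x, -1) - 2.0 * x   # 1-D Laplacian
          return x - step * (grad_data - lam * lap)

      rng = np.random.default_rng(5)
      n = 64
      A = rng.normal(size=(48, n)) / np.sqrt(n)   # under-sampled mock projections
      x_true = np.zeros(n); x_true[20:30] = 1.0   # "complex" region
      b = A @ x_true + rng.normal(scale=0.01, size=48)
      M = np.zeros(n); M[15:35] = 1.0             # priori mask around the structure
      x = np.zeros(n)
      for _ in range(200):
          x = masked_step(x, A, b, M)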

  20. Subfilter-Scale Fluxes over a Surface Roughness Transition. Part II: A priori Study of Large-Eddy Simulation Models

    NASA Astrophysics Data System (ADS)

    Carper, Matthew A.; Porté-Agel, Fernando

    2008-04-01

    The ability of subfilter-scale (SFS) models to reproduce the statistical properties of SFS stresses and energy transfers over heterogeneous surface roughness is key to improving the accuracy of large-eddy simulations of the atmospheric boundary layer. In this study, several SFS models are evaluated a priori using experimental data acquired downwind of a rough-to-smooth transition in a wind tunnel. The SFS models studied include the eddy-viscosity, similarity, non-linear and a mixed model consisting of a combination of the eddy-viscosity and non-linear models. The dynamic eddy-viscosity model is also evaluated. The experimental data consist of vertical and horizontal planes of high-spatial-resolution velocity fields measured using particle image velocimetry. These velocity fields are spatially filtered and used to calculate SFS stresses and SFS transfer rates of resolved kinetic energy. Coefficients for each SFS model are calculated by matching the measured and modelled SFS energy transfer rates. For the eddy-viscosity model, the Smagorinsky coefficient is also evaluated using a dynamic procedure. The model coefficients are found to be scale dependent when the filter scales are larger than the vertical measurement height and fall into the production subrange of the turbulence where the flow scales are anisotropic. Near the surface, the Smagorinsky coefficient is also found to decrease with distance downwind from the transition, in response to the increase in mean shear as the flow adjusts to the smooth surface. In a priori tests, the ability of each model to reproduce statistical properties of the SFS stress is assessed. While the eddy-viscosity model has low spatial correlation with the measured stress, it predicts mean stresses with the same accuracy as the other models. However, the deficiency of the eddy-viscosity model is apparent in the underestimation of the standard deviation of the SFS stresses and the inability to predict transfers of kinetic energy from
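
    The coefficient-matching step can be illustrated compactly: the Smagorinsky constant is chosen so that the mean modelled SFS dissipation equals the measured SFS energy transfer rate. The strain and stress fields below are random stand-ins for filtered PIV data.

      import numpy as np

      rng = np.random.default_rng(0)
      n, delta = 64, 0.01                                     # grid, filter scale
      S = rng.normal(size=(n, n, 3, 3))
      S = 0.5 * (S + S.swapaxes(-1, -2))                      # mock strain rate
      tau = rng.normal(scale=0.1, size=(n, n, 3, 3))          # mock SFS stress

      Smag = np.sqrt(2.0 * np.einsum('...ij,...ij', S, S))    # |S|
      measured = -np.mean(np.einsum('...ij,...ij', tau, S))   # mean SFS transfer
      modelled = 2.0 * delta**2 * np.mean(Smag * np.einsum('...ij,...ij', S, S))
      Cs2 = measured / modelled                               # match the two rates
      print(f"Cs = {np.sqrt(max(Cs2, 0.0)):.3f}")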

  1. Knowledge Alive

    ERIC Educational Resources Information Center

    Perkins, David

    2004-01-01

    Strategies that expose learners to a large volume of knowledge enable creative thinking, self-management and deep reading. The different ways of creating knowledge with the help of creativity, communication, organization, problem solving and decision-making are discussed.

  2. Knowledge Management

    ERIC Educational Resources Information Center

    Deepak

    2005-01-01

    Knowledge Management (KM) is the process through which organizations generate value from their intellectual and knowledge-based assets. Frequently generating value from such assets means sharing them among employees, divisions and even with other companies in order to develop best practices. This article discusses three basic aspects of…

  3. A priori testing of subgrid-scale models for the velocity-pressure and vorticity-velocity formulations

    NASA Technical Reports Server (NTRS)

    Winckelmans, G. S.; Lund, T. S.; Carati, D.; Wray, A. A.

    1996-01-01

    Subgrid-scale models for Large Eddy Simulation (LES) in both the velocity-pressure and the vorticity-velocity formulations were evaluated and compared in a priori tests using spectral Direct Numerical Simulation (DNS) databases of isotropic turbulence: 128^3 DNS of forced turbulence (Re_lambda = 95.8) filtered, using the sharp cutoff filter, to both 32^3 and 16^3 synthetic LES fields; 512^3 DNS of decaying turbulence (Re_Lambda = 63.5) filtered to both 64^3 and 32^3 LES fields. Gaussian and top-hat filters were also used with the 128^3 database. Different LES models were evaluated for each formulation: eddy-viscosity models, hyper eddy-viscosity models, mixed models, and scale-similarity models. Correlations between exact versus modeled subgrid-scale quantities were measured at three levels: tensor (traceless), vector (solenoidal 'force'), and scalar (dissipation) levels, and for both cases of uniform and variable coefficient(s). Different choices for the 1/T scaling appearing in the eddy-viscosity were also evaluated. It was found that the models for the vorticity-velocity formulation produce higher correlations with the filtered DNS data than their counterparts in the velocity-pressure formulation. It was also found that the hyper eddy-viscosity model performs better than the eddy-viscosity model, in both formulations.

  4. Wiener filtering of surface EMG with a priori SNR estimation toward myoelectric control for neurological injury patients.

    PubMed

    Liu, Jie; Ying, Dongwen; Zhou, Ping

    2014-12-01

    Voluntary surface electromyogram (EMG) signals from neurological injury patients are often corrupted by involuntary background interference or spikes, imposing difficulties for myoelectric control. We present a novel framework to suppress involuntary background spikes during voluntary surface EMG recordings. The framework applies a Wiener filter to restore voluntary surface EMG signals based on tracking the a priori signal-to-noise ratio (SNR) using the decision-directed method. Semi-synthetic surface EMG signals contaminated by different levels of involuntary background spikes were constructed from a database of surface EMG recordings in a group of spinal cord injury subjects. After the processing, the onset detection of voluntary muscle activity was significantly improved against involuntary background spikes. The magnitude of voluntary surface EMG signals can also be reliably estimated for myoelectric control purposes. Compared with the previous sample entropy analysis for suppressing involuntary background spikes, the proposed framework is characterized by quick and simple implementation, making it more suitable for application in a myoelectric control system toward neurological injury rehabilitation. PMID:25443536
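
    The decision-directed a priori SNR tracker combined with a Wiener gain is a standard construction; the sketch below applies it frame-by-frame to short-time spectra. The smoothing constant and noise power are illustrative choices, and the input frames are synthetic rather than EMG recordings.

      import numpy as np

      def wiener_decision_directed(frames, noise_psd, alpha=0.98):
          restored = np.empty_like(frames)
          prev_clean = np.zeros(frames.shape[1])
          for t in range(frames.shape[0]):
              gamma = np.abs(frames[t])**2 / noise_psd           # a posteriori SNR
              xi = alpha * prev_clean / noise_psd + \
                   (1.0 - alpha) * np.maximum(gamma - 1.0, 0.0)  # a priori SNR
              gain = xi / (1.0 + xi)                             # Wiener gain
              restored[t] = gain * frames[t]
              prev_clean = np.abs(restored[t])**2                # decision-directed
          return restored

      rng = np.random.default_rng(3)
      frames = rng.normal(size=(100, 64)) + 1j * rng.normal(size=(100, 64))
      clean = wiener_decision_directed(frames, noise_psd=np.full(64, 2.0))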

  5. Tomographic image via background subtraction using an x-ray projection image and a priori computed tomography

    SciTech Connect

    Zhang Jin; Yi Byongyong; Lasio, Giovanni; Suntharalingam, Mohan; Yu, Cedric

    2009-10-15

    Kilovoltage x-ray projection images (kV images for brevity) are increasingly available in image guided radiotherapy (IGRT) for patient positioning. These images are two-dimensional (2D) projections of a three-dimensional (3D) object along the x-ray beam direction. Projecting a 3D object onto a plane may lead to ambiguities in the identification of anatomical structures and to poor contrast in kV images. Therefore, the use of kV images in IGRT is mainly limited to bony landmark alignments. This work proposes a novel subtraction technique that isolates a slice of interest (SOI) from a kV image with the assistance of a priori information from a previous CT scan. The method separates structural information within a preselected SOI by suppressing contributions to the unprocessed projection from out-of-SOI-plane structures. Up to a five-fold increase in the contrast-to-noise ratios (CNRs) was observed in selected regions of the isolated SOI, when compared to the original unprocessed kV image. The tomographic image via background subtraction (TIBS) technique aims to provide a quick snapshot of the slice of interest with greatly enhanced image contrast over conventional kV x-ray projections for fast and accurate image guidance of radiation therapy. With further refinements, TIBS could, in principle, provide real-time tumor localization using gantry-mounted x-ray imaging systems without the need for implanted markers.

  6. Practical Considerations about Expected A Posteriori Estimation in Adaptive Testing: Adaptive A Priori, Adaptive Correction for Bias, and Adaptive Integration Interval.

    ERIC Educational Resources Information Center

    Raiche, Gilles; Blais, Jean-Guy

    In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…

  7. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    PubMed

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy. PMID:22065165

  8. A fast 3D surface reconstruction and volume estimation method for grain storage based on priori model

    NASA Astrophysics Data System (ADS)

    Liang, Xian-hua; Sun, Wei-dong

    2011-06-01

    Inventory checking is one of the most significant tasks for grain reserves and plays a very important role in the macro-control of food supply and food security. A simple, fast and accurate method to obtain internal structure information, and further to estimate the volume of the grain storage, is needed. In our developed system, a specially designed multi-site laser scanning system is used to acquire the range data clouds of the internal structure of the grain storage. However, due to the seriously uneven distribution of the range data, these data are first preprocessed by an adaptive re-sampling method to reduce data redundancy as well as noise. The range data are then segmented and useful features, such as plane and cylinder information, are extracted. With these features a coarse registration between all of the single-site range data is done, and then an Iterative Closest Point (ICP) algorithm is carried out to achieve fine registration. Taking advantage of the fact that the structure of the grain storage is well defined and the types are limited, a fast automatic registration method based on the priori model is proposed to register the multi-site range data more efficiently. After the integration of the multi-site range data, the grain surface is finally reconstructed by a Delaunay-based algorithm and the grain volume is estimated by a numerical integration method. This proposed method has been applied to two common types of grain storage, and experimental results show that it is effective and accurate, and that it avoids the cumulative error that arises when registering overlapping areas pair-wise.
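
    The fine-registration step uses ICP, which alternates nearest-neighbour matching with a closed-form rigid alignment; the sketch below is a brute-force point-to-point version (a KD-tree would replace the pairwise search in practice), run on synthetic points rather than laser scans.

      import numpy as np

      def icp(src, dst, iters=20):
          src = src.copy()
          R_total, t_total = np.eye(3), np.zeros(3)
          for _ in range(iters):
              d2 = ((src[:, None, :] - dst[None, :, :])**2).sum(-1)
              matched = dst[d2.argmin(axis=1)]          # nearest neighbours
              mu_s, mu_d = src.mean(0), matched.mean(0)
              U, _, Vt = np.linalg.svd((src - mu_s).T @ (matched - mu_d))
              R = Vt.T @ U.T                            # Kabsch rotation
              if np.linalg.det(R) < 0:                  # avoid reflections
                  Vt[-1] *= -1
                  R = Vt.T @ U.T
              t = mu_d - R @ mu_s
              src = src @ R.T + t
              R_total, t_total = R @ R_total, R @ t_total + t
          return R_total, t_total

      rng = np.random.default_rng(0)
      dst = rng.uniform(size=(200, 3))
      src = dst + np.array([0.05, -0.02, 0.01])         # coarse misalignment
      print(icp(src, dst)[1])                           # ~ the negated offset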

  9. A nonlinear structural subgrid-scale closure for compressible MHD. II. A priori comparison on turbulence simulation data

    NASA Astrophysics Data System (ADS)

    Grete, Philipp; Vlaykov, Dimitar G.; Schmidt, Wolfram; Schleicher, Dominik R. G.

    2016-06-01

    Even though compressible plasma turbulence is encountered in many astrophysical phenomena, its effect is often not well understood. Furthermore, direct numerical simulations are typically not able to reach the extreme parameters of these processes. For this reason, large-eddy simulations (LES), which only simulate large and intermediate scales directly, are employed. The smallest, unresolved scales and the interactions between small and large scales are introduced by means of a subgrid-scale (SGS) model. We propose and verify a new set of nonlinear SGS closures for future application as an SGS model in LES of compressible magnetohydrodynamics. We use 15 simulations (without explicit SGS model) of forced, isotropic, homogeneous turbulence with varying sonic Mach number Ms = 0.2-20 as reference data for the most extensive a priori tests performed so far in the literature. In these tests, we explicitly filter the reference data and compare the performance of the new closures against the most widely tested closures. These include eddy-viscosity and scale-similarity type closures with different normalizations. Performance indicators are correlations with the turbulent energy and cross-helicity flux, the average SGS dissipation, the topological structure and the ability to reproduce the correct magnitude and the direction of the SGS vectors. We find that only the new nonlinear closures exhibit consistently high correlations (median value > 0.8) with the data over the entire parameter space and outperform the other closures in all tests. Moreover, we show that these results are independent of resolution and chosen filter scale. Additionally, the new closures are effectively coefficient-free with a deviation of less than 20%.

  10. Knowledge representation system for assembly using robots

    NASA Technical Reports Server (NTRS)

    Jain, A.; Donath, M.

    1987-01-01

    Assembly robots combine the benefits of speed and accuracy with the capability of adaptation to changes in the work environment. However, an impediment to the use of robots is the complexity of the man-machine interface. This interface can be improved by providing a means of using a priori knowledge and reasoning capabilities for controlling and monitoring the tasks performed by robots. Robots ought to be able to perform complex assembly tasks with the help of only supervisory guidance from human operators. For such supervisory guidance, it is important to express the commands in terms of the effects desired, rather than in terms of the motion the robot must undertake in order to achieve these effects. A suitable knowledge representation can facilitate the conversion of task level descriptions into explicit instructions to the robot. Such a system would use symbolic relationships describing the a priori information about the robot, its environment, and the tasks specified by the operator to generate the commands for the robot.

  11. Procedural knowledge

    NASA Technical Reports Server (NTRS)

    Georgeff, Michael P.; Lansky, Amy L.

    1986-01-01

    Much of commonsense knowledge about the real world is in the form of procedures or sequences of actions for achieving particular goals. In this paper, a formalism is presented for representing such knowledge using the notion of process. A declarative semantics for the representation is given, which allows a user to state facts about the effects of doing things in the problem domain of interest. An operational semantics is also provided, which shows how this knowledge can be used to achieve particular goals or to form intentions regarding their achievement. Given both semantics, the formalism additionally serves as an executable specification language suitable for constructing complex systems. A system based on this formalism is described, and examples involving control of an autonomous robot and fault diagnosis for NASA's Space Shuttle are provided.

  12. Knowledge-based landmarking of cephalograms.

    PubMed

    Lévy-Mandel, A D; Venetsanopoulos, A N; Tsotsos, J K

    1986-06-01

    Orthodontists have defined a certain number of characteristic points, or landmarks, on X-ray images of the human skull which are used to study growth or as a diagnostic aid. This work presents the first step toward an automatic extraction of these points. They are defined with respect to particular lines which are retrieved first. The original image is preprocessed with a prefiltering operator (median filter) followed by an edge detector (Mero-Vassy operator). A knowledge-based line-following algorithm is subsequently applied, involving a production system with organized sets of rules and a simple interpreter. The a priori knowledge implemented in the algorithm must take into account the fact that the lines represent biological shapes and can vary considerably from one patient to the next. The performance of the algorithm is judged with the help of objective quality criteria. Determination of the exact shapes of the lines allows the computation of the positions of the landmarks. PMID:3519070
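
    The preprocessing chain (median prefilter, then edge detection) is straightforward to sketch; here a Sobel gradient magnitude stands in for the Mero-Vassy operator, and the input is a synthetic image rather than a cephalogram.

      import numpy as np
      from scipy import ndimage

      def preprocess(image, median_size=5):
          smoothed = ndimage.median_filter(image, size=median_size)  # denoise
          gx = ndimage.sobel(smoothed, axis=1)
          gy = ndimage.sobel(smoothed, axis=0)
          return np.hypot(gx, gy)          # edge magnitude for the line follower

      rng = np.random.default_rng(0)
      xray = rng.normal(loc=128.0, scale=20.0, size=(256, 256))  # mock image
      edges = preprocess(xray)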

  13. Working Knowledge.

    ERIC Educational Resources Information Center

    Beckett, David

    The resurgence of "lifelong learning" has renewed consideration of the nature of "working knowledge." Lifelong learning has many aspects, including construction and distribution of individuals' very self-hood, educational institutions' role in capturing informal experiences, and the juggling required between family and work-based responsibilities.…

  14. [An a priori risk analysis study. Securisation of transfusion of blood product in a hospital: from the reception in the medical unit to its administration].

    PubMed

    Bertrand, E; Lévy, R; Boyeldieu, D

    2013-12-01

    Following an ABO accident after transfusion of red blood cells, an a priori risk analysis study was performed in a hospital. The scope of this analysis covers from the reception of the blood product in the medical unit to its administration. The risk analysis makes it possible to identify potentially dangerous situations and to evaluate the risks, in order to propose corrective measures (precautionary or protective) and bring the system back to an acceptable risk level. The innovative concept of an a priori risk analysis in the medical field allows the extension of the analysis of this transfusion risk to other hospitals. In addition, it allows the extension of the use of this approach to other medical fields. PMID:24176607

  15. A gridded version of the US EPA inventory of methane emissions for use as a priori and reference in methane source inversions

    NASA Astrophysics Data System (ADS)

    Maasakkers, J. D.; Jacob, D. J.; Payer Sulprizio, M.; Turner, A. J.; Weitz, M.; Wirth, T. C.; Hight, C.; DeFigueiredo, M.; Desai, M.; Schmeltz, R.; Hockstad, L.; Bloom, A. A.; Bowman, K. W.

    2015-12-01

    The US EPA produces annual estimates of national anthropogenic methane emissions in the Inventory of US Greenhouse Gas Emissions and Sinks (EPA inventory). These are reported to the UN and inform national climate policy. The EPA inventory uses the best available information on emitting processes (IPCC Tier 2/3 approaches). However, inversions of atmospheric observations suggest that the inventory could be too low. These inversions rely on crude bottom-up estimates as a priori because the EPA inventory is only available as national totals for most sources. Reliance on an incorrect a priori greatly limits the value of inversions for testing and improving the EPA inventory, as the allocation of methane emissions by source types and regions can vary greatly between different bottom-up inventories. Here we present a 0.1° × 0.1° monthly version of the EPA inventory to serve as a priori for inversions of atmospheric data and to interpret inversion results. We use a wide range of process-specific information to allocate emissions, incorporating facility-level data reported through the EPA Greenhouse Gas Reporting Program where possible. As an illustration of the gridding strategies used, gridded livestock emissions are based on EPA emission data per state, USDA livestock inventories per county, and USDA weighted land cover maps for sub-county localization. Allocation of emissions from natural gas systems incorporates monthly well-level production data, EIA compressor station and processing plant databases, and information on pipelines. Our gridded EPA inventory shows large differences in spatial emission patterns compared to the EDGAR v4.2 global inventory used as a priori in previous inverse studies. Our work greatly enhances the potential of future inversions to test and improve the EPA inventory and more broadly to improve understanding of the factors controlling methane concentrations and their trends. Preliminary inversion results using GOSAT satellite data will be presented.
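
    The core gridding operation is proportional allocation of a reported total across spatial units by an activity weight; a minimal sketch with invented numbers:

      import numpy as np

      # Spread a state total over counties in proportion to an activity proxy
      # (e.g., livestock counts); grid cells within a county would be filled
      # the same way using land-cover weights. Values are illustrative.
      def allocate(total_emission, weights):
          w = np.asarray(weights, dtype=float)
          return total_emission * w / w.sum()

      state_ch4 = 120.0                                # Gg CH4/yr, one source
      cattle_per_county = [5_000, 22_000, 1_500, 60_000]
      print(allocate(state_ch4, cattle_per_county))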

  16. Seismicity patterns along the Ecuadorian subduction zone: new constraints from earthquake location in a 3-D a priori velocity model

    NASA Astrophysics Data System (ADS)

    Font, Yvonne; Segovia, Monica; Vaca, Sandro; Theunissen, Thomas

    2013-04-01

    To improve earthquake location, we create a 3-D a priori P-wave velocity model (3-DVM) that approximates the large velocity variations of the Ecuadorian subduction system. The 3-DVM is constructed from the integration of geophysical and geological data that depend on the structural geometry and velocity properties of the crust and the upper mantle. In addition, specific station selection is carried out to compensate for the high station density on the Andean Chain. 3-D synthetic experiments are then designed to evaluate the network capacity to recover the event position using only P arrivals and the MAXI technique. Three synthetic earthquake location experiments are proposed: (1) noise-free and (2) noisy arrivals used in the 3-DVM, and (3) noise-free arrivals used in a 1-DVM. Synthetic results indicate that, under the best conditions (exact arrival data set and 3-DVM), the spatiotemporal configuration of the Ecuadorian network can accurately locate 70 per cent of events in the frontal part of the subduction zone (average azimuthal gap is 289° ± 44°). Noisy P arrivals (up to ±0.3 s) still allow accurate location of 50 per cent of earthquakes. Processing earthquake location within a 1-DVM almost never yields an accurate hypocentre position for offshore earthquakes (15 per cent), which highlights the importance of using a 3-DVM in subduction zones. For the application to real data, the seismicity distribution from the 3-D-MAXI catalogue is also compared to the determinations obtained in a 1-D-layered VM. In addition to good-quality location uncertainties, the clustering and the depth distribution confirm the 3-D-MAXI catalogue reliability. The pattern of the seismicity distribution (a 13 yr record during the inter-seismic period of the seismic cycle) is compared to the pattern of rupture zone and asperity of the Mw = 7.9 1942 and the Mw = 7.7 1958 events (the Mw = 8.8 1906 asperity patch is not defined). We observe that the nucleation of the 1942, 1958 and 1906 events coincides with
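
    The azimuthal gap quoted above is a simple network-geometry metric: the largest angular interval, seen from the epicentre, that contains no station. A sketch with invented coordinates (all stations onshore to one side, as for offshore events):

      import numpy as np

      def azimuthal_gap(event, stations):
          az = np.degrees(np.arctan2(stations[:, 0] - event[0],
                                     stations[:, 1] - event[1])) % 360.0
          az = np.sort(az)
          gaps = np.diff(np.append(az, az[0] + 360.0))   # include wrap-around
          return gaps.max()

      event = np.array([0.0, 0.0])                       # offshore epicentre
      stations = np.array([[50.0, 10.0], [60.0, -30.0],
                           [40.0, 45.0], [70.0, 5.0]])   # one-sided network
      print(f"azimuthal gap: {azimuthal_gap(event, stations):.0f} deg")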

  17. Knowledge integration at the center of genomic medicine.

    PubMed

    Khoury, Muin J; Gwinn, Marta; Dotson, W David; Schully, Sheri D

    2012-07-01

    Three articles in this issue of Genetics in Medicine describe examples of "knowledge integration," involving methods for generating and synthesizing rapidly emerging information on health-related genomic technologies and engaging stakeholders around the evidence. Knowledge integration, the central process in translating genomic research, involves three closely related, iterative components: knowledge management, knowledge synthesis, and knowledge translation. Knowledge management is the ongoing process of obtaining, organizing, and displaying evolving evidence. For example, horizon scanning and "infoveillance" use emerging technologies to scan databases, registries, publications, and cyberspace for information on genomic applications. Knowledge synthesis is the process of conducting systematic reviews using a priori rules of evidence. For example, methods including meta-analysis, decision analysis, and modeling can be used to combine information from basic, clinical, and population research. Knowledge translation refers to stakeholder engagement and brokering to influence policy, guidelines and recommendations, as well as the research agenda to close knowledge gaps. The ultrarapid production of information requires adequate public and private resources for knowledge integration to support the evidence-based development of genomic medicine. PMID:22555656

  18. Prediction of extinction and reignition in nonpremixed turbulent flames using a flamelet/progress variable model. 1. A priori study and presumed PDF closure

    SciTech Connect

    Ihme, Matthias; Pitsch, Heinz

    2008-10-15

    Previously conducted studies of the flamelet/progress variable model for the prediction of nonpremixed turbulent combustion processes identified two areas for model improvements: the modeling of the presumed probability density function (PDF) for the reaction progress parameter and the consideration of unsteady effects [Ihme et al., Proc. Combust. Inst. 30 (2005) 793]. These effects are of particular importance during local flame extinction and subsequent reignition. Here, the models for the presumed PDFs for conserved and reactive scalars are re-examined and a statistically most likely distribution (SMLD) is employed and tested in a priori studies using direct numerical simulation (DNS) data and experimental results from the Sandia flame series. In the first part of the paper, the SMLD model is employed for a reactive scalar distribution. Modeling aspects of the a priori PDF, accounting for the bias in composition space, are discussed. The convergence of the SMLD with increasing number of enforced moments is demonstrated. It is concluded that information about more than two moments is beneficial to accurately represent the reactive scalar distribution in turbulent flames with strong extinction and reignition. In addition to the reactive scalar analysis, the potential of the SMLD for the representation of conserved scalar distributions is also analyzed. In the a priori study using DNS data it is found that the conventionally employed beta distribution provides a better representation for the scalar distribution. This is attributed to the fact that the beta-PDF implicitly enforces higher moment information that is in excellent agreement with the DNS data. However, the SMLD outperforms the beta distribution in free shear flow applications, which are typically characterized by strongly skewed scalar distributions, in the case where higher moment information can be enforced. (author)
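
    The conventional beta-PDF closure referred to above fixes the two beta parameters from the filtered mean and variance of the conserved scalar; a minimal sketch with illustrative moments and the standard moment-matching formulas:

      import numpy as np
      from scipy import stats

      # Presumed beta PDF for a scalar on [0, 1]; valid when var < mean*(1-mean).
      def beta_pdf_from_moments(mean, var):
          k = mean * (1.0 - mean) / var - 1.0
          return stats.beta(mean * k, (1.0 - mean) * k)

      pdf = beta_pdf_from_moments(mean=0.3, var=0.02)
      z = np.linspace(0.001, 0.999, 5)
      print(pdf.pdf(z))   # presumed mixture-fraction PDF at sample points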

  19. A priori prediction of tumor payload concentrations: preclinical case study with an auristatin-based anti-5T4 antibody-drug conjugate.

    PubMed

    Shah, Dhaval K; King, Lindsay E; Han, Xiaogang; Wentland, Jo-Ann; Zhang, Yanhua; Lucas, Judy; Haddish-Berhane, Nahor; Betts, Alison; Leal, Mauricio

    2014-05-01

    The objectives of this investigation were as follows: (a) to validate a mechanism-based pharmacokinetic (PK) model of ADC for its ability to a priori predict tumor concentrations of ADC and released payload, using the anti-5T4 ADC A1mcMMAF, and (b) to analyze the PK model to find the main pathways and parameters the model outputs are most sensitive to. Experimental data containing biomeasures, and plasma and tumor concentrations of ADC and payload, following A1mcMMAF administration in two different xenografts, were used to build and validate the model. The model performed reasonably well in terms of a priori predicting tumor exposure of total antibody, ADC, and released payload, and the exposure of released payload in plasma. Model predictions were within twofold of the observed exposures. Pathway analysis and local sensitivity analysis were conducted to investigate the main pathways and the set of parameters the model outputs are most sensitive to. It was discovered that payload dissociation from ADC and tumor size were important determinants of plasma and tumor payload exposure. It was also found that the sensitivity of the model output to certain parameters is dose-dependent, suggesting caution before generalizing the results from the sensitivity analysis. Model analysis also revealed the importance of understanding and quantifying the processes responsible for ADC and payload disposition within the tumor cell, as tumor concentrations were sensitive to these parameters. The proposed ADC PK model provides a useful tool for a priori predicting tumor payload concentrations of novel ADCs preclinically, and possibly translating them to the clinic. PMID:24578215
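
    The local sensitivity analysis mentioned above can be sketched with one-at-a-time finite differences yielding normalized coefficients (p/y)(dy/dp); the model and parameter names below are invented stand-ins for the mechanism-based ADC PK model.

      import numpy as np

      def local_sensitivities(model, params, rel_step=0.01):
          y0 = model(params)
          S = {}
          for name, p in params.items():
              perturbed = dict(params, **{name: p * (1.0 + rel_step)})
              S[name] = (model(perturbed) - y0) / y0 / rel_step  # normalized
          return S

      def model(p):   # mock tumor payload exposure vs. two parameters
          return p["k_dissociation"] * 10.0 / (1.0 + p["tumor_size"])

      print(local_sensitivities(model,
                                {"k_dissociation": 0.1, "tumor_size": 2.0}))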

  20. Developing framework to constrain the geometry of the seismic rupture plane on subduction interfaces a priori - A probabilistic approach

    USGS Publications Warehouse

    Hayes, G.P.; Wald, D.J.

    2009-01-01

    A key step in many earthquake source inversions requires knowledge of the geometry of the fault surface on which the earthquake occurred. Our knowledge of this surface is often uncertain, however, and as a result fault geometry misinterpretation can map into significant error in the final temporal and spatial slip patterns of these inversions. Relying solely on an initial hypocentre and CMT mechanism can be problematic when establishing rupture characteristics needed for rapid tsunami and ground shaking estimates. Here, we attempt to improve the quality of fast finite-fault inversion results by combining several independent and complementary data sets to more accurately constrain the geometry of the seismic rupture plane of subducting slabs. Unlike previous analyses aimed at defining the general form of the plate interface, we require mechanisms and locations of the seismicity considered in our inversions to be consistent with their occurrence on the plate interface, by limiting events to those with well-constrained depths and with CMT solutions indicative of shallow-dip thrust faulting. We construct probability density functions about each location based on formal assumptions of their depth uncertainty and use these constraints to solve for the ‘most-likely’ fault plane. Examples are shown for the trench in the source region of the Mw 8.6 Southern Sumatra earthquake of March 2005, and for the Northern Chile Trench in the source region of the November 2007 Antofagasta earthquake. We also show examples using only the historic catalogues in regions without recent great earthquakes, such as the Japan and Kamchatka Trenches. In most cases, this method produces a fault plane that is more consistent with all of the data available than is the plane implied by the initial hypocentre and CMT mechanism. Using the aggregated data sets, we have developed an algorithm to rapidly determine more accurate initial fault plane geometries for source inversions of future
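
    The 'most-likely' plane estimate can be illustrated as a depth-uncertainty-weighted least-squares fit of z = a*x + b*y + c to hypocentres, each weighted by the width of its assumed Gaussian depth PDF; the catalogue below is synthetic.

      import numpy as np

      def weighted_plane_fit(xyz, depth_sigma):
          w = 1.0 / np.asarray(depth_sigma)       # tighter depths weigh more
          A = np.column_stack([xyz[:, 0], xyz[:, 1], np.ones(len(xyz))])
          coef, *_ = np.linalg.lstsq(A * w[:, None], xyz[:, 2] * w, rcond=None)
          return coef                              # (a, b, c) of the interface

      rng = np.random.default_rng(2)
      x, y = rng.uniform(0, 100, 50), rng.uniform(0, 100, 50)
      z = 0.3 * x + 0.05 * y + 10.0 + rng.normal(0, 2, 50)  # mock slab events
      print(weighted_plane_fit(np.column_stack([x, y, z]),
                               depth_sigma=rng.uniform(1, 5, 50)))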

  1. Nursing knowledge development: where to from here?

    PubMed

    Geanellos, R

    1997-01-01

    Issues related to nursing epistemology are reviewed. This review includes discussion of logical positivism, empiricism and interpretive-emancipatory paradigms, their influence on the construction of knowledge and on its methods of derivation and verification. Changes in the conceptualisation of science are explored, and scientific realism is introduced as a contemporary philosophy of science through which the discipline of nursing can develop. Questions surrounding the development of nursing knowledge are examined; for example, the implications of theory construction through the use of borrowed theory and the acceptance of external philosophies of science. Argument is offered for and against borrowing external theories and philosophies, or developing theories and philosophies from research into nursing practice. The relationship between research method and the phenomenon under study is discussed. The need to develop a broad base of nursing knowledge through diverse research methods is addressed. Links are created between the development of non-practice-based theories, the derivation of knowledge a priori, and the poor use of nursing theory and research in nursing practice. It is suggested that nursing science should develop through a dialectic between nursing research and practice, and that such a dialectic could assist the forward movement of nursing through the evolution of meaningful nursing theories and philosophies of nursing science. PMID:9272005

  2. An algorithm of geophysical data inversion based on non-probabilistic presentation of a priori information and definition of Pareto-optimality

    NASA Astrophysics Data System (ADS)

    Kozlovskaya, Elena

    2000-06-01

    This paper presents an inversion algorithm that can be used to solve a wide range of geophysical nonlinear inverse problems. The algorithm is based upon the principle of a direct search for the optimal solution in the parameter space. The main difference of the algorithm from existing techniques such as genetic algorithms and simulated annealing is that the optimum search is performed under the control of a priori information formulated as a fuzzy set in the parameter space. In such a formulation the inverse problem becomes a multiobjective optimization problem with two objective functions: one is the membership function of the fuzzy set of feasible solutions, and the other is the conditional probability density function of the observed data. The solution to such a problem is a set of Pareto-optimal solutions that is constructed in the parameter space by a three-stage search procedure. The advantage of the proposed technique is that it provides the possibility of incorporating a wide range of non-probabilistic a priori information into the inversion procedure and can be applied to the solution of strongly nonlinear problems. It allows one to decrease the number of forward-problem calculations due to selective sampling of trial points from the parameter space. The properties of the algorithm are illustrated with an application to a local earthquake hypocentre location problem with synthetic and real data.
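
    A small sketch of the Pareto-optimality criterion at the heart of such an algorithm: each trial model has a fuzzy-membership value (to be maximized) and a data misfit (to be minimized), and a model is kept only if no other model is at least as good in both objectives and strictly better in one. The random sampling below is purely illustrative; the paper uses a guided three-stage search:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu = rng.random(200)      # membership in the fuzzy set of feasible models
    misfit = rng.random(200)  # conditional data misfit

    def pareto_mask(mu, misfit):
        keep = np.ones(mu.size, dtype=bool)
        for i in range(mu.size):
            dominates_i = ((mu >= mu[i]) & (misfit <= misfit[i])
                           & ((mu > mu[i]) | (misfit < misfit[i])))
            if dominates_i.any():     # some other model dominates model i
                keep[i] = False
        return keep

    front = pareto_mask(mu, misfit)
    print(f"{front.sum()} Pareto-optimal models out of {mu.size}")
    ```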

  3. Cell wall composition profiling of parasitic giant dodder (Cuscuta reflexa) and its hosts: a priori differences and induced changes.

    PubMed

    Johnsen, Hanne R; Striberny, Bernd; Olsen, Stian; Vidal-Melgosa, Silvia; Fangel, Jonatan U; Willats, William G T; Rose, Jocelyn K C; Krause, Kirsten

    2015-08-01

    Host plant penetration is the gateway to survival for holoparasitic Cuscuta and requires host cell wall degradation. Compositional differences of cell walls may explain why some hosts are amenable to such degradation while others can resist infection. Antibody-based techniques for comprehensive profiling of cell wall epitopes and cell wall-modifying enzymes were applied to several susceptible hosts and a resistant host of Cuscuta reflexa and to the parasite itself. Infected tissue of Pelargonium zonale contained high concentrations of de-esterified homogalacturonans in the cell walls, particularly adjacent to the parasite's haustoria. High pectinolytic activity in haustorial extracts and high expression levels of pectate lyase genes suggest that the parasite contributes directly to wall remodeling. Mannan and xylan concentrations were low in P. zonale and in five susceptible tomato introgression lines, but high in the resistant Solanum lycopersicum cv M82, and in C. reflexa itself. Knowledge of the composition of resistant host cell walls and the parasite's own cell walls is useful in developing strategies to prevent infection by parasitic plants. PMID:25808919

  4. Knowledge-based image bandwidth compression and enhancement

    NASA Astrophysics Data System (ADS)

    Saghri, John A.; Tescher, Andrew G.

    1987-01-01

    Techniques for incorporating a priori knowledge in the digital coding and bandwidth compression of image data are described and demonstrated. An algorithm for identifying and highlighting thin lines and point objects prior to coding is presented, and the precoding enhancement of a slightly smoothed version of the image is shown to be more effective than enhancement of the original image. Also considered are readjustment of the local distortion parameter and variable-block-size coding. The line-segment criteria employed in the classification are listed in a table, and sample images demonstrating the effectiveness of the enhancement techniques are presented.

  5. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial

    PubMed Central

    Tavernier, Elsa; Giraudeau, Bruno

    2015-01-01

    We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power was < 60%, as compared with the 80% nominal power); 41%, 16% and 6%, respectively, were overpowered (i.e., with real power > 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined. PMID:26173007
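
    A sketch of the retro-fitting idea for the continuous-outcome case, using the standard two-sample normal approximation: compute the per-group sample size from an assumed standard deviation, then the power actually achieved when the true SD differs by a relative error. All numbers are illustrative, not taken from the paper:

    ```python
    from scipy.stats import norm

    alpha, nominal_power = 0.05, 0.80
    delta, sd_assumed = 5.0, 10.0          # treatment effect, assumed SD
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(nominal_power)

    # Per-group sample size for a two-sample comparison of means.
    n = 2 * ((z_a + z_b) * sd_assumed / delta) ** 2

    for rel_err in (-0.2, 0.0, 0.2, 0.5):  # relative error in the assumed SD
        sd_true = sd_assumed * (1 + rel_err)
        real_power = norm.cdf(delta / (sd_true * (2 / n) ** 0.5) - z_a)
        print(f"SD error {rel_err:+.0%}: real power = {real_power:.2f}")
    ```

    Underestimating the SD by 20% pushes the real power well below the 80% nominal value, which is the underpowering mechanism the abstract quantifies.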

  6. Star Identification Without Attitude Knowledge: Testing with X-Ray Timing Experiment Data

    NASA Technical Reports Server (NTRS)

    Ketchum, Eleanor

    1997-01-01

    As the budget for the scientific exploration of space shrinks, the need for more autonomous spacecraft increases. For a spacecraft with a star tracker, the ability to determine attitude autonomously from a lost-in-space state requires the capability to identify the stars in the field of view of the tracker. Although there have been efforts to produce autonomous star trackers which perform this function internally, many programs cannot afford these sensors. The author previously presented a method for identifying stars without a priori attitude knowledge, specifically targeted at onboard computers as it minimizes the necessary computer storage. The method has previously been tested with simulated data. This paper provides results of star identification without a priori attitude knowledge using flight data from two 8-by-8-degree charge-coupled device star trackers onboard the X-Ray Timing Experiment.
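
    The key invariant behind attitude-free star identification is that inter-star angular separations do not change under rotation, so measured pairs can be matched against a catalogue of pair angles. A toy sketch with an invented three-star catalogue (flight implementations compress and index the pair list to minimize storage, which is the point the abstract emphasizes):

    ```python
    import numpy as np

    def unit(v):
        v = np.asarray(v, dtype=float)
        return v / np.linalg.norm(v)

    catalog = {1: unit([1, 0, 0]), 2: unit([0.9, 0.1, 0]), 3: unit([0, 1, 0.2])}

    # Catalogue of inter-star angles (radians) for all star pairs.
    pairs = {(i, j): np.arccos(np.clip(catalog[i] @ catalog[j], -1, 1))
             for i in catalog for j in catalog if i < j}

    def match(u1, u2, tol=1e-3):
        """Catalogue pairs whose angle matches the measured star pair."""
        ang = np.arccos(np.clip(unit(u1) @ unit(u2), -1, 1))
        return [ij for ij, a in pairs.items() if abs(a - ang) < tol]

    print(match(catalog[1], catalog[2]))   # a clean match returns [(1, 2)]
    ```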

  7. A priori calculations of pKa's for organic compounds in water. The pKa of ethane

    SciTech Connect

    Jorgensen, W.L.; Briggs, J.M.; Gao, J.

    1987-10-28

    The enduring fascination of organic chemists with acidities and basicities reflects the fundamental importance of these concepts in understanding organic reactivity. Developing scales of aqueous acidities for weak organic acids is challenging in view of the need for extrapolations from organic solvents to water, ion-pairing and aggregation effects for organometallic compounds, and the derivation of thermodynamic quantities from kinetic measurements. The problems are reflected in the experimental ranges for the pKa's of the simplest alkanes, methane and ethane, which span values from 40 to 60. In the present communication, the authors demonstrate how simulation methodology can be used to obtain a priori predictions for the relative pKa's of organic compounds in water. The first applications are for the pKa's of acetonitrile and ethane relative to methanethiol.
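
    The standard thermodynamic relation that converts simulated free-energy differences into relative acidities (the general basis for such predictions, not necessarily this paper's exact protocol) is:

    ```latex
    % Relative pKa of acids AH and BH from their aqueous deprotonation free
    % energies \Delta G, obtained via free-energy simulations and a
    % thermodynamic cycle:
    \Delta \mathrm{p}K_a
      = \mathrm{p}K_a(\mathrm{AH}) - \mathrm{p}K_a(\mathrm{BH})
      = \frac{\Delta G_{\mathrm{AH}} - \Delta G_{\mathrm{BH}}}{2.303\,RT}
    % At T = 298 K, 2.303 RT is about 1.36 kcal/mol per pKa unit, so an
    % 8 kcal/mol free-energy difference corresponds to roughly 6 pKa units.
    ```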

  8. Feasibility of improving a priori regional climate model estimates of Greenland ice sheet surface mass loss through assimilation of measured ice surface temperatures

    NASA Astrophysics Data System (ADS)

    Navari, M.; Margulis, S. A.; Bateni, S. M.; Tedesco, M.; Alexander, P.; Fettweis, X.

    2016-01-01

    The Greenland ice sheet (GrIS) has been the focus of climate studies due to its considerable impact on sea level rise. Accurate estimates of surface mass fluxes would contribute to understanding the cause of its recent changes and would help to better estimate the past, current and future contribution of the GrIS to sea level rise. Though the estimates of the GrIS surface mass fluxes have improved significantly over the last decade, there is still considerable disparity between the results from different methodologies (e.g., Rae et al., 2012; Vernon et al., 2013). The data assimilation approach can merge information from different methodologies in a consistent way to improve the GrIS surface mass fluxes. In this study, an ensemble batch smoother data assimilation approach was developed to assess the feasibility of generating a reanalysis estimate of the GrIS surface mass fluxes via integrating remotely sensed ice surface temperature measurements with a regional climate model (a priori) estimate. The performance of the proposed methodology for generating an improved posterior estimate was investigated within an observing system simulation experiment (OSSE) framework using synthetically generated ice surface temperature measurements. The results showed that assimilation of the ice surface temperature time series was able to overcome uncertainties in the near-surface meteorological forcing variables that drive the GrIS surface processes. Our findings show that the proposed methodology is able to generate posterior reanalysis estimates of the surface mass fluxes that are in good agreement with the synthetic true estimates. The results also showed that the proposed data assimilation framework improves the root-mean-square error of the posterior estimates of runoff, sublimation/evaporation, surface condensation, and surface mass loss fluxes by 61%, 64%, 76%, and 62%, respectively, over the nominal a priori climate model estimates.
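
    A minimal ensemble update in the spirit of the batch smoother described above: a prior ensemble of surface-mass-flux states is corrected with one ice-surface-temperature observation using ensemble covariances. The linear state-to-temperature relation, the numbers, and the single-observation setup are placeholder assumptions, not the paper's configuration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_ens = 100
    state = rng.normal(-2.0, 0.8, n_ens)   # prior surface-mass-flux ensemble
    # Hypothetical relation between state and ice surface temperature (K):
    obs_pred = 255.0 + 1.5 * state + rng.normal(0, 0.2, n_ens)

    y_obs, r_var = 252.5, 0.5 ** 2         # observation and its error variance

    # Kalman gain from ensemble (co)variances.
    k_gain = np.cov(state, obs_pred)[0, 1] / (np.var(obs_pred, ddof=1) + r_var)

    perturbed = y_obs + rng.normal(0, np.sqrt(r_var), n_ens)
    posterior = state + k_gain * (perturbed - obs_pred)
    print(f"prior mean {state.mean():+.2f} -> "
          f"posterior mean {posterior.mean():+.2f}")
    ```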

  9. Knowledge Management, Codification and Tacit Knowledge

    ERIC Educational Resources Information Center

    Kimble, Chris

    2013-01-01

    Introduction: This article returns to a theme addressed in Vol. 8(1) October 2002 of the journal: knowledge management and the problem of managing tacit knowledge. Method: The article is primarily a review and analysis of the literature associated with the management of knowledge. In particular, it focuses on the works of a group of economists who…

  10. Doctoring the Knowledge Worker

    ERIC Educational Resources Information Center

    Tennant, Mark

    2004-01-01

    In this paper I examine the impact of the new 'knowledge economy' on contemporary doctoral education. I argue that the knowledge economy promotes a view of knowledge and knowledge workers that fundamentally challenges the idea of a university as a community of autonomous scholars transmitting and adding to society's 'stock of knowledge'. The paper…

  11. Using CCTV at Nutfield Priory.

    ERIC Educational Resources Information Center

    Faragher, Kenneth

    1980-01-01

    The use of closed circuit television to teach language and speech to congenitally deaf secondary school children in England is discussed. The equipment enables the staff to record material and add subtitles before showing it to the students. (PHR)

  12. Overview of Knowledge Management.

    ERIC Educational Resources Information Center

    Serban, Andreea M.; Luan, Jing

    2002-01-01

    Defines knowledge management, its components, processes, and outcomes. Addresses the importance of knowledge management for higher education in general and for institutional research in particular. (EV)

  13. Advancing techniques to constrain the geometry of the seismic rupture plane on subduction interfaces a priori: Higher-order functional fits

    USGS Publications Warehouse

    Hayes, G.P.; Wald, D.J.; Keranen, K.

    2009-01-01

    Ongoing developments in earthquake source inversions incorporate nonplanar fault geometries as inputs to the inversion process, improving previous approaches that relied solely on planar fault surfaces. This evolution motivates advancing the existing framework for constraining fault geometry, particularly in subduction zones where plate boundary surfaces that host highly hazardous earthquakes are clearly nonplanar. Here, we improve upon the existing framework for the constraint of the seismic rupture plane of subduction interfaces by incorporating active seismic and seafloor sediment thickness data with existing independent data sets and inverting for the most probable nonplanar subduction geometry. Constraining the rupture interface a priori with independent geological and seismological information reduces the uncertainty in the derived earthquake source inversion parameters over models that rely on simpler assumptions, such as the moment tensor inferred fault plane. Examples are shown for a number of well-constrained global locations. We expand the coverage of previous analyses to a more uniform global data set and show that even in areas of sparse data this approach is able to accurately constrain the approximate subduction geometry, particularly when aided with the addition of data from local active seismic surveys. In addition, we show an example of the integration of many two-dimensional profiles into a three-dimensional surface for the Sunda subduction zone and introduce the development of a new global three-dimensional subduction interface model: Slab1.0. © 2009 by the American Geophysical Union.

  14. ICA analysis of fMRI with real-time constraints: an evaluation of fast detection performance as function of algorithms, parameters and a priori conditions

    PubMed Central

    Soldati, Nicola; Calhoun, Vince D.; Bruzzone, Lorenzo; Jovicich, Jorge

    2013-01-01

    Independent component analysis (ICA) techniques offer a data-driven possibility to analyze brain functional MRI data in real-time. Typical ICA methods used in functional magnetic resonance imaging (fMRI), however, have been until now mostly developed and optimized for the off-line case in which all data is available. Real-time experiments are ill-posed for ICA in that several constraints are added: limited data, limited analysis time and dynamic changes in the data and computational speed. Previous studies have shown that particular choices of ICA parameters can be used to monitor real-time fMRI (rt-fMRI) brain activation, but it is unknown how other choices would perform. In this rt-fMRI simulation study we investigate and compare the performance of 14 different publicly available ICA algorithms systematically sampling different growing window lengths (WLs), model order (MO) as well as a priori conditions (none, spatial or temporal). Performance is evaluated by computing the spatial and temporal correlation to a target component as well as computation time. Four algorithms are identified as best performing (constrained ICA, fastICA, amuse, and evd), with their corresponding parameter choices. Both spatial and temporal priors are found to provide equal or improved performances in similarity to the target compared with their off-line counterpart, with greatly reduced computation costs. This study suggests parameter choices that can be further investigated in a sliding-window approach for a rt-fMRI experiment. PMID:23378835
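
    A sketch of the growing-window evaluation, assuming synthetic data and scikit-learn's FastICA as a stand-in for the 14 algorithms compared in the study: run ICA on an increasing portion of a time-by-voxel matrix and track the best temporal correlation to a known target time course. The window lengths, model order, and data generation are toy choices:

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(2)
    n_t, n_vox = 200, 500
    target = np.sin(np.linspace(0, 8 * np.pi, n_t))   # target time course
    spatial = rng.normal(size=n_vox) * (rng.random(n_vox) < 0.1)
    data = np.outer(target, spatial) + rng.normal(0, 1.0, (n_t, n_vox))

    for wl in (50, 100, 150, 200):                    # growing window lengths
        ica = FastICA(n_components=5, random_state=0, max_iter=1000)
        sources = ica.fit_transform(data[:wl])        # (wl, n_components)
        best = max(abs(np.corrcoef(target[:wl], s)[0, 1]) for s in sources.T)
        print(f"WL={wl:3d}: best |r| to target = {best:.2f}")
    ```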

  15. Optimal site-centered electronic structure basis set from a displaced-center expansion: Improved results via a priori estimates of saddle points in the density

    NASA Astrophysics Data System (ADS)

    Alam, Aftab; Johnson, D. D.

    2009-09-01

    Site-centered, electronic-structure methods use an expansion inside nonoverlapping “muffin-tin” (MT) spheres plus an interstitial basis set. As the boundary separating the more spherical from nonspherical density between atoms, the “saddle-point” radii (SPR) in the density provide an optimal spherical region for expanding in spherical harmonics, as used in augmented plane wave, muffin-tin orbital, and multiple-scattering [Korringa, Kohn, and Rostoker (KKR)] methods. These MT-SPR guarantee unique, convex Voronoi polyhedra at each site, in distinction to Bader topological cells. We present a numerically fast, two-center expansion to find SPR a priori from overlapping atomic charge densities, valid also for disordered alloys. We adopt this MT-SPR basis for KKR in the atomic sphere approximation and study (dis)ordered alloys with large differences in atomic size (fcc CoPt and bcc CrW). For this simple and unique improvement, we find formation energies and structural parameters in strikingly better agreement with more exact methods or experiment, and resolve issues with former results.
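
    A one-dimensional toy version of the saddle-point-radius construction: overlap two spherical atomic densities and locate the density minimum along the interatomic axis; the distance from each nucleus to that point is its SPR. The exponential profiles and the interatomic distance are invented stand-ins for real atomic charge densities:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    d = 5.0                                    # interatomic distance (bohr)
    rho_a = lambda r: 8.0 * np.exp(-2.0 * r)   # "atom A" density profile
    rho_b = lambda r: 3.0 * np.exp(-1.5 * r)   # "atom B" density profile

    # Total density along the A-B axis; its interior minimum is the saddle point.
    total = lambda s: rho_a(s) + rho_b(d - s)
    res = minimize_scalar(total, bounds=(0.1, d - 0.1), method="bounded")
    print(f"SPR(A) = {res.x:.3f} bohr, SPR(B) = {d - res.x:.3f} bohr")
    ```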

  16. Advancing techniques to constrain the geometry of the seismic rupture plane on subduction interfaces a priori: Higher-order functional fits

    NASA Astrophysics Data System (ADS)

    Hayes, Gavin P.; Wald, David J.; Keranen, Katie

    2009-09-01

    Ongoing developments in earthquake source inversions incorporate nonplanar fault geometries as inputs to the inversion process, improving previous approaches that relied solely on planar fault surfaces. This evolution motivates advancing the existing framework for constraining fault geometry, particularly in subduction zones where plate boundary surfaces that host highly hazardous earthquakes are clearly nonplanar. Here, we improve upon the existing framework for the constraint of the seismic rupture plane of subduction interfaces by incorporating active seismic and seafloor sediment thickness data with existing independent data sets and inverting for the most probable nonplanar subduction geometry. Constraining the rupture interface a priori with independent geological and seismological information reduces the uncertainty in the derived earthquake source inversion parameters over models that rely on simpler assumptions, such as the moment tensor inferred fault plane. Examples are shown for a number of well-constrained global locations. We expand the coverage of previous analyses to a more uniform global data set and show that even in areas of sparse data this approach is able to accurately constrain the approximate subduction geometry, particularly when aided with the addition of data from local active seismic surveys. In addition, we show an example of the integration of many two-dimensional profiles into a three-dimensional surface for the Sunda subduction zone and introduce the development of a new global three-dimensional subduction interface model: Slab1.0.
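
    The "higher-order functional fit" of a single trench-perpendicular profile can be pictured as fitting polynomials of increasing order to interface depth versus distance and comparing residuals; many such 2-D fits are then merged into a 3-D surface. The profile values here are invented for illustration:

    ```python
    import numpy as np

    dist = np.array([0., 50., 100., 150., 200., 250., 300.])  # km from trench
    depth = np.array([5., 12., 22., 35., 52., 74., 100.])     # depth (km)

    for order in (1, 2, 3):                   # planar to higher-order fits
        coef = np.polyfit(dist, depth, order)
        resid = depth - np.polyval(coef, dist)
        rms = np.sqrt(np.mean(resid ** 2))
        print(f"order {order}: RMS residual = {rms:.2f} km")
    ```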

  17. Knowledge Repository for FMEA-Related Knowledge

    NASA Astrophysics Data System (ADS)

    Cândea, Gabriela Simona; Kifor, Claudiu Vasile; Cândea, Ciprian

    2014-11-01

    This paper presents an innovative use of a knowledge system in the Failure Mode and Effects Analysis (FMEA) process, using an ontology to represent the knowledge. The knowledge system is built to support the multi-project work that is now common in any manufacturing or service provider, where knowledge must be retained and reused at the company level and not only at the project level. The system follows the FMEA methodology, and the validation of the concept is compliant with automotive industry standards such as those published by the Automotive Industry Action Group. Collaboration is assured through a web-based GUI that supports multiple-user access at any time.

  18. Knowledge and Its Enemies

    ERIC Educational Resources Information Center

    Kruk, Miroslav

    2007-01-01

    As libraries are the physical manifestations of knowledge, some reflection about the concept of knowledge would not be unjustified. In modern societies, knowledge plays such a central role that it requires some effort and imagination to understand on what grounds knowledge could be rejected. Karl Popper wrote about the open society and its enemies.…

  19. Knowledge Engineering and Education.

    ERIC Educational Resources Information Center

    Lopez, Antonio M., Jr.; Donlon, James

    2001-01-01

    Discusses knowledge engineering, computer software, and possible applications in the field of education. Highlights include the distinctions between data, information, and knowledge; knowledge engineering as a subfield of artificial intelligence; knowledge acquisition; data mining; ontology development for subject terms; cognitive apprentices; and…

  20. Making Programming Knowledge Explicit.

    ERIC Educational Resources Information Center

    Navrat, Pavol; Rozinajova, Viera

    1993-01-01

    Addresses the question of how to write computer programs using explicit knowledge and rules-based systems. Highlights include the knowledge representation tool; the knowledge base on programming; and results of experiments that tested the system. Appendices include the set of rules for the experimental knowledge base and details of two…

  1. Facilitating Collaborative Knowledge Building

    ERIC Educational Resources Information Center

    Hmelo-Silver, Cindy E.; Barrows, Howard S.

    2008-01-01

    This article describes a detailed analysis of knowledge building in a problem-based learning group. Knowledge building involves increasing the collective knowledge of a group through social discourse. For knowledge building to occur in the classroom, the teacher needs to create opportunities for constructive discourse in order to support student…

  2. Tacit Knowledge: Revisiting the Epistemology of Knowledge

    ERIC Educational Resources Information Center

    Lejeune, Michel

    2011-01-01

    The concept of tacit knowledge encompasses all of the intricacy of the different experiences that people acquire over time, and which they utilize and bring to bear in carrying out tasks effectively, reacting to unforeseen circumstances, or innovating. The intuitive nature of tacit knowledge, its particular context, and the difficulty of…

  3. Knowledge and Policy: Research and Knowledge Transfer

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2007-01-01

    Knowledge transfer (KT) is the emergent "third sector" of higher education activity--alongside research and teaching. Its commercialization origins are evidenced in its concerns to extract maximum value from research, and in the policy push to make research-based knowledge trapped in disciplinary silos more responsive to the growing information…

  4. Testing the forward modeling approach in asteroseismology. I. Seismic solutions for the hot B subdwarf Balloon 090100001 with and without a priori mode identification

    NASA Astrophysics Data System (ADS)

    Van Grootel, V.; Charpinet, S.; Fontaine, G.; Brassard, P.; Green, E. M.; Chayer, P.; Randall, S. K.

    2008-09-01

    Context: Balloon 090100001, the brightest of the known pulsating hot B subdwarfs, exhibits simultaneously both short- and long-period pulsation modes, and shows relatively large amplitudes for its dominant modes. For these reasons, it has been studied extensively over the past few years, including a successful experiment carried out at the Canada-France-Hawaii Telescope to pin down or constrain the value of the degree index ℓ of several pulsation modes through multicolor photometry. Aims: The primary goal of this paper is to take advantage of such partial mode identification to test the robustness of our standard approach to the asteroseismology of pulsating subdwarf B stars. The latter is based on the forward approach whereby a model that best matches the observed periods is searched for in parameter space with no a priori assumption about mode identification. When successful, this method leads to the determination of the global structural parameters of the pulsator. As a bonus, it also leads, after the fact, to complete mode identification. For the first time, with the availability of partial mode identification for Balloon 090100001, we are able to evaluate the sensitivity of the inferred seismic model to possible uncertainty in mode identification. Methods: We carry out a number of exercises based on the double optimization technique that we developed within the framework of the forward modeling approach in asteroseismology. We use the set of ten periods corresponding to the independent pulsation modes for which values of ℓ have been either formally identified or constrained through multicolor photometry in Balloon 090100001. These exercises differ in that they assume different a priori mode identifications. Results: Our primary result is that the asteroseismic solution proves very robust, whether or not external constraints on the values of the degree ℓ are used. Although this may come as a small surprise, the test proves to be conclusive, and small

  5. A priori and a posteriori investigations for developing large eddy simulations of multi-species turbulent mixing under high-pressure conditions

    SciTech Connect

    Borghesi, Giulio; Bellan, Josette

    2015-03-15

    A Direct Numerical Simulation (DNS) database was created representing mixing of species under high-pressure conditions. The configuration considered is that of a temporally evolving mixing layer. The database was examined and analyzed for the purpose of modeling some of the unclosed terms that appear in the Large Eddy Simulation (LES) equations. Several metrics are used to understand the LES modeling requirements. First, a statistical analysis of the DNS-database large-scale flow structures was performed to provide a metric for probing the accuracy of the proposed LES models, as the flow fields obtained from accurate LESs should contain structures of morphology statistically similar to those observed in the filtered-and-coarsened DNS (FC-DNS) fields. To characterize the morphology of the large-scale structures, the Minkowski functionals of the iso-surfaces were evaluated for two different fields: the second invariant of the rate of deformation tensor and the irreversible entropy production rate. To remove the presence of the small flow scales, both of these fields were computed using the FC-DNS solutions. It was found that the large-scale structures of the irreversible entropy production rate exhibit higher morphological complexity than those of the second invariant of the rate of deformation tensor, indicating that the burden of modeling will be on recovering the thermodynamic fields. Second, to evaluate the physical effects which must be modeled at the subfilter scale, an a priori analysis was conducted. This a priori analysis, conducted in the coarse-grid LES regime, revealed that standard closures for the filtered pressure, the filtered heat flux, and the filtered species mass fluxes, in which a filtered function of a variable is equal to the function of the filtered variable, may no longer be valid for the high-pressure flows considered in this study. The terms requiring modeling are the filtered pressure, the filtered heat flux, the filtered pressure work
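
    The essence of such an a priori test can be shown in one dimension: for a nonlinear function of a resolved field, compare the filtered function with the function of the filtered field; their difference is exactly the subfilter-scale contribution that requires a model. The synthetic signal, top-hat filter, and quadratic nonlinearity below are illustrative, not the DNS database:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, width = 4096, 33
    u = np.cumsum(rng.normal(size=n))        # rough multi-scale 1-D signal
    u -= u.mean()

    kernel = np.ones(width) / width          # top-hat (box) filter
    bar = lambda f: np.convolve(f, kernel, mode="same")

    f = lambda v: v * v                      # simple nonlinearity
    sfs = bar(f(u)) - f(bar(u))              # subfilter-scale term
    ratio = np.std(sfs) / np.std(bar(f(u)))
    print(f"rms(subfilter term) / rms(filtered term) = {ratio:.3f}")
    ```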

  6. The significance of content knowledge for informal reasoning regarding socioscientific issues: Applying genetics knowledge to genetic engineering issues

    NASA Astrophysics Data System (ADS)

    Sadler, Troy D.; Zeidler, Dana L.

    2005-01-01

    This study focused on informal reasoning regarding socioscientific issues. It sought to explore how content knowledge influenced the negotiation and resolution of contentious and complex scenarios based on genetic engineering. Two hundred and sixty-nine students drawn from undergraduate natural science and nonnatural science courses completed a quantitative test of genetics concepts. Two subsets (n = 15 for each group) of the original sample representing divergent levels of content knowledge participated in individual interviews, during which they articulated positions, rationales, counterpositions, and rebuttals in response to three gene therapy scenarios and three cloning scenarios. A mixed-methods approach was used to examine the effects of content knowledge on the use of informal reasoning patterns and the quality of informal reasoning. Participants from both groups employed the same general patterns of informal reasoning. Data did indicate that differences in content knowledge were related to variations in informal reasoning quality. Participants, with more advanced understandings of genetics, demonstrated fewer instances of reasoning flaws, as defined by a priori criteria, and were more likely to incorporate content knowledge in their reasoning patterns than participants with more naïve understandings of genetics. Implications for instruction and future research are discussed.

  7. Documentation and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel; Moseley, Warren

    1990-01-01

    Traditional approaches to knowledge acquisition have focused on interviews. An alternative focuses on the documentation associated with a domain. Adopting a documentation approach provides some advantages during familiarization. A knowledge management tool was constructed to gain these advantages.

  8. An approach for the long-term 30-m land surface snow-free albedo retrieval from historic Landsat surface reflectance and MODIS-based a priori anisotropy knowledge

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Land surface albedo has been recognized by the Global Terrestrial Observing System (GTOS) as an essential climate variable crucial for accurate modeling and monitoring of the Earth’s radiative budget. While global climate studies can leverage albedo datasets from MODIS, VIIRS, and other coarse-reso...

  9. Assessing Teachers' Practical Knowledge.

    ERIC Educational Resources Information Center

    Beijaard, Douwe; Verloop, Nico

    1996-01-01

    This exploration of the assessment of teachers' practical knowledge begins with a description of the characteristics of teacher knowledge and a discussion of the importance of teacher views and teaching as a profession. The possibilities and problems associated with assessing teacher practical knowledge are discussed. (SLD)

  10. Knowledge Retrieval Solutions.

    ERIC Educational Resources Information Center

    Khan, Kamran

    1998-01-01

    Excalibur RetrievalWare offers true knowledge retrieval solutions. Its fundamental technologies, Adaptive Pattern Recognition Processing and Semantic Networks, have capabilities for knowledge discovery and knowledge management of full-text, structured and visual information. The software delivers a combination of accuracy, extensibility,…

  11. Building Background Knowledge

    ERIC Educational Resources Information Center

    Neuman, Susan B.; Kaefer, Tanya; Pinkham, Ashley

    2014-01-01

    This article makes a case for the importance of background knowledge in children's comprehension. It suggests that differences in background knowledge may account for differences in understanding text for low- and middle-income children. It then describes strategies for building background knowledge in the age of Common Core standards.

  12. Activating Event Knowledge

    ERIC Educational Resources Information Center

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or…

  13. Knowledge, People, and Risk

    NASA Technical Reports Server (NTRS)

    Rogers, Edward W.

    2008-01-01

    NASA's mandate is to take risks to go into space while applying its best knowledge. NASA's knowledge is the result of scientific insights from research, engineering wisdom from experience, project management skills, safety and team consciousness, and institutional support and collaboration. This presentation highlights NASA's organizational knowledge, communication and growth efforts.

  14. Knowledge Discovery in Databases.

    ERIC Educational Resources Information Center

    Norton, M. Jay

    1999-01-01

    Knowledge discovery in databases (KDD) revolves around the investigation and creation of knowledge, processes, algorithms, and mechanisms for retrieving knowledge from data collections. The article is an introductory overview of KDD. The rationale and environment of its development and applications are discussed. Issues related to database design…

  15. Relationships between Knowledge(s): Implications for "Knowledge Integration"

    ERIC Educational Resources Information Center

    Evering, Brigitte

    2012-01-01

    This article contributes to a critical dialogue about what is currently called "knowledge integration" in environmental research and related educational programming. Indigenous understandings in particular are seen as offering (re)new(ed) ways of thinking that have and will lead to innovative practices for addressing complex environmental issues.…

  16. 3-D multiobservable probabilistic inversion for the compositional and thermal structure of the lithosphere and upper mantle. I: a priori petrological information and geophysical observables

    NASA Astrophysics Data System (ADS)

    Afonso, J. C.; Fullea, J.; Griffin, W. L.; Yang, Y.; Jones, A. G.; Connolly, J. A. D.; O'Reilly, S. Y.

    2013-05-01

    Traditional inversion techniques applied to the problem of characterizing the thermal and compositional structure of the upper mantle are not well suited to deal with the nonlinearity of the problem, the trade-off between temperature and compositional effects on wave velocities, the nonuniqueness of the compositional space, and the dissimilar sensitivities of physical parameters to temperature and composition. Probabilistic inversions, on the other hand, offer a powerful formalism to cope with all these difficulties, while allowing for an adequate treatment of the intrinsic uncertainties associated with both data and physical theories. This paper presents a detailed analysis of the two most important elements controlling the outputs of probabilistic (Bayesian) inversions for temperature and composition of the Earth's mantle, namely the a priori information on model parameters, ρ(m), and the likelihood function, L(m). The former is mainly controlled by our current understanding of lithosphere and mantle composition, while the latter conveys information on the observed data, their uncertainties, and the physical theories used to relate model parameters to observed data. The benefits of combining specific geophysical datasets (Rayleigh and Love dispersion curves, body wave tomography, magnetotelluric, geothermal, petrological, gravity, elevation, and geoid), and their effects on L(m), are demonstrated by analyzing their individual and combined sensitivities to composition and temperature as well as their observational uncertainties. The dependence of bulk density, electrical conductivity, and seismic velocities to major-element composition is systematically explored using Monte Carlo simulations. We show that the dominant source of uncertainty in the identification of compositional anomalies within the lithosphere is the intrinsic nonuniqueness in compositional space. A general strategy for defining ρ(m) is proposed based on statistical analyses of a large database
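
    The Bayesian structure described above can be condensed to "posterior ∝ ρ(m) × L(m)". A minimal random-walk Metropolis sketch for a single scalar parameter, with Gaussian stand-ins for the petrological prior and the multi-observable likelihood (all numbers are placeholders):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def log_prior(m):      # a priori petrological information, rho(m)
        return -0.5 * ((m - 1300.0) / 150.0) ** 2

    def log_like(m):       # data + physical theories, L(m)
        return -0.5 * ((m - 1380.0) / 60.0) ** 2

    m, samples = 1300.0, []
    for _ in range(20000):
        prop = m + rng.normal(0, 25.0)                 # random-walk proposal
        if np.log(rng.random()) < (log_prior(prop) + log_like(prop)
                                   - log_prior(m) - log_like(m)):
            m = prop
        samples.append(m)

    post = np.array(samples[5000:])                    # discard burn-in
    print(f"posterior: {post.mean():.0f} +/- {post.std():.0f}")
    ```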

  17. Earthquake relocation using a 3D a-priori geological velocity model from the western Alps to Corsica: Implication for seismic hazard

    NASA Astrophysics Data System (ADS)

    Béthoux, Nicole; Theunissen, Thomas; Beslier, Marie-Odile; Font, Yvonne; Thouvenot, François; Dessa, Jean-Xavier; Simon, Soazig; Courrioux, Gabriel; Guillen, Antonio

    2016-02-01

    The region between the inner zones of the Alps and Corsica juxtaposes an overthickened crust with an oceanic domain, which makes it difficult to ascertain the focal depth of seismic events using routine location codes and average 1D velocity models. The aim of this article is to show that, even with a rather loose monitoring network, accurate routine locations can be achieved by using realistic 3D modelling and advanced location techniques. Previous earthquake tomography studies cover the whole region with spatial resolutions of several tens of kilometres on land, but they fail to resolve the marine domain due to the absence of station coverage and sparse seismicity. To overcome these limitations, we first construct a 3D a-priori P and S velocity model integrating known geophysical and geological information. Significant progress has been achieved in the 3D numerical modelling of complex geological structures by the development of dedicated software packages (e.g., 3D GeoModeller), capable both of elaborating a 3D structural model from geological and geophysical constraints and, possibly, of refining it by inversion processes (Calcagno et al., 2008). Then, we build an arrival-time catalogue of 1500 events recorded from 2000 to 2011. Hypocentres are then located in this model using a numerical code based on the maximum intersection method (Font et al., 2004), updated by Theunissen et al. (2012), as well as another 3D location technique, the NonLinLoc software (Lomax and Curtis, 2001). The reduction of arrival-time residuals and uncertainties (dh, dz) with respect to classical 1D locations demonstrates the improved accuracy allowed by our approach and confirms the coherence of the 3D geological model built and used in this study. Our results are also compared with previous works that benefitted from the installation of dense temporary networks surrounding the studied epicentre area. The resulting 3D location catalogue allows us to improve the regional seismic hazard assessment.

  18. Induction as Knowledge Integration

    NASA Technical Reports Server (NTRS)

    Smith, Benjamin D.; Rosenbloom, Paul S.

    1996-01-01

    Two key issues for induction algorithms are the accuracy of the learned hypothesis and the computational resources consumed in inducing that hypothesis. One of the most promising ways to improve performance along both dimensions is to make use of additional knowledge. Multi-strategy learning algorithms tackle this problem by employing several strategies for handling different kinds of knowledge in different ways. However, integrating knowledge into an induction algorithm can be difficult when the new knowledge differs significantly from the knowledge the algorithm already uses. In many cases the algorithm must be rewritten. This paper presents KII, a Knowledge Integration framework for Induction that provides a uniform mechanism for integrating knowledge into induction. In theory, arbitrary knowledge can be integrated with this mechanism, but in practice the knowledge representation language determines both the knowledge that can be integrated and the costs of integration and induction. By instantiating KII with various set representations, algorithms can be generated at different trade-off points along these dimensions. One instantiation of KII, called RS-KII, is presented that can implement hybrid induction algorithms, depending on which knowledge it utilizes. RS-KII is demonstrated to implement AQ-11, as well as a hybrid algorithm that utilizes a domain theory and noisy examples. Other algorithms are also possible.

  19. Knowledge to Manage the Knowledge Society

    ERIC Educational Resources Information Center

    Minati, Gianfranco

    2012-01-01

    Purpose: The purpose of this research is to make evident the inadequateness of concepts and language based on industrial knowledge still used in current practices by managers to cope with problems of the post-industrial societies characterised by non-linear process of emergence and acquisition of properties. The purpose is to allow management to…

  20. Reasoning about procedural knowledge

    NASA Technical Reports Server (NTRS)

    Georgeff, M. P.

    1985-01-01

    A crucial aspect of automated reasoning about space operations is that knowledge of the problem domain is often procedural in nature - that is, the knowledge is often in the form of sequences of actions or procedures for achieving given goals or reacting to certain situations. In this paper a system is described that explicitly represents and reasons about procedural knowledge. The knowledge representation used is sufficiently rich to describe the effects of arbitrary sequences of tests and actions, and the inference mechanism provides a means for directly using this knowledge to reach desired operational goals. Furthermore, the representation has a declarative semantics that provides for incremental changes to the system, rich explanatory capabilities, and verifiability. The approach also provides a mechanism for reasoning about the use of this knowledge, thus enabling the system to choose effectively between alternative courses of action.

  1. Interactive knowledge acquisition tools

    NASA Technical Reports Server (NTRS)

    Dudziak, Martin J.; Feinstein, Jerald L.

    1987-01-01

    The problems of designing practical tools to aid the knowledge engineer and general applications used in performing knowledge acquisition tasks are discussed. A particular approach was developed for the class of knowledge acquisition problem characterized by situations where acquisition and transformation of domain expertise are often bottlenecks in systems development. An explanation is given on how the tool and underlying software engineering principles can be extended to provide a flexible set of tools that allow the application specialist to build highly customized knowledge-based applications.

  2. Recording Scientific Knowledge

    SciTech Connect

    Bowker, Geof

    2006-01-09

    The way we record knowledge, and the web of technical, formal, and social practices that surrounds it, inevitably affects the knowledge that we record. The ways we hold knowledge about the past - in handwritten manuscripts, in printed books, in file folders, in databases - shape the kind of stories we tell about that past. In this talk, I look at how over the past two hundred years, information technology has affected the nature and production of scientific knowledge. Further, I explore ways in which the emergent new cyberinfrastructure is changing our relationship to scientific practice.

  3. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design databases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).

  4. Global land cover knowledge database for supporting optical remote sensing satellite intelligent imaging

    NASA Astrophysics Data System (ADS)

    Yan, Ming; Wang, Zhiyong; He, Shaoshuai; Wu, Fei; Yu, Bingyang

    2014-05-01

    With remote sensing satellites of high spatial, spectral, radiometric and temporal resolution being put into wide use, adaptive intelligent observation is becoming an important function of a new generation of satellite remote sensing systems. To realize this adaptive intelligent observation function, the first step is to construct a priori land cover knowledge and to predict the land cover types, and their reflectance values, of the imaging areas. During satellite imaging, the optimal camera settings, including the on-orbit CCD integration time, electrical gain, and image compression ratio, are estimated according to the relationship of apparent radiance with sun illumination conditions and land surface reflectance. In this paper, Medium Resolution Imaging Spectrometer (MERIS) bimonthly mean land surface reflectance imagery and the 2009 GlobCover map are used to build the global land cover and reflectance knowledge database. The land cover types include the cropland, urban, grassland, forest, desert, soil, water, and ice classes, and the mean reflectance values in the blue, green, red, and near-infrared spectral bands were calculated for the various seasons. The global land cover and reflectance database has been integrated into the Beijing-1 small satellite mission programming system as a priori landscape knowledge of the imaging areas, used to estimate the proper electrical gain of the multispectral camera. After this intelligent observation mode was adopted on the Beijing-1 small satellite, the entropy and SNR of the multispectral imagery it acquired increased greatly.

  5. Leveraging intelligent agents for knowledge discovery from heterogeneous healthcare data repositories.

    PubMed

    Zaidi, Syed Zahid Hassan; Abidi, Syed Sibte Raza; Manickam, Selvakumar

    2002-01-01

    This paper presents a case for an intelligent-agent-based framework for knowledge discovery in a distributed healthcare environment comprising multiple heterogeneous healthcare data repositories. Data-mediated knowledge discovery, especially from multiple heterogeneous data resources, is a tedious process and imposes significant operational constraints on end-users. We demonstrate that autonomous, reactive and proactive intelligent agents provide an opportunity to generate end-user oriented, packaged, value-added decision-support/strategic planning services for healthcare professionals, managers and policy makers, without the need for a priori technical knowledge. Since effective healthcare is grounded in good communication, experience sharing, continuous learning and proactive actions, we use intelligent agents to implement an Agent based Data Mining Infostructure that provides a suite of healthcare-oriented decision-support/strategic planning services. PMID:15460713

  6. [Knowledge management (I)].

    PubMed

    Ruiz Moreno, J; Cruz Martín Delgado, M

    2001-09-01

    Beyond being merely fashionable, knowledge management (KM) is in itself a powerful strategic weapon for managing organizations. In the first part, the authors analyze strategic concepts related to management, emphasizing the link between KM and competitive advantage. Finally, the authors tie KM to the learning process ("tacit knowledge", "socialization", "externalization", "combination", and "internalization"). PMID:12150129

  7. [Acquisition of arithmetic knowledge].

    PubMed

    Fayol, Michel

    2008-01-01

    The focus of this paper is contemporary research on the counting and arithmetic competencies that emerge during infancy, the preschool years, and elementary school. I provide a brief overview of the evolution of children's conceptual knowledge of arithmetic, the acquisition and use of counting, and how they solve simple arithmetic problems (e.g., 4 + 3). PMID:18198117

  8. Public Knowledge Cultures

    ERIC Educational Resources Information Center

    Peters, Michael A.; Besley, A. C.

    2006-01-01

    This article first reviews claims for the knowledge economy in terms of excludability, rivalry, and transparency indicating the way that digital goods behave differently from other commodities. In the second section it discusses the theory of "public knowledge cultures" starting from the primacy of practice based on Marx, Wittgenstein and…

  9. The Bridge of Knowledge

    ERIC Educational Resources Information Center

    Dong, Yu Ren

    2014-01-01

    Although many English language learners (ELLs) in the United States have knowledge gaps that make it hard for them to master high-level content and skills, ELLs also often have background knowledge relevant to school learning that teachers neglect to access, this author argues. In the Common Core era, with ELLs being the fastest growing population…

  10. Knowledge representation for commonality

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1990-01-01

    Domain-specific knowledge necessary for commonality analysis falls into two general classes: commonality constraints and costing information. Notations for encoding such knowledge should be powerful and flexible and should appeal to the domain expert. The notations employed by the Commonality Analysis Problem Solver (CAPS) analysis tool are described. Examples are given to illustrate the main concepts.

  11. The Knowledge Bluff

    ERIC Educational Resources Information Center

    Vanderburg, Willem H.

    2007-01-01

    Our knowledge "system" is built up from disciplines and specialties as its components, which are "wired" by patterns of collaboration that constitute its organization. The intellectual autonomy of these components prevents this knowledge system from adequately accounting for what we have gradually discovered during the past 50 years: In human…

  12. Marine Education Knowledge Inventory.

    ERIC Educational Resources Information Center

    Hounshell, Paul B.; Hampton, Carolyn

    This 35-item, multiple-choice Marine Education Knowledge Inventory was developed for use in upper elementary/middle schools to measure a student's knowledge of marine science. Content of test items is drawn from oceanography, ecology, earth science, navigation, and the biological sciences (focusing on marine animals). Steps in the construction of…

  13. Organizational Knowledge Management Structure

    ERIC Educational Resources Information Center

    Walczak, Steven

    2005-01-01

    Purpose: To propose and evaluate a novel management structure that encourages knowledge sharing across an organization. Design/methodology/approach: The extant literature on the impact of organizational culture and its link to management structure is examined and used to develop a new knowledge sharing management structure. Roadblocks to…

  14. Knowledge Production and Utilization.

    ERIC Educational Resources Information Center

    Beal, George M.; Meehan, Peter

    The study of knowledge production and utilization deals with the problems of how to translate theoretical concepts and knowledge into practical solutions to people's problems. Many criticisms have been leveled at aspects of previous information system models, including their linearity, one-way communication paths, overdependence on scientific…

  15. Knowledges in Context.

    ERIC Educational Resources Information Center

    Wynne, Brian

    1991-01-01

    In everyday life people have to interpret and negotiate scientific knowledge in conjunction with other forms of knowledge. Three levels of public understanding of science are described, including its intellectual contents, its research methods, and its organizational forms of ownership and control. (KR)

  16. Translating Facts into Knowledge

    ERIC Educational Resources Information Center

    Umewaka, Soraya

    2011-01-01

    Many education systems have a tendency to be limiting and rigid. These systems teach children to value facts over knowledge and routine and repetition over playfulness and curiosity to seek knowledge. How can we unleash our children's imagination and permit them to use play and other creative tools as a means of learning? This article proposes new…

  17. Is Knowledge Like Love?

    ERIC Educational Resources Information Center

    Saussois, Jean-Michel

    2014-01-01

    The label "knowledge management" is a source of ambiguity within the education community. In fact, the role played by knowledge within economics has an impact on the education sector, especially on the nature of the teacher's job. Compared to other sectors such as engineering and health, research and development investment is still weak.…

  18. Essays on Knowledge Management

    ERIC Educational Resources Information Center

    Xiao, Wenli

    2012-01-01

    For many firms, particularly those operating in high technology and competitive markets, knowledge is cited as the most important strategic asset to the firm, which significantly drives its survival and success (Grant 1996, Webber 1993). Knowledge management (KM) impacts the firm's ability to develop process features that reduce manufacturing…

  19. Pedagogical Content Knowledge Taxonomies.

    ERIC Educational Resources Information Center

    Veal, William R.; MaKinster, James G.

    1999-01-01

    Presents two taxonomies that offer a relatively comprehensive categorization scheme for future studies of pedagogical content knowledge (PCK) development in teacher education. "The General Taxonomy of PCK" addresses distinctions within and between the knowledge bases of various disciplines, science subjects, and science topics. "The Taxonomy of…

  20. Reuniting Virtue and Knowledge

    ERIC Educational Resources Information Center

    Culham, Tom

    2015-01-01

    Einstein held that intuition is more important than rational inquiry as a source of discovery. Further, he explicitly and implicitly linked the heart, the sacred, devotion, and intuitive knowledge. The raison d'être of universities is the advance of knowledge; however, they have primarily focused on developing students' skills in working with…

  1. Enriching Number Knowledge

    ERIC Educational Resources Information Center

    Mack, Nancy K.

    2011-01-01

    Exploring number systems of other cultures can be an enjoyable learning experience that enriches students' knowledge of numbers and number systems in important ways. It helps students deepen mental computation fluency, knowledge of place value, and equivalent representations for numbers. This article describes how the author designed her…

  2. Cultural Knowledge in Translation.

    ERIC Educational Resources Information Center

    Olk, Harald

    2003-01-01

    Describes a study exploring the influence of cultural knowledge on the translation performance of German students of English. Found that the students often lacked sufficient knowledge about British culture to deal with widely-used cultural concepts. Findings suggest that factual reference sources have an important role to play in translation…

  3. Educating the Knowledge Worker.

    ERIC Educational Resources Information Center

    Leddick, Susan; Gharajedaghi, Jamshid

    2001-01-01

    In the new economy, knowledge (not labor, raw material, or capital) is the key resource to be converted to goods and services. Public schools will have to educate three tiers of knowledge workers (doers, problem solvers, and designers) using differentiated assessment, curricula, and instruction. Organizational action, not mantras, is needed. (MLH)

  4. Constructing Knowledge from Interactions.

    ERIC Educational Resources Information Center

    Lawler, Robert W.

    1990-01-01

    Using case studies that are functionalist in orientation and computational in technique, the role of control knowledge in developing constructive thinking is illustrated. Further, the integration of related knowledge structures, emanating from diverse sensory modes and pertaining to both place value in addition and angle relationships in geometry,…

  5. Adding Confidence to Knowledge

    ERIC Educational Resources Information Center

    Goodson, Ludwika Aniela; Slater, Don; Zubovic, Yvonne

    2015-01-01

    A "knowledge survey" and a formative evaluation process led to major changes in an instructor's course and teaching methods over a 5-year period. Design of the survey incorporated several innovations, including: a) using "confidence survey" rather than "knowledge survey" as the title; b) completing an…

  6. Teaching Knowledge Management (SIG KM).

    ERIC Educational Resources Information Center

    McInerney, Claire

    2000-01-01

    Presents an abstract of a planned session on teaching knowledge management, including knowledge management for information professionals; differences between teaching knowledge management in library schools and in business schools; knowledge practices for small groups; and current research. (LRW)

  7. Intensional reasoning about knowledge

    SciTech Connect

    Popov, O.B.

    1987-01-01

    As demands and ambitions increase in Artificial Intelligence, the need for formal systems that facilitate the study and simulation of machine cognition has become inevitable. This paper explores and develops the foundations of a formal system for propositional reasoning about knowledge. The semantics of every meaningful expression in the system is fully determined by its intension, the set of complexes in which the expression is confirmed. The knowledge system is based on three zeroth-order theories of epistemic reasoning for consciousness, knowledge, and entailed knowledge. The results presented determine the soundness and the completeness of the knowledge system. The modes of reasoning and the relations among the various epistemic notions emphasize the expressive power of the intensional paradigm.

  8. Knowledge-based vision and simple visual machines.

    PubMed Central

    Cliff, D; Noble, J

    1997-01-01

    The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684

  9. Knowledge based programming at KSC

    NASA Technical Reports Server (NTRS)

    Tulley, J. H., Jr.; Delaune, C. I.

    1986-01-01

    Various KSC knowledge-based systems projects are discussed. The objectives of the knowledge-based automatic test equipment and Shuttle connector analysis network projects are described. It is observed that knowledge-based programs must handle factual and expert knowledge; the characteristics of these two types of knowledge are examined. Applications for the knowledge-based programming technique are considered.

  10. Knowledge Convergence and Collaborative Learning

    ERIC Educational Resources Information Center

    Jeong, Heisawn; Chi, Michelene T. H.

    2007-01-01

    This paper operationalized the notion of knowledge convergence and assessed quantitatively how much knowledge convergence occurred during collaborative learning. Knowledge convergence was defined as an increase in common knowledge where common knowledge referred to the knowledge that all collaborating partners had. Twenty pairs of college students…

  11. Gradualness facilitates knowledge refinement.

    PubMed

    Rada, R

    1985-05-01

    To facilitate knowledge refinement, a system should be designed so that small changes in the knowledge correspond to small changes in the function or performance of the system. Two sets of experiments show the value of small, heuristically guided changes in a weighted rule base. In the first set, the ordering among numbers (reflecting certainties) makes their manipulation more straightforward than the manipulation of relationships. A simple credit assignment and weight adjustment strategy for improving numbers in a weighted, rule-based expert system is presented. In the second set, the rearrangement of predicates benefits from additional knowledge about the "ordering" among predicates. A third set of experiments indicates the importance of the proper level of granularity when augmenting a knowledge base. Augmentation of one knowledge base by analogical reasoning from another knowledge base did not work with only binary relationships, but did succeed with ternary relationships. To obtain a small improvement in the knowledge base, a substantial amount of structure had to be treated as a unit. PMID:21869290
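
    To make the weight-adjustment idea concrete, here is a minimal Python sketch of one credit-assignment and weight-adjustment step for a weighted rule base. It illustrates the general strategy the abstract describes, not the paper's actual algorithm; the rule names, data, and learning rate are invented.

        def adjust_weights(rules, fired_rules, correct, lr=0.05):
            """Nudge the certainty weight of each rule that fired.

            rules       -- dict mapping rule name -> weight in [0, 1]
            fired_rules -- names of rules that contributed to the conclusion
            correct     -- whether the system's conclusion was right
            """
            for name in fired_rules:
                delta = lr if correct else -lr          # credit or blame
                rules[name] = min(1.0, max(0.0, rules[name] + delta))
            return rules

        rules = {"r1": 0.6, "r2": 0.4, "r3": 0.8}
        rules = adjust_weights(rules, ["r1", "r3"], correct=False)
        print(rules)  # r1 and r3 lowered slightly; r2 untouched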

  12. Knowledge and luck.

    PubMed

    Turri, John; Buckwalter, Wesley; Blouw, Peter

    2015-04-01

    Nearly all success is due to some mix of ability and luck. But some successes we attribute to the agent's ability, whereas others we attribute to luck. To better understand the criteria distinguishing credit from luck, we conducted a series of four studies on knowledge attributions. Knowledge is an achievement that involves reaching the truth. But many factors affecting the truth are beyond our control, and reaching the truth is often partly due to luck. Which sorts of luck are compatible with knowledge? We found that knowledge attributions are highly sensitive to lucky events that change the explanation for why a belief is true. By contrast, knowledge attributions are surprisingly insensitive to lucky events that threaten, but ultimately fail to change the explanation for why a belief is true. These results shed light on our concept of knowledge, help explain apparent inconsistencies in prior work on knowledge attributions, and constitute progress toward a general understanding of the relation between success and luck. PMID:25005164

  13. Unconscious knowledge: A survey

    PubMed Central

    Augusto, Luís M.

    2011-01-01

    The concept of unconscious knowledge is fundamental for an understanding of human thought processes and mentation in general; however, the psychological community at large is not familiar with it. This paper offers a survey of the main psychological research currently being carried out into cognitive processes, and examines pathways that can be integrated into a discipline of unconscious knowledge. It shows that the field already has a defined history and discusses some of the features that all kinds of unconscious knowledge seem to share at a deeper level. With the aim of promoting further research, we discuss the main challenges which the postulation of unconscious cognition faces within the psychological community. PMID:21814538

  14. Knowledge Management: A Skeptic's Guide

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2006-01-01

    A viewgraph presentation discussing knowledge management is shown. The topics include: 1) What is Knowledge Management? 2) Why Manage Knowledge? The Presenting Problems; 3) What Gets Called Knowledge Management? 4) Attempts to Rethink Assumptions about Knowledge; 5) What is Knowledge? 6) Knowledge Management and Institutional Memory; 7) Knowledge Management and Culture; 8) To solve a social problem, it's easier to call for cultural rather than organizational change; 9) Will the Knowledge Management Effort Succeed? and 10) Backup: Metrics for Valuing Intellectual Capital, i.e., Knowledge.

  15. Test Your Asthma Knowledge

    MedlinePlus

    A true-or-false reader quiz from the Fall 2007 issue. Sample item: "True or False? Asthma is caused by an inflammation of the inner …"

  16. Visualizing Knowledge Domains.

    ERIC Educational Resources Information Center

    Borner, Katy; Chen, Chaomei; Boyack, Kevin W.

    2003-01-01

    Reviews visualization techniques for scientific disciplines and information retrieval and classification. Highlights include historical background of scientometrics, bibliometrics, and citation analysis; map generation; process flow of visualizing knowledge domains; measures and similarity calculations; vector space model; factor analysis;…

  17. The Costs of Knowledge

    NASA Technical Reports Server (NTRS)

    Prusak, Laurence

    2008-01-01

    Acquiring knowledge-genuinely learning something new-requires the consent and commitment of the person you're trying to learn from. In contrast to information, which can usually be effectively transmitted in a document or diagram, knowledge comes from explaining, clarifying, questioning, and sometimes actually working together. Getting this kind of attention and commitment often involves some form of negotiation, since even the most generous person's time and energy are limited. Few experts sit around waiting to share their knowledge with strangers or casual acquaintances. In reasonably collaborative enterprises-I think NASA is one-this sort of negotiation isn't too onerous. People want to help each other and share what they know, so the "cost" of acquiring knowledge is relatively low. In many organizations (and many communities and countries), however, there are considerable costs associated with this activity, and many situations in which negotiations fail. The greatest knowledge cost is in adapting knowledge to one's own use. Sometimes this means formally organizing what one learns in writing. Sometimes it means just taking time to reflect on someone else's thoughts and experiences-thinking about knowledge that is not exactly what you need but can lead you to develop ideas that will be useful. A long, discursive conversation, with all the back-and-forth that defines conversation, can be a mechanism of knowledge exchange. I have seen many participants at NASA APPEL Masters Forums talking, reflecting, and thinking-adapting what they are hearing to their own needs. Knowledge transfer is not a simple proposition. An enormous amount of information flows through the world every day, but knowledge is local, contextual, and "sticky"-that is, it takes real effort to move it from one place to another. There is no way around this. To really learn a subject, you have to work at it, you have to pay your "knowledge dues." So while, thanks to advances in technology…

  18. US Spacesuit Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2011-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lends itself rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives in which videotaping occurs engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive closed-loop spacesuit knowledge capture system which includes…

  19. Hybrid knowledge systems

    NASA Technical Reports Server (NTRS)

    Subrahmanian, V. S.

    1994-01-01

    An architecture called hybrid knowledge system (HKS) is described that can be used to interoperate between a specification of the control laws describing a physical system; a collection of databases, knowledge bases, and/or other data structures reflecting information about the world in which the controlled physical system resides; observations (e.g., sensor information) from the external world; and actions that must be taken in response to external observations.

  20. A priori and a posteriori dietary patterns at the age of 1 year and body composition at the age of 6 years: the Generation R Study.

    PubMed

    Voortman, Trudy; Leermakers, Elisabeth T M; Franco, Oscar H; Jaddoe, Vincent W V; Moll, Henriette A; Hofman, Albert; van den Hooven, Edith H; Kiefte-de Jong, Jessica C

    2016-08-01

    Dietary patterns have been linked to obesity in adults; however, not much is known about this association in early childhood. We examined associations of different types of dietary patterns in 1-year-old children with body composition at school age in 2026 children participating in a population-based cohort study. Dietary intake at the age of 1 year was assessed with a food-frequency questionnaire. At the children's age of 6 years we measured their body composition with dual-energy X-ray absorptiometry and we calculated body mass index, fat mass index (FMI), and fat-free mass index (FFMI). Three dietary pattern approaches were used: (1) an a priori-defined diet quality score; (2) dietary patterns based on variation in food intake, derived from principal-component-analysis (PCA); and (3) dietary patterns based on variations in FMI and FFMI, derived with reduced-rank-regression (RRR). Both the a priori-defined diet score and a 'Health-conscious' PCA-pattern were characterized by a high intake of fruit, vegetables, grains, and vegetable oils, and, after adjustment for confounders, children with higher adherence to these patterns had a higher FFMI at 6 years [0.19 SD (95 % CI 0.08;0.30) per SD increase in diet score], but had no different FMI. One of the two RRR-patterns was also positively associated with FFMI and was characterized by intake of whole grains, pasta and rice, and vegetable oils. Our results suggest that different a priori- and a posteriori-derived health-conscious dietary patterns in early childhood are associated with a higher fat-free mass, but not with fat mass, in later childhood. PMID:27384175
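
    For readers unfamiliar with approach (2), the PCA step can be sketched in a few lines of Python. This is a generic illustration, not the study's code; the intake matrix and food-group names are invented.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        intake = rng.random((200, 5))               # 200 children x 5 food groups
        foods = ["fruit", "vegetables", "grains", "oils", "snacks"]

        z = StandardScaler().fit_transform(intake)  # standardize intakes
        pca = PCA(n_components=2).fit(z)

        # loadings show which foods characterize each derived pattern
        for i, comp in enumerate(pca.components_):
            top = sorted(zip(foods, comp), key=lambda t: -abs(t[1]))[:3]
            print(f"pattern {i + 1}:", top)

        scores = pca.transform(z)                   # per-child pattern scores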

  1. The Roles of Knowledge Professionals for Knowledge Management.

    ERIC Educational Resources Information Center

    Kim, Seonghee

    This paper starts by exploring the definition of knowledge and knowledge management; examples of acquisition, creation, packaging, application, and reuse of knowledge are provided. It then considers the partnership for knowledge management and especially how librarians as knowledge professionals, users, and technology experts can contribute to…

  2. Knowledge elicitation for an operator assistant system in process control tasks

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.

    1988-01-01

    A knowledge-based system (KBS) methodology designed to study human-machine interactions and levels of autonomy in allocation of process control tasks is presented. Users are provided with operation manuals to assist them in normal and abnormal situations. Unfortunately, operation manuals usually represent only the functioning logic of the system to be controlled. The user logic is often totally different. The method focuses on eliciting user logic to refine a KBS shell called an Operator Assistant (OA). If the OA is to help the user, it is necessary to know what level of autonomy gives the optimal performance of the overall man-machine system. For example, for diagnoses that must be carried out carefully by both the user and the OA, interactions are frequent, and processing is mostly sequential. Other diagnoses can be automated, in which case the OA must be able to explain its reasoning at an appropriate level of detail. The OA structure was used to design a working KBS called HORSES (Human Orbital Refueling System Expert System). Protocol analysis of pilots interacting with this system reveals that the a priori analytical knowledge becomes more structured with training and the situation patterns more complex and dynamic. This approach can improve the a priori understanding of human and automatic reasoning.

  3. The Music Educator's Professional Knowledge

    ERIC Educational Resources Information Center

    Jorquera Jaramillo, Maria Cecilia

    2008-01-01

    Professional knowledge in teaching is broadly based on personal knowledge. Hence, it is important to build teachers' development out of their initial knowledge. The idea of a sociogenesis of educational knowledge, teacher knowledge and training models as well as teaching models are the basis of this study. It aims to diagnose the knowledge…

  4. A priori and a posteriori approaches for finding genes of evolutionary interest in non-model species: osmoregulatory genes in the kidney transcriptome of the desert rodent Dipodomys spectabilis (banner-tailed kangaroo rat).

    PubMed

    Marra, Nicholas J; Eo, Soo Hyung; Hale, Matthew C; Waser, Peter M; DeWoody, J Andrew

    2012-12-01

    One common goal in evolutionary biology is the identification of genes underlying adaptive traits of evolutionary interest. Recently next-generation sequencing techniques have greatly facilitated such evolutionary studies in species otherwise depauperate of genomic resources. Kangaroo rats (Dipodomys sp.) serve as exemplars of adaptation in that they inhabit extremely arid environments, yet require no drinking water because of ultra-efficient kidney function and osmoregulation. As a basis for identifying water conservation genes in kangaroo rats, we conducted a priori bioinformatics searches in model rodents (Mus musculus and Rattus norvegicus) to identify candidate genes with known or suspected osmoregulatory function. We then obtained 446,758 reads via 454 pyrosequencing to characterize genes expressed in the kidney of banner-tailed kangaroo rats (Dipodomys spectabilis). We also determined candidates a posteriori by identifying genes that were overexpressed in the kidney. The kangaroo rat sequences revealed nine different a priori candidate genes predicted from our Mus and Rattus searches, as well as 32 a posteriori candidate genes that were overexpressed in kidney. Mutations in two of these genes, Slc12a1 and Slc12a3, cause human renal diseases that result in the inability to concentrate urine. These genes are likely key determinants of physiological water conservation in desert rodents. PMID:22841684

  5. Knowledge management across domains

    NASA Astrophysics Data System (ADS)

    Gilfillan, Lynne G.; Haddock, Gail; Borek, Stan

    2001-02-01

    This paper presents a secure, Internet-enabled, third-wave knowledge management system, TheResearchPlace(TM), that will facilitate a collaborative, strategic approach to analyzing public safety problems and developing interventions to reduce them. TheResearchPlace, currently being developed under Government and private funding for use by the National Cancer Institute, Federal agencies, and the Defense Advanced Research Projects Agency, will augment Geographic Information Systems and analytical tool capabilities by providing a synergistic workspace where teams of multidisciplinary professionals can manage portfolios of existing knowledge resources, locate and create new knowledge resources that are added to portfolios, and collaborate with colleagues to leverage evolving portfolios' capabilities on team missions. TheResearchPlace is currently in use by alpha users at selected federal sites and by the faculty of Howard University.

  6. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  7. Exchanging clinical knowledge via Internet.

    PubMed

    Buchan, I E; Hanka, R

    1997-11-01

    The need for effective and efficient exchange of clinical knowledge is increasing. Paper-based methods for managing clinical knowledge are not meeting the demand for knowledge, and this has undoubtedly contributed to the widely reported failures of clinical guidelines. The Internet affords both opportunities and dangers for clinical knowledge. Systems such as Wax have demonstrated the importance of intuitive structure in the management of knowledge. We report on a new initiative for the global management of clinical knowledge. PMID:9506390

  8. Knowledge-based vision for space station object motion detection, recognition, and tracking

    NASA Technical Reports Server (NTRS)

    Symosek, P.; Panda, D.; Yalamanchili, S.; Wehner, W., III

    1987-01-01

    Computer vision, especially color image analysis and understanding, has much to offer in the area of the automation of Space Station tasks such as construction, satellite servicing, rendezvous and proximity operations, inspection, experiment monitoring, data management and training. Knowledge-based techniques improve the performance of vision algorithms for unstructured environments because of their ability to deal with imprecise a priori information or inaccurately estimated feature data and still produce useful results. Conventional techniques using statistical and purely model-based approaches lack flexibility in dealing with the variabilities anticipated in the unstructured viewing environment of space. Algorithms developed under NASA sponsorship for Space Station applications to demonstrate the value of a hypothesized architecture for a Video Image Processor (VIP) are presented. Approaches to the enhancement of the performance of these algorithms with knowledge-based techniques and the potential for deployment of highly-parallel multi-processor systems for these algorithms are discussed.

  9. Self-emergence of knowledge trees: Extraction of the Wikipedia hierarchies

    NASA Astrophysics Data System (ADS)

    Muchnik, Lev; Itzhack, Royi; Solomon, Sorin; Louzoun, Yoram

    2007-07-01

    The rapid accumulation of knowledge and the recent emergence of new dynamic and practically unmoderated information repositories have rendered the classical concept of the hierarchical knowledge structure irrelevant and impossible to impose manually. This led to modern methods of data location, such as browsing or searching, which conceal the underlying information structure. We here propose methods designed to automatically construct a hierarchy from a network of related terms. We apply these methods to Wikipedia and compare the hierarchy obtained from the article network to the complementary acyclic category layer of the Wikipedia and show an excellent fit. We verify our methods in two networks with no a priori hierarchy (the E. coli genetic regulatory network and the C. elegans neural network) and a network of function libraries of modern computer operating systems that are intrinsically hierarchical and reproduce a known functional order.
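
    One simple way to extract a hierarchy from an undirected term network is to attach each node to its best-connected neighbour so that hubs rise to the top. The Python sketch below illustrates that generic heuristic; it is not necessarily the authors' exact algorithm, and the toy graph is invented.

        import networkx as nx

        g = nx.Graph([("science", "physics"), ("science", "biology"),
                      ("science", "chemistry"), ("physics", "optics"),
                      ("chemistry", "optics"), ("biology", "genetics")])

        parent = {}
        for node in g:
            best = max(g[node], key=g.degree)    # best-connected neighbour
            if g.degree(best) > g.degree(node):  # only point "upward"
                parent[node] = best

        print(parent)  # physics, biology, chemistry -> science; genetics -> biology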

  10. Activating Event Knowledge

    PubMed Central

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or typically play a role in. We used short stimulus onset asynchrony priming to demonstrate that (1) event nouns prime people (sale-shopper) and objects (trip-luggage) commonly found at those events; (2) location nouns prime people/animals (hospital-doctor) and objects (barn-hay) commonly found at those locations; and (3) instrument nouns prime things on which those instruments are commonly used (key-door), but not the types of people who tend to use them (hose-gardener). The priming effects are not due to normative word association. On our account, facilitation results from event knowledge relating primes and targets. This has much in common with computational models like LSA or BEAGLE in which one word primes another if they frequently occur in similar contexts. LSA predicts priming for all six experiments, whereas BEAGLE correctly predicted that priming should not occur for the instrument-people relation but should occur for the other five. We conclude that event-based relations are encoded in semantic memory and computed as part of word meaning, and have a strong influence on language comprehension. PMID:19298961
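
    The contextual-co-occurrence idea behind models like LSA and BEAGLE can be illustrated with a toy sketch: a prime is predicted to facilitate a target when the two words have similar context-count vectors. The vectors below are invented for illustration and are not the models' actual representations.

        import numpy as np

        def cosine(u, v):
            return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

        # rows: words; columns: counts in four hypothetical contexts
        vectors = {
            "sale":    np.array([4.0, 1.0, 0.0, 2.0]),
            "shopper": np.array([3.0, 2.0, 0.0, 1.0]),
            "hose":    np.array([0.0, 1.0, 4.0, 0.0]),
        }

        print(cosine(vectors["sale"], vectors["shopper"]))  # high: priming expected
        print(cosine(vectors["sale"], vectors["hose"]))     # low: little priming expected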

  11. Anishinaabe Star Knowledge.

    ERIC Educational Resources Information Center

    Price, Michael Wassegijig

    2002-01-01

    A connection with nature constitutes the difference between Western science and indigenous perspectives of the natural world. Understanding the synchronicity of natural and astronomical cycles is integral to Anishinaabe cosmology. Examples show how the Anishinaabe cultural worldview and philosophy are reflected in their celestial knowledge and how…

  12. Managing knowledge in neuroscience.

    PubMed

    Crasto, Chiquito J; Shepherd, Gordon M

    2007-01-01

    Processing text from scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText ( http://senselab.med.yale.edu/textmine/neurotext.pl ), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB ( http://senselab.med.yale.edu/senselab/ ), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the Neuroscience literature in a two-step process: each step parses text at different levels of granularity. NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and non-supervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. PMID:18368357

  13. Medical Knowledge Bases.

    ERIC Educational Resources Information Center

    Miller, Randolph A.; Giuse, Nunzia B.

    1991-01-01

    Few commonly available, successful computer-based tools exist in medical informatics. Faculty expertise can be included in computer-based medical information systems. Computers allow dynamic recombination of knowledge to answer questions unanswerable with print textbooks. Such systems can also create stronger ties between academic and clinical…

  14. National Knowledge Commission

    NASA Astrophysics Data System (ADS)

    Pitroda, Sam

    2007-04-01

    India's National Knowledge Commission (NKC), established by the prime minister, is focused on building institutions and infrastructure in education, science and technology, innovation, etc. to meet the challenges of the knowledge economy in the 21st century and increase India's competitive advantage in the global market. India today stands poised to reap the benefits of a rapidly growing economy and a major demographic advantage, with 550 million young people below the age of 25 years, the largest in the world. The NKC is focused on five critical areas of knowledge related to access, concepts, creation, applications, and services. This includes a variety of subject areas such as language, translations, libraries, networks, portals, affirmative action, distance learning, intellectual property, entrepreneurship, applications in agriculture, health, small and medium scale industries, e-governance, etc. One of the keys to this effort is to build a national gigabit broadband network of 500 nodes to connect universities, libraries, laboratories, hospitals, agricultural institutions, etc. to share resources and collaborate on multidisciplinary activities. This presentation will introduce the NKC, discuss methodology, subject areas, and specific recommendations, and outline a plan to build knowledge networks, with specifics on network architecture, applications, and utilities.

  15. Assessing Knowledge of Cultures.

    ERIC Educational Resources Information Center

    Norris, Robert

    The procedures used in a study to determine how well a group of American Indian college students understood their traditional and modern cultures and a college Caucasian culture were explained in this paper. The sample consisted of 111 Indian students enrolled in the University of New Mexico. The students were tested in the areas of knowledge of…

  16. Transforming Data into Knowledge

    ERIC Educational Resources Information Center

    Mills, Lane

    2006-01-01

    School systems can be data rich and information poor if they do not understand and manage their data effectively. The task for school leaders is to put existing data into a format that lends itself to answering questions and improving outcomes for the students. Common barriers to transforming data into knowledge in education settings often include…

  17. Keeping Knowledge in Site

    ERIC Educational Resources Information Center

    Livingstone, David N.

    2010-01-01

    Recent work on the history of education has been registering a "spatial turn" in its historiography. These reflections from a historical geographer working on the spatiality of knowledge enterprises (science in particular) review some recent developments in the field before turning to three themes--landscape agency, geographies of textuality, and…

  18. Is Knowledge Really Power?

    ERIC Educational Resources Information Center

    Gold, Robert S.; Kelly, Miriam A.

    1988-01-01

    There is a vast difference between factual information and a sense of understanding that comes from the organization of knowledge in a way in which it can be used in decision processes. Recognition of interdependencies and interrelatedness leads to understanding and utilization. (JD)

  19. Spectral Bayesian Knowledge Tracing

    ERIC Educational Resources Information Center

    Falakmasir, Mohammad; Yudelson, Michael; Ritter, Steve; Koedinger, Ken

    2015-01-01

    Bayesian Knowledge Tracing (BKT) has been in wide use for modeling student skill acquisition in Intelligent Tutoring Systems (ITS). BKT tracks and updates a student's latent mastery of a skill as a probability distribution over a binary variable. BKT does so by accounting for observed student successes in applying the skill correctly, where success is…
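
    The update the abstract refers to is the standard four-parameter BKT recursion. A minimal Python sketch follows; the parameter values (initial mastery, slip, guess, and transit probabilities) are illustrative, not taken from the paper.

        def bkt_update(p_mastery, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
            """One step of the standard BKT recursion for a single skill."""
            if correct:
                evidence = p_mastery * (1 - p_slip) + (1 - p_mastery) * p_guess
                posterior = p_mastery * (1 - p_slip) / evidence
            else:
                evidence = p_mastery * p_slip + (1 - p_mastery) * (1 - p_guess)
                posterior = p_mastery * p_slip / evidence
            # learning opportunity: chance of transitioning to the mastered state
            return posterior + (1 - posterior) * p_transit

        p = 0.3                    # initial mastery P(L0)
        for obs in [True, True, False, True]:
            p = bkt_update(p, obs)
            print(round(p, 3))     # updated mastery estimate after each response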

  20. Hermeneutics of Integrative Knowledge.

    ERIC Educational Resources Information Center

    Shin, Un-chol

    This paper examines and compares the formation processes and structures of three types of integrative knowledge that in general represent natural sciences, social sciences, and humanities. These three types can be observed, respectively, in the philosophies of Michael Polanyi, Jurgen Habermas, and Paul Ricoeur. These types of integrative knowledge…

  1. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  2. Knowledge-Based Abstracting.

    ERIC Educational Resources Information Center

    Black, William J.

    1990-01-01

    Discussion of automatic abstracting of technical papers focuses on a knowledge-based method that uses two sets of rules. Topics discussed include anaphora; text structure and discourse; abstracting techniques, including the keyword method and the indicator phrase method; and tools for text skimming. (27 references) (LRW)

  3. Knowledge Management as Enterprise

    ERIC Educational Resources Information Center

    Kutay, Cat

    2007-01-01

    Indigenous people have been for a long time deprived of financial benefit from their knowledge. Campaigns around the stolen wages and the "Pay the Rent" campaign highlight this. As does the endemic poverty and economic disenfranchisement experienced by many Indigenous people and communities in Australia. Recent enterprises developed by Indigenous…

  4. Doing Knowledge Management

    ERIC Educational Resources Information Center

    Firestone, Joseph M.; McElroy, Mark W.

    2005-01-01

    Purpose: Knowledge management (KM) as a field has been characterized by great confusion about its conceptual foundations and scope, much to the detriment of assessments of its impact and track record. The purpose of this paper is to contribute toward defining the scope of KM and ending the confusion, by presenting a conceptual framework and set of…

  5. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and to help designers who lack rich experience improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  6. Depth of Teachers' Knowledge: Frameworks for Teachers' Knowledge of Mathematics

    ERIC Educational Resources Information Center

    Holmes, Vicki-Lynn

    2012-01-01

    This article describes seven teacher knowledge frameworks and relates these frameworks to the teaching and assessment of elementary teacher's mathematics knowledge. The frameworks classify teachers' knowledge and provide a vocabulary and common language through which knowledge can be discussed and assessed. These frameworks are categorized into…

  7. Creating Illusions of Knowledge: Learning Errors that Contradict Prior Knowledge

    ERIC Educational Resources Information Center

    Fazio, Lisa K.; Barber, Sarah J.; Rajaram, Suparna; Ornstein, Peter A.; Marsh, Elizabeth J.

    2013-01-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks…

  8. Knowledge From Pictures (KFP)

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Paterra, Frank; Bailin, Sidney

    1993-01-01

    The old maxim goes: 'A picture is worth a thousand words'. The objective of the research reported in this paper is to demonstrate this idea as it relates to the knowledge acquisition process and the automated development of an expert system's rule base. A prototype tool, the Knowledge From Pictures (KFP) tool, has been developed which configures an expert system's rule base by an automated analysis of and reasoning about a 'picture', i.e., a graphical representation of some target system to be supported by the diagnostic capabilities of the expert system under development. This rule base, when refined, could then be used by the expert system for target system monitoring and fault analysis in an operational setting. Most people, when faced with the problem of understanding the behavior of a complicated system, resort to the use of some picture or graphical representation of the system as an aid in thinking about it. This depiction provides a means of helping the individual to visualize the behavior and dynamics of the system under study. An analysis of the picture, augmented with the individual's background information, allows the problem solver to codify knowledge about the system. This knowledge can, in turn, be used to develop computer programs to automatically monitor the system's performance. The approach taken in this research was to mimic this knowledge acquisition paradigm. A prototype tool was developed which provides the user: (1) a mechanism for graphically representing sample system-configurations appropriate for the domain, and (2) a linguistic device for annotating the graphical representation with the behaviors and mutual influences of the components depicted in the graphic. The KFP tool, reasoning from the graphical depiction along with user-supplied annotations of component behaviors and inter-component influences, generates a rule base that could be used in automating the fault detection, isolation, and repair of the system.
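
    The core idea, generating diagnostic rules by reasoning over an annotated influence graph, can be sketched in Python as follows. The components, influence edges, and rule syntax are invented for illustration and are not the KFP tool's actual representation.

        # components and influence edges a user might annotate on the picture
        influences = [("power_supply", "sensor"), ("sensor", "display")]

        # one candidate diagnostic rule per influence edge
        rules = [f"IF {downstream} IS faulty THEN check {upstream}"
                 for upstream, downstream in influences]
        for rule in rules:
            print(rule)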

  9. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-03-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic links between the various knowledge representation methods, a unified knowledge representation model is necessary. According to ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.

  10. Engineering Knowledge for Assistive Living

    NASA Astrophysics Data System (ADS)

    Chen, Liming; Nugent, Chris

    This paper introduces a knowledge based approach to assistive living in smart homes. It proposes a system architecture that makes use of knowledge in the lifecycle of assistive living. The paper describes ontology based knowledge engineering practices and discusses mechanisms for exploiting knowledge for activity recognition and assistance. It presents system implementation and experiments, and discusses initial results.

  11. The Folk Conception of Knowledge

    ERIC Educational Resources Information Center

    Starmans, Christina; Friedman, Ori

    2012-01-01

    How do people decide which claims should be considered mere beliefs and which count as knowledge? Although little is known about how people attribute knowledge to others, philosophical debate about the nature of knowledge may provide a starting point. Traditionally, a belief that is both true and justified was thought to constitute knowledge.…

  12. Kinds of Knowledge in Algebra.

    ERIC Educational Resources Information Center

    Lewis, Clayton

    Solving equations in elementary algebra requires knowledge of the permitted operations, and knowledge of what operation to use at a given point in the solution process. While just these kinds of knowledge would be adequate for an ideal solver, human solvers appear to need and use other kinds of knowledge. First, many errors seem to indicate that…

  13. Sexual Knowledge among Norwegian Adolescents.

    ERIC Educational Resources Information Center

    Kraft, Pal

    1993-01-01

    Studied sexual knowledge among Norwegian adolescents (n=1,855) aged 17-19 years. Found knowledge gaps among adolescents on sexual physiology and anatomy, sexually transmitted diseases, and fecundation/contraception. Level of sexual knowledge was higher among girls than boys and increased with increasing age. Sexual knowledge did not predict…

  14. Knowledge Translation: Implications for Evaluation

    ERIC Educational Resources Information Center

    Davison, Colleen M.

    2009-01-01

    Translation theory originates in the field of applied linguistics and communication. The term knowledge translation has been adopted in health and other fields to refer to the exchange, synthesis, and application of knowledge. The logic model is a circular or iterative loop among various knowledge translation actors (knowledge producers and users)…

  15. Investigating the Knowledge Management Culture

    ERIC Educational Resources Information Center

    Stylianou, Vasso; Savva, Andreas

    2016-01-01

    Knowledge Management (KM) efforts aim at leveraging an organization into a knowledge organization thereby presenting knowledge employees with a very powerful tool; organized valuable knowledge accessible when and where needed in flexible, technologically-enhanced modes. The attainment of this aim, i.e., the transformation into a knowledge…

  16. Knowledge Creation in Constructivist Learning

    ERIC Educational Resources Information Center

    Jaleel, Sajna; Verghis, Alie Molly

    2015-01-01

    In today's competitive global economy characterized by knowledge acquisition, the concept of knowledge management has become increasingly prevalent in academic and business practices. Knowledge creation is an important factor and remains a source of competitive advantage over knowledge management. Constructivism holds that learners learn actively…

  17. The Knowledge Stealing Initiative?

    NASA Technical Reports Server (NTRS)

    Goshorn, Larry

    2005-01-01

    I have the honor of being on the Academy of Program and Project Leadership (APPL) Knowledge Sharing Feedback and Assessment Team (FAA), and as such, I am privileged to receive the feedback written by many of you as attendees of the Project Management (PM) Master's Forums. It is the intent of the FAA Team and APPL leadership to use this feedback as a tool for continuous program improvement. As a retired (sort of) PM in the payload contracting industry, I'm a big supporter of NASA's Knowledge Sharing Initiative (KSI), especially the Master's Forums. I really enjoy participating in them. Unfortunately I had to miss the 8th forum in Pasadena this past Spring, but I did get the feedback package for the Assessment Team work. So here I was, reviewing twelve pages of comments, reflections, learning notes and critiques from attendees of the 8th forum.

  18. Test your troubleshooting knowledge.

    PubMed

    Snyder, E

    2001-01-01

    While troubleshooting and repairing medical instrumentation may be all that BMETs would like to do, it's just too limited in scope to perform the job effectively. Flattened organizations can require greater responsibility for BMETs--and lead to greater ambiguity. Besides electronic troubleshooting skills, mechanical ability, and the knowledge of how medical equipment normally operates, additional skills are required of the BMET to effectively facilitate a repair--such as knowledge of pertinent codes and standards, job safety laws and guidelines, politeness, and empathy for the equipment user. You will notice that many of these relate to interpersonal relations. The ability to interact with fellow health care workers in a non-threatening manner and to have an appreciation for their perspectives are valuable customer service skills--potentially more valuable than being able to do component-level troubleshooting! PMID:11668951

  19. Spatial Knowledge Capture Library

    Energy Science and Technology Software Center (ESTSC)

    2005-05-16

    The Spatial Knowledge Capture Library is a set of algorithms to capture regularities in shapes and trajectories through space and time. We have applied Spatial Knowledge Capture to model the actions of human experts in spatial domains, such as an AWACS Weapons Director task simulation. The library constructs a model to predict the expert's response to sets of changing cues, such as the movements and actions of adversaries on a battlefield. The library includes a highly configurable feature extraction functionality, which supports rapid experimentation to discover causative factors. We use k-medoid clustering to group similar episodes of behavior, and construct a Markov model of system state transitions induced by agents' actions.
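
    The Markov-model component can be illustrated with a short Python sketch that estimates state-transition probabilities from observed episodes of expert behavior. The states and episode data are invented, and this is not the library's actual API.

        from collections import Counter, defaultdict

        # invented episodes of expert behavior (sequences of task states)
        episodes = [
            ["idle", "track", "assign", "idle"],
            ["idle", "track", "track", "assign"],
        ]

        counts = defaultdict(Counter)
        for ep in episodes:
            for s, s_next in zip(ep, ep[1:]):
                counts[s][s_next] += 1

        # normalize transition counts into probabilities
        transitions = {s: {t: n / sum(c.values()) for t, n in c.items()}
                       for s, c in counts.items()}
        print(transitions["track"])  # {'assign': 0.667, 'track': 0.333} (rounded)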

  20. Threads of common knowledge.

    PubMed

    Icamina, P

    1993-04-01

    Indigenous knowledge is examined as it is affected by development and scientific exploration. The indigenous culture of shamanism, which originated in northern and southeast Asia, is a "political and religious technique for managing societies through rituals, myths, and world views." There is respect for the natural environment and community life as a social common good. This world view is still practiced by many in Latin America and in Colombia specifically. Colombian shamanism has an environmental accounting system, but the Brazilian government has established its own system of land tenure and political representation which does not adequately represent shamanism. In 1992 a conference was held in the Philippines by the International Institute for Rural Reconstruction and IDRC on sustainable development and indigenous knowledge. The link between the two is necessary. Unfortunately, there are already examples in the Philippines of loss of traditional crop diversity after the introduction of modern farming techniques and new crop varieties. An attempt was made to collect species, but without proper identification. Opposition was expressed to the preservation of wilderness preserves; the desire was to allow indigenous people to maintain their homeland and use their time-tested sustainable resource management strategies. Property rights were also discussed during the conference. Of particular concern was the protection of knowledge rights about biological diversity or pharmaceutical properties of indigenous plant species. The original owners and keepers of the knowledge must retain access and control. The research gaps were identified and found to be expansive. Reference was made to a study of Mexican Indian children who knew 138 plant species while non-Indian children knew only 37. Sometimes there is conflict of interest where foresters prefer timber forests and farmers desire fuelwood supplies and fodder and grazing land, which is provided by shrubland. Information

  1. Knowledge Translation in Audiology

    PubMed Central

    Kothari, Anita; Bagatto, Marlene P.; Seewald, Richard; Miller, Linda T.; Scollie, Susan D.

    2011-01-01

    The impetus for evidence-based practice (EBP) has grown out of widespread concern with the quality, effectiveness (including cost-effectiveness), and efficiency of medical care received by the public. Although initially focused on medicine, EBP principles have been adopted by many of the health care professions and are often represented in practice through the development and use of clinical practice guidelines (CPGs). Audiology has been working on incorporating EBP principles into its mandate for professional practice since the mid-1990s. Despite widespread efforts to implement EBP and guidelines into audiology practice, gaps still exist between the best evidence based on research and what is being done in clinical practice. A collaborative dynamic and iterative integrated knowledge translation (KT) framework rather than a researcher-driven hierarchical approach to EBP and the development of CPGs has been shown to reduce the knowledge-to-clinical action gaps. This article provides a brief overview of EBP and CPGs, including a discussion of the barriers to implementing CPGs into clinical practice. It then offers a discussion of how an integrated KT process combined with a community of practice (CoP) might facilitate the development and dissemination of evidence for clinical audiology practice. Finally, a project that uses the knowledge-to-action (KTA) framework for the development of outcome measures in pediatric audiology is introduced. PMID:22194314

  2. Knowledge and question asking.

    PubMed

    Ibáñez Molinero, Rafael; García-Madruga, Juan Antonio

    2011-02-01

    The ability and the motivation for question asking are, or should be, some of the most important aims of education. Unfortunately, students neither ask many questions, nor good ones. The present paper is about the capacity of secondary school pupils for asking questions and how this activity depends on prior knowledge. To examine this, we use texts containing different levels of information about a specific topic: biodiversity. We found a positive relationship between the amount of information provided and the number of questions asked about the texts, supporting the idea that more knowledgeable people ask more questions. Some students were warned that there would be an exam after the reading, and this led to a diminishing number of questions asked, and yet this still did not significantly improve their exam scores. In such a case, it seems that reading was more concerned with immediacy, hindering critical thinking and the dialog between their previous ideas and the new information. Thus, question asking seems to be influenced not only by the amount of knowledge, but also by the reader's attitude towards the information. PMID:21266138

  3. Creating illusions of knowledge: learning errors that contradict prior knowledge.

    PubMed

    Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J

    2013-02-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors. PMID:22612770

  4. Western Hemisphere Knowledge Partnerships

    NASA Astrophysics Data System (ADS)

    Malone, T. F.

    2001-05-01

    Society in general, and geophysicists in particular, are challenged by problems and opportunities in the prospects for an additional three billion people on finite planet Earth by 2050 in a global economy four to six times larger than it is at present. A problem was identified by the Pilot Assessment of Global Ecosystems (PAGE): "If we choose to continue our current patterns of use, we face almost certain decline in the ability of ecosystems to yield their broad spectrum of benefits - from clean water to stable climate, fuel wood to food crops, timber to wildlife habitat." This is the issue of environmental sustainability. Another problem is the widening gap in wealth and health between affluent nations and impoverished countries. Every day each of the more than a billion people in the industrial nations produces goods and services worth nearly 60 dollars to meet their basic needs and "wants." This figure increases by about 85 cents annually. Every day each of the 600 million people in the least developed countries produces goods and services worth about 75 cents to meet their basic needs and limited wants. That number grows by less that a penny a day annually. This is the issue of economic prosperity and equity. By harnessing revolutionary technologies in communications to distribute expanding knowledge in the physical, chemical, and geophysical sciences and exploding knowledge in the biological and health sciences, a new vision for world society is brought within reach in The Knowledge Age. It is a society in which all of the basic human needs and an equitable share of human wants can be met while maintaining healthy, attractive, and biologically productive ecosystems. This society is environmentally sustainable, economically prosperous and equitable, and therefore likely to be politically stable. The time has arrived to fashion a strategy to pursue that vision. A knowledge-based and human-centered strategy will involve the discovery, integration, dissemination

  5. Procedural and Conceptual Knowledge: Exploring the Gap between Knowledge Type and Knowledge Quality

    ERIC Educational Resources Information Center

    Star, Jon R.; Stylianides, Gabriel J.

    2013-01-01

    Following Star (2005, 2007), we continue to problematize the entangling of type and quality in the use of conceptual knowledge and procedural knowledge. Although those whose work is guided by types of knowledge and those whose work is guided by qualities of knowledge seem to be referring to the same phenomena, actually they are not. This lack of…

  6. Distinguishing Knowledge-Sharing, Knowledge-Construction, and Knowledge-Creation Discourses

    ERIC Educational Resources Information Center

    van Aalst, Jan

    2009-01-01

    The study reported here sought to obtain the clear articulation of asynchronous computer-mediated discourse needed for Carl Bereiter and Marlene Scardamalia's knowledge-creation model. Distinctions were set up between three modes of discourse: knowledge sharing, knowledge construction, and knowledge creation. These were applied to the asynchronous…

  7. From knowledge presentation to knowledge representation to knowledge construction: Future directions for hypermedia

    NASA Technical Reports Server (NTRS)

    Palumbo, David B.

    1990-01-01

    Relationships between human memory systems and hypermedia systems are discussed, with particular emphasis on the underlying importance of associational memory. The distinctions between knowledge presentation, knowledge representation, and knowledge construction are addressed. Issues involved in actually developing individualizable, hypermedia-based knowledge construction tools are presented.

  8. Knowledge Integration to Make Decisions About Complex Systems: Sustainability of Energy Production from Agriculture

    ScienceCinema

    Danuso, Francesco [University of Udine, Italy]

    2010-01-08

    A major bottleneck in improving the governance of complex systems is our ability to integrate different forms of knowledge into a decision support system (DSS). Preliminary aspects are the classification of the different types of knowledge (a priori or general, a posteriori or specific, with uncertainty, numerical, textual, algorithmic, complete/incomplete, etc.), the definition of ontologies for knowledge management, and the availability of proper tools such as continuous simulation models, event-driven models, statistical approaches, computational methods (neural networks, evolutionary optimization, rule-based systems, etc.), and procedures for textual documentation. Following these views, a computer language for knowledge integration, SEMoLa (Simple, Easy Modelling Language), has been developed at the University of Udine. SEMoLa can handle models, data, metadata, and textual knowledge; it implements and extends the system dynamics ontology (Forrester, 1968; Jørgensen, 1994), in which systems are modelled by the concepts of material, group, state, rate, parameter, internal and external events, and driving variables. As an example, a SEMoLa model to improve the management and sustainability (economic, energetic, environmental) of agricultural farms is presented. The model (X-Farm) simulates a farm in which cereal and forage yields, oil seeds, milk, calves, and wastes can be sold or reused. X-Farm is composed of integrated modules describing fields (crop and soil), feed and material storage, machinery management, manpower management, animal husbandry, economic and energetic balances, seed-oil extraction, manure and waste management, and biogas production from animal wastes and biomasses.
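
    SEMoLa syntax itself is not reproduced in the abstract, but the state/rate ontology it extends can be sketched in a few lines of Python. The snippet below is a hypothetical, simplified stand-in for one X-Farm-like balance (forage storage); every name and number is an assumption for illustration, not part of SEMoLa or X-Farm.

        # Minimal sketch of the Forrester-style state/rate ontology:
        # a state variable updated by inflow and outflow rates, with
        # parameters and a driving variable. Hypothetical values throughout.
        dt = 1.0                 # time step (days)
        forage = 100.0           # state: forage in storage (tonnes)
        harvest_rate = 2.0       # driving variable: tonnes/day entering storage
        feed_per_head = 0.05     # parameter: tonnes/day eaten per animal
        herd_size = 30           # parameter: number of animals

        for day in range(180):                    # simulate one season
            inflow = harvest_rate                 # rate into the state
            outflow = feed_per_head * herd_size   # rate out of the state
            forage = max(forage + (inflow - outflow) * dt, 0.0)  # Euler update

        print(f"Forage remaining after 180 days: {forage:.1f} t")

    A full farm model chains many such balances (feed, manure, energy, cash) and adds event handling, which is what the SEMoLa ontology's event and material concepts provide.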

  9. Knowledge Integration to Make Decisions About Complex Systems: Sustainability of Energy Production from Agriculture

    SciTech Connect

    Danuso, Francesco

    2008-06-18

    A major bottleneck in improving the governance of complex systems is our ability to integrate different forms of knowledge into a decision support system (DSS). Preliminary aspects are the classification of the different types of knowledge (a priori or general, a posteriori or specific, with uncertainty, numerical, textual, algorithmic, complete/incomplete, etc.), the definition of ontologies for knowledge management, and the availability of proper tools such as continuous simulation models, event-driven models, statistical approaches, computational methods (neural networks, evolutionary optimization, rule-based systems, etc.), and procedures for textual documentation. Following these views, a computer language for knowledge integration, SEMoLa (Simple, Easy Modelling Language), has been developed at the University of Udine. SEMoLa can handle models, data, metadata, and textual knowledge; it implements and extends the system dynamics ontology (Forrester, 1968; Jørgensen, 1994), in which systems are modelled by the concepts of material, group, state, rate, parameter, internal and external events, and driving variables. As an example, a SEMoLa model to improve the management and sustainability (economic, energetic, environmental) of agricultural farms is presented. The model (X-Farm) simulates a farm in which cereal and forage yields, oil seeds, milk, calves, and wastes can be sold or reused. X-Farm is composed of integrated modules describing fields (crop and soil), feed and material storage, machinery management, manpower management, animal husbandry, economic and energetic balances, seed-oil extraction, manure and waste management, and biogas production from animal wastes and biomasses.