Science.gov

Sample records for information theory-based methods

  1. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

    2016-06-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
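
A minimal sketch of the maximum-information, minimum-redundancy idea: greedily pick the station with the highest entropy, then keep adding the station whose entropy most exceeds its redundancy (mutual information) with the stations already chosen. The station names and toy quantized water-level series below are invented; this is an illustrative reconstruction, not the authors' exact algorithm.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def mutual_info(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def greedy_select(series, k):
    """Pick k series: maximize marginal entropy, penalize redundancy
    (the largest mutual information with any already-selected series)."""
    selected = []
    remaining = dict(series)
    while remaining and len(selected) < k:
        def score(name):
            h = entropy(remaining[name])
            red = max((mutual_info(remaining[name], series[s])
                       for s in selected), default=0.0)
            return h - red
        best = max(remaining, key=score)
        selected.append(best)
        del remaining[best]
    return selected

# Toy quantized water-level records at four candidate cross sections.
series = {
    "xs1": [0, 1, 1, 2, 2, 3, 3, 2],
    "xs2": [0, 1, 1, 2, 2, 3, 3, 2],   # redundant clone of xs1
    "xs3": [3, 2, 1, 0, 3, 2, 1, 0],
    "xs4": [0, 0, 1, 1, 0, 0, 1, 1],
}
picked = greedy_select(series, 2)
```

The redundant clone xs2 is never picked: once xs1 (or xs3) is in the subset, its score collapses to near zero.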

  2. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    SciTech Connect

    Ridolfi, E.; Napolitano, F.; Alfonso, L.; Di Baldassarre, G.

    2016-06-08

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers’ cross-sectional spacing.

  3. A reexamination of information theory-based methods for DNA-binding site identification

    PubMed Central

    Erill, Ivan; O'Neill, Michael C

    2009-01-01

    Background Searching for transcription factor binding sites in genome sequences is still an open problem in bioinformatics. Despite substantial progress, search methods based on information theory remain a standard in the field, even though the full validity of their underlying assumptions has only been tested in artificial settings. Here we use newly available data on transcription factors from different bacterial genomes to make a more thorough assessment of information theory-based search methods. Results Our results reveal that conventional benchmarking against artificial sequence data leads frequently to overestimation of search efficiency. In addition, we find that sequence information by itself is often inadequate and therefore must be complemented by other cues, such as curvature, in real genomes. Furthermore, results on skewed genomes show that methods integrating skew information, such as Relative Entropy, are not effective because their assumptions may not hold in real genomes. The evidence suggests that binding sites tend to evolve towards genomic skew, rather than against it, and to maintain their information content through increased conservation. Based on these results, we identify several misconceptions on information theory as applied to binding sites, such as negative entropy, and we propose a revised paradigm to explain the observed results. Conclusion We conclude that, among information theory-based methods, the most unassuming search methods perform, on average, better than any other alternatives, since heuristic corrections to these methods are prone to fail when working on real data. A reexamination of information content in binding sites reveals that information content is a compound measure of search and binding affinity requirements, a fact that has important repercussions for our understanding of binding site evolution. PMID:19210776
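
The information content these search methods rest on can be computed directly from a set of aligned binding sites: at each position the information is the 2-bit DNA maximum minus the column's Shannon entropy. The sketch below uses the standard uniform-background form and omits the small-sample correction; the toy sites are invented, not data from the paper.

```python
import math
from collections import Counter

def information_content(sites):
    """Total information (bits) of aligned DNA sites:
    sum over positions of (2 - H(column)), uniform background,
    small-sample correction omitted for clarity."""
    length = len(sites[0])
    total = 0.0
    for i in range(length):
        col = [s[i] for s in sites]
        n = len(col)
        h = -sum(c / n * math.log2(c / n) for c in Counter(col).values())
        total += 2.0 - h   # 2 bits is the maximum for a 4-letter alphabet
    return total

sites = ["TATAAT", "TATAAT", "TACAAT", "TATGAT"]  # toy -10-box-like sites
ic = information_content(sites)
```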

  4. A fuzzy-theory-based method for studying the effect of information transmission on nonlinear crowd dispersion dynamics

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lo, Siuming

    2017-01-01

    Emergencies involved in mass events are related to a variety of factors and processes. An important factor is the transmission of information on danger that has an influence on nonlinear crowd dynamics during the process of crowd dispersion. Due to much uncertainty in this process, there is an urgent need to propose a method to investigate the influence. In this paper, a novel fuzzy-theory-based method is presented to study crowd dynamics under the influence of information transmission. Fuzzy functions and rules are designed for the ambiguous description of human states. Reasonable inference is employed to decide the output values of decision making such as pedestrian movement speed and directions. Through simulation under four-way pedestrian situations, good crowd dispersion phenomena are achieved. Simulation results under different conditions demonstrate that information transmission cannot always induce successful crowd dispersion in all situations. This depends on whether decision strategies in response to information on danger are unified and effective, especially in dense crowds. Results also suggest that an increase in drift strength at low density and the percentage of pedestrians, who choose one of the furthest unoccupied Von Neumann neighbors from the dangerous source as the drift direction at high density, is helpful in crowd dispersion. Compared with previous work, our comprehensive study improves an in-depth understanding of nonlinear crowd dynamics under the effect of information on danger.

  5. Managing for resilience: an information theory-based ...

    EPA Pesticide Factsheets

    Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple models, which have limited utility when evaluating real ecosystems, particularly because drivers are often unknown. We discuss some common univariate and multivariate approaches for detecting critical transitions in ecosystems and demonstrate their capabilities via case studies. Synthesis and applications. We illustrate the utility of an information theory-based index for assessing ecosystem dynamics. Trends in this index also provide a sentinel of both abrupt and gradual transitions in ecosystems. In response to the need to identify leading indicators of regime shifts in ecosystems, our research compares traditional indicators and Fisher information, an information theory-based method, by examining four case study systems. The results demonstrate the utility of these methods and offer great promise for quantifying and managing for resilience.
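
As a rough illustration of why Fisher information can flag regime shifts, the sketch below uses a simple histogram-based proxy (the location-parameter form of Fisher information, not the exact index computed in the study): sharply organized system states score high, diffuse ones low, so a falling trend signals loss of order. All data are synthetic.

```python
import math

def fisher_information(samples, bins=8, lo=0.0, hi=1.0):
    """Histogram proxy for Fisher information of a sample distribution:
    I ~ sum_i (p[i+1] - p[i])^2 / p[i].  Sharply peaked (orderly)
    states score high; diffuse states score low."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in samples:
        k = min(bins - 1, max(0, int((x - lo) / width)))
        counts[k] += 1
    n = len(samples)
    p = [c / n for c in counts]
    return sum((p[i + 1] - p[i]) ** 2 / p[i]
               for i in range(bins - 1) if p[i] > 0)

# Synthetic regimes: a tight oscillation around 0.5 vs. a spread-out state.
orderly = [0.5 + 0.01 * math.sin(i) for i in range(200)]
diffuse = [(i * 37 % 200) / 200 for i in range(200)]
```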

  6. Novel information theory based method for superimposition of lateral head radiographs and cone beam computed tomography images

    PubMed Central

    Jacquet, W; Nyssen, E; Bottenberg, P; de Groen, P; Vande Vannet, B

    2010-01-01

    Objectives The aim was to introduce a novel alignment criterion, focus mutual information (FMI), for the superimposition of lateral cephalometric radiographs and three-dimensional (3D) cone beam computed tomography (CBCT) images, as well as to assess the alignment characteristics of the new method and compare the novel methodology with the region of interest (ROI) approach. Methods Implementation of an FMI criterion-based methodology that only requires the approximate indication of stable structures in one single image. The robustness of the method was first addressed in a phantom experiment comparing the new technique with a ROI approach. Two consecutive cephalometric radiographs were then obtained, one before and one after functional twin block application. These images were then superimposed using alignment by FMI, focusing in turn on: (1) the cranial base and acoustic meatus, (2) the palatal plane and (3) the mandibular symphysis. The superimposed images were subtracted and coloured. The applicability to CBCT is illustrated by the alignment of CBCT images acquired before and after craniofacial surgery. Results The phantom experiment clearly shows superior alignment when compared to the ROI approach (Wilcoxon n = 17, Z = −3.290, and P = 0.001), and robustness with respect to the choice of parameters (one-sample t-test n = 50, t = −12.355, and P = 0.000). The treatment effects are revealed clearly in the subtraction image of well-aligned cephalometric radiographs. The colouring scheme of the subtraction image emphasises the areas of change and visualizes the remodelling of the soft tissue. Conclusions FMI allows for cephalometry without tracing; it avoids the error inherent in the use of landmarks, and practitioner interaction is kept to a minimum. The robustness to focal distribution variations limits the influence of possible examiner inaccuracy. PMID:20395459

  7. Information theory based approaches to cellular signaling.

    PubMed

    Waltermann, Christian; Klipp, Edda

    2011-10-01

    Cells interact with their environment and have to react adequately to internal and external changes, such as changes in nutrient composition, physical properties like temperature or osmolarity, and other stresses. More specifically, they must be able to evaluate whether the external change is significant or just in the range of noise. Based on multiple external parameters they have to compute an optimal response. Cellular signaling pathways are considered the major means of information perception and transmission in cells. Here, we review different attempts to quantify information processing on the level of individual cells. We refer to Shannon entropy, mutual information, and informal measures of signaling pathway cross-talk and specificity. Information theory in systems biology has been successfully applied to the identification of optimal pathway structures, to mutual information and entropy as system responses in sensitivity analysis, and to the quantification of input and output information. While the study of information transmission within the framework of information theory in technical systems is an advanced field with high impact in engineering and telecommunication, its application to biological objects and processes is still restricted to specific fields such as neuroscience and structural and molecular biology. However, in systems biology, which aims at a holistic understanding of biochemical systems and cellular signaling, examples of the application of information theory have emerged only recently. This article is part of a Special Issue entitled Systems Biology of Microorganisms. Copyright © 2011 Elsevier B.V. All rights reserved.
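
Mutual information, one of the measures reviewed here, quantifies how many bits of the stimulus survive transmission through a pathway. A minimal estimate from paired (stimulus, response) observations, on invented toy channels rather than biological data:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(S;R) in bits, estimated from (stimulus, response) samples."""
    n = len(pairs)
    def H(xs):
        return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())
    stim, resp = zip(*pairs)
    return H(stim) + H(resp) - H(pairs)

# A reliable channel transmits the full 1 bit of a binary stimulus;
# a channel whose response is independent of the stimulus transmits none.
reliable = [(0, 0), (1, 1)] * 50
useless  = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25
```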

  8. Evaluating hydrological model performance using information theory-based metrics

    USDA-ARS?s Scientific Manuscript database

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic m...

  9. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks such as dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
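
Supervised rank-based selection with information gain, one of the measures IMMAN incorporates, can be sketched as follows. This is an illustrative reimplementation on invented discretized features, not IMMAN's code:

```python
import math
from collections import Counter

def H(xs):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def information_gain(feature, labels):
    """IG(Y;X) = H(Y) - H(Y|X) for a discretized feature X and class Y."""
    n = len(labels)
    by_value = {}
    for f, y in zip(feature, labels):
        by_value.setdefault(f, []).append(y)
    h_cond = sum(len(ys) / n * H(ys) for ys in by_value.values())
    return H(labels) - h_cond

labels   = [0, 0, 0, 0, 1, 1, 1, 1]
features = {
    "relevant": [0, 0, 0, 0, 1, 1, 1, 1],  # predicts the class perfectly
    "noisy":    [0, 1, 0, 1, 0, 1, 0, 1],  # carries no class information
}
ranking = sorted(features,
                 key=lambda f: -information_gain(features[f], labels))
```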

  10. Evaluation of the Performance of Information Theory-Based Methods and Cross-Correlation to Estimate the Functional Connectivity in Cortical Networks

    PubMed Central

    Garofalo, Matteo; Nieus, Thierry; Massobrio, Paolo; Martinoia, Sergio

    2009-01-01

    Functional connectivity of in vitro neuronal networks was estimated by applying different statistical algorithms to data collected by Micro-Electrode Arrays (MEAs). First, we tested these “connectivity methods” on neuronal network models of increasing complexity and evaluated their performance in terms of ROC (Receiver Operating Characteristic) and PPC (Positive Precision Curve), a newly defined complementary method specifically developed for the identification of functional links. Then, the algorithms that best estimated the actual connectivity of the network models were used to extract functional connectivity from cultured cortical networks coupled to MEAs. Among the proposed approaches, Transfer Entropy and Joint-Entropy showed the best results, suggesting these methods as good candidates for extracting functional links in actual neuronal networks from multi-site recordings. PMID:19652720
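
Transfer entropy, the best-performing method here, measures how much the past of one spike train improves prediction of another beyond that train's own past. A history-length-1 sketch on toy binary trains (an illustrative implementation, not the authors'):

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """TE(X->Y), history length 1, in bits:
    sum over (y1, y0, x0) of p(y1,y0,x0) log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz  = Counter((y0, x0) for _, y0, x0 in triples)
    c_yy  = Counter((y1, y0) for y1, y0, _ in triples)
    c_y   = Counter(y0 for _, y0, _ in triples)
    te = 0.0
    for (y1, y0, x0), c in c_xyz.items():
        p = c / n
        cond_joint = c / c_yz[(y0, x0)]           # p(y1 | y0, x0)
        cond_marg  = c_yy[(y1, y0)] / c_y[y0]     # p(y1 | y0)
        te += p * math.log2(cond_joint / cond_marg)
    return te

# Toy pair: y copies x with a one-step delay, so information flows x -> y.
random.seed(1)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
```

The asymmetry is the point: TE(x→y) is close to 1 bit, while TE(y→x) is close to zero, which is how the method recovers directed functional links.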

  11. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.

  12. Trends in information theory-based chemical structure codification.

    PubMed

    Barigye, Stephen J; Marrero-Ponce, Yovani; Pérez-Giménez, Facundo; Bonchev, Danail

    2014-08-01

    This report offers a chronological review of the most relevant applications of information theory in the codification of chemical structure information, through the so-called information indices. Basically, these are derived from the analysis of the statistical patterns of molecular structure representations, which include primitive global chemical formulae, chemical graphs, or matrix representations. Finally, new approaches that attempt to go "back to the roots" of information theory, in order to integrate other information-theoretic measures in chemical structure coding are discussed.
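
A classical example of such an information index is the Shannon entropy of the partition of a molecular graph's vertices into equivalence classes, here taken by vertex degree. The hydrogen-suppressed toy graphs and the specific index choice are illustrative, not taken from the review:

```python
import math
from collections import Counter

def degree_information_index(adjacency):
    """Mean information per vertex (bits) when vertices are partitioned
    into equivalence classes by degree: I = -sum_k (n_k/n) log2(n_k/n)."""
    degrees = [len(nbrs) for nbrs in adjacency]
    n = len(degrees)
    return -sum(c / n * math.log2(c / n) for c in Counter(degrees).values())

# Toy hydrogen-suppressed skeletons: the n-butane chain vs. the isobutane star.
butane    = [[1], [0, 2], [1, 3], [2]]     # degrees 1, 2, 2, 1
isobutane = [[1], [0, 2, 3], [1], [1]]     # degrees 1, 3, 1, 1
```

The two isomers share a chemical formula but get different index values, which is exactly what makes such measures usable as molecular descriptors.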

  13. Kinetic theory based new upwind methods for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, S. M.

    1986-01-01

    Two new upwind methods called the Kinetic Numerical Method (KNM) and the Kinetic Flux Vector Splitting (KFVS) method for the solution of the Euler equations are presented. Both of these methods can be regarded as suitable moments of an upwind scheme for the solution of the Boltzmann equation, provided the distribution function is Maxwellian. This moment-method strategy leads to a unification of the Riemann approach and the pseudo-particle approach used earlier in the development of upwind methods for the Euler equations. A very important aspect of the moment-method strategy is that the new upwind methods satisfy the entropy condition because of the Boltzmann H-Theorem, and it suggests a possible way of extending the Total Variation Diminishing (TVD) principle within the framework of the H-Theorem. The ability of these methods to obtain accurate, wiggle-free solutions is demonstrated by applying them to two test problems.
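
The flux-splitting idea behind these schemes can be illustrated on the scalar advection equation, where splitting the wave speed into right- and left-running parts yields the first-order upwind scheme. This is a much simplified scalar analogue of flux vector splitting, not KNM or KFVS for the Euler equations:

```python
def upwind_step(u, a, dt, dx):
    """One step of first-order flux-split upwinding for u_t + a u_x = 0:
    a = a+ + a-, with each part differenced from its upwind side."""
    ap = 0.5 * (a + abs(a))   # right-running part
    am = 0.5 * (a - abs(a))   # left-running part
    n = len(u)
    new = u[:]
    for i in range(n):
        new[i] = (u[i]
                  - dt / dx * ap * (u[i] - u[i - 1])           # backward diff
                  - dt / dx * am * (u[(i + 1) % n] - u[i]))    # forward diff
    return new

# Advect a top-hat profile one full period on a periodic grid.
n, a = 100, 1.0
dx = 1.0 / n
dt = 0.5 * dx / abs(a)                  # CFL = 0.5: stable and monotone
u = [1.0 if 20 <= i < 40 else 0.0 for i in range(n)]
for _ in range(200):                    # 200 * dt = one full period
    u = upwind_step(u, a, dt, dx)
```

The scheme is diffusive but monotone: the profile smears yet never overshoots, the scalar counterpart of the "wiggle-free" behavior claimed above.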

  14. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A. E-mail: ruth.pachter@wpafb.af.mil; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAM-B3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAM-B3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.
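
The leading (charge-charge) term of such a distributed-multipole electrostatic potential reduces to pairwise Coulomb sums over the expansion points. A minimal sketch with invented point-charge fragments; the full EFP2 potential also carries dipoles, quadrupoles, and octopoles at each expansion point, plus the other energy terms:

```python
import math

def coulomb_energy(frag_a, frag_b):
    """Monopole (charge-charge) term of a distributed-multipole
    interfragment electrostatic interaction.  Charges in atomic units,
    distances in bohr, so the energy comes out in hartree."""
    e = 0.0
    for qa, ra in frag_a:
        for qb, rb in frag_b:
            e += qa * qb / math.dist(ra, rb)
    return e

# Hypothetical point-charge models of two polar diatomics placed
# head-to-tail along z, so opposite partial charges face each other.
frag1 = [(+0.3, (0.0, 0.0, 0.0)), (-0.3, (0.0, 0.0, 2.0))]
frag2 = [(+0.3, (0.0, 0.0, 5.0)), (-0.3, (0.0, 0.0, 7.0))]
energy = coulomb_energy(frag1, frag2)
```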

  15. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    PubMed

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2016-11-28

    Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes.

  16. An information theory based search for homogeneity on the largest accessible scale

    NASA Astrophysics Data System (ADS)

    Sarkar, Suman; Pandey, Biswajit

    2016-11-01

    We analyse the Sloan Digital Sky Survey Data Release 12 quasar catalogue to test the large-scale smoothness of the quasar distribution. We quantify the degree of inhomogeneity in the quasar distribution using information theory-based measures and find that the degree of inhomogeneity diminishes with increasing length scale, finally reaching a plateau at ~250 h⁻¹ Mpc. The residual inhomogeneity at the plateau is consistent with that expected for a Poisson point process. Our results indicate that the quasar distribution is homogeneous beyond length scales of 250 h⁻¹ Mpc.
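
One simple information theory based inhomogeneity measure is the deficit of the counts-in-cells Shannon entropy relative to its maximum: zero for a perfectly even partition, large when points are clustered. The 2D toy below is only a sketch of that idea; the actual analysis works on the 3D quasar distribution and its own definitions:

```python
import math
import random
from collections import Counter

def inhomogeneity(points, cell, size=1.0):
    """Entropy deficit of counts-in-cells on a size x size box:
    log2(#cells) - H(cell occupation).  0 = perfectly even."""
    m = int(size / cell)
    counts = Counter((int(x / cell), int(y / cell)) for x, y in points)
    n = len(points)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return math.log2(m * m) - h

random.seed(7)
uniform   = [(random.random(), random.random()) for _ in range(4000)]
clustered = [(0.1 * random.random(), 0.1 * random.random()) for _ in range(4000)]
smooth = inhomogeneity(uniform, 0.25)
clumpy = inhomogeneity(clustered, 0.25)
```

Probing the same point set with ever coarser cells mimics the abstract's trend: the deficit shrinks toward the Poisson floor as the length scale grows.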

  17. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  19. A new theory-based social classification in Japan and its validation using historically collected information.

    PubMed

    Hiyoshi, Ayako; Fukuda, Yoshiharu; Shipley, Martin J; Bartley, Mel; Brunner, Eric J

    2013-06-01

    Studies of health inequalities in Japan have increased since the millennium. However, there remains a lack of an accepted theory-based classification to measure occupation-related social position for Japan. This study attempts to derive such a classification based on the National Statistics Socio-economic Classification in the UK. Using routinely collected data from the nationally representative Comprehensive Survey of the Living Conditions of People on Health and Welfare, the Japanese Socioeconomic Classification was derived using two variables - occupational group and employment status. Validation analyses were conducted using household income, home ownership, self-rated good or poor health, and Kessler 6 psychological distress (n ≈ 36,000). After adjustment for age, marital status, and area (prefecture), one step lower social class was associated with a mean 16% (p < 0.001) lower income, and a risk ratio of 0.93 (p < 0.001) for home ownership. The probability of good health showed a trend in men and women (risk ratio 0.94 and 0.93, respectively, for one step lower social class, p < 0.001). The trend for poor health was significant in women (odds ratio 1.12, p < 0.001) but not in men. Kessler 6 psychological distress showed significant trends in men (risk ratio 1.03, p = 0.044) and in women (1.05, p = 0.004). We propose the Japanese Socioeconomic Classification, derived from basic occupational and employment status information, as a meaningful, theory-based and standard classification system suitable for monitoring occupation-related health inequalities in Japan.

  20. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  2. Genome-wide prediction, display and refinement of binding sites with information theory-based models

    PubMed Central

    Gadiraju, Sashidhar; Vyhlidal, Carrie A; Leeder, J Steven; Rogan, Peter K

    2003-01-01

    Background We present Delila-genome, a software system for identification, visualization and analysis of protein binding sites in complete genome sequences. Binding sites are predicted by scanning genomic sequences with information theory-based (or user-defined) weight matrices. Matrices are refined by adding experimentally-defined binding sites to published binding sites. Delila-Genome was used to examine the accuracy of individual information contents of binding sites detected with refined matrices as a measure of the strengths of the corresponding protein-nucleic acid interactions. The software can then be used to predict novel sites by rescanning the genome with the refined matrices. Results Parameters for genome scans are entered using a Java-based GUI interface and backend scripts in Perl. Multi-processor CPU load-sharing minimized the average response time for scans of different chromosomes. Scans of human genome assemblies required 4–6 hours for transcription factor binding sites and 10–19 hours for splice sites, respectively, on 24- and 3-node Mosix and Beowulf clusters. Individual binding sites are displayed either as high-resolution sequence walkers or in low-resolution custom tracks in the UCSC genome browser. For large datasets, we applied a data reduction strategy that limited displays of binding sites exceeding a threshold information content to specific chromosomal regions within or adjacent to genes. An HTML document is produced listing binding sites ranked by binding site strength or chromosomal location hyperlinked to the UCSC custom track, other annotation databases and binding site sequences. Post-genome scan tools parse binding site annotations of selected chromosome intervals and compare the results of genome scans using different weight matrices. Comparisons of multiple genome scans can display binding sites that are unique to each scan and identify sites with significantly altered binding strengths. Conclusions Delila-Genome was used to

  3. Genome-wide prediction, display and refinement of binding sites with information theory-based models.

    PubMed

    Gadiraju, Sashidhar; Vyhlidal, Carrie A; Leeder, J Steven; Rogan, Peter K

    2003-09-08

    We present Delila-genome, a software system for identification, visualization and analysis of protein binding sites in complete genome sequences. Binding sites are predicted by scanning genomic sequences with information theory-based (or user-defined) weight matrices. Matrices are refined by adding experimentally-defined binding sites to published binding sites. Delila-Genome was used to examine the accuracy of individual information contents of binding sites detected with refined matrices as a measure of the strengths of the corresponding protein-nucleic acid interactions. The software can then be used to predict novel sites by rescanning the genome with the refined matrices. Parameters for genome scans are entered using a Java-based GUI interface and backend scripts in Perl. Multi-processor CPU load-sharing minimized the average response time for scans of different chromosomes. Scans of human genome assemblies required 4-6 hours for transcription factor binding sites and 10-19 hours for splice sites, respectively, on 24- and 3-node Mosix and Beowulf clusters. Individual binding sites are displayed either as high-resolution sequence walkers or in low-resolution custom tracks in the UCSC genome browser. For large datasets, we applied a data reduction strategy that limited displays of binding sites exceeding a threshold information content to specific chromosomal regions within or adjacent to genes. An HTML document is produced listing binding sites ranked by binding site strength or chromosomal location hyperlinked to the UCSC custom track, other annotation databases and binding site sequences. Post-genome scan tools parse binding site annotations of selected chromosome intervals and compare the results of genome scans using different weight matrices. Comparisons of multiple genome scans can display binding sites that are unique to each scan and identify sites with significantly altered binding strengths. Delila-Genome was used to scan the human genome sequence
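
The core scanning step can be sketched as a sliding-window score against a log-likelihood weight matrix built from aligned example sites, reporting windows above a threshold. This is an illustrative reconstruction with toy sites and an arbitrary threshold, not Delila-genome itself:

```python
import math

def make_weight_matrix(sites, pseudo=0.5):
    """Per-position log2(f(b,i)/0.25) weights from aligned example
    sites, with a pseudocount against zero frequencies."""
    length = len(sites[0])
    matrix = []
    for i in range(length):
        col = [s[i] for s in sites]
        w = {}
        for b in "ACGT":
            f = (col.count(b) + pseudo) / (len(col) + 4 * pseudo)
            w[b] = math.log2(f / 0.25)
        matrix.append(w)
    return matrix

def scan(sequence, matrix, threshold):
    """Slide the matrix along the sequence; report (position, score)
    for every window scoring at or above the threshold."""
    width = len(matrix)
    hits = []
    for pos in range(len(sequence) - width + 1):
        score = sum(matrix[i][sequence[pos + i]] for i in range(width))
        if score >= threshold:
            hits.append((pos, round(score, 2)))
    return hits

sites = ["TATAAT", "TATAAT", "TACAAT", "TATGAT"]  # toy aligned sites
wm = make_weight_matrix(sites)
hits = scan("GGGGTATAATGGGG", wm, threshold=4.0)
```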

  4. Electrostatic Induction Theory Based Spatial Filtering Method for Solid Particle Velocity Measurement

    NASA Astrophysics Data System (ADS)

    Xu, Chuanlong; Tang, Guanghua; Zhou, Bin; Yang, Daoye; Zhang, Jianyong; Wang, Shimin

    2007-06-01

    The electrostatic induction theory based spatial filtering method for particle velocity measurement has the advantages of a simple measurement system and convenient data processing. In this paper, the relationship between the solid particle velocity and the power spectrum of the output signal of the electrostatic sensor was derived theoretically, and the effects of the electrode length, the dielectric pipe thickness and the pipe length on the spatial filtering characteristics of the electrostatic sensor were investigated numerically using the finite element method. Additionally, because the roughness of the power spectrum curve of the output signal makes the peak frequency fmax difficult to determine, a wavelet analysis based filtering method was adopted to smooth the curve so that fmax can be determined accurately. Finally, the velocity measurement method was applied in a dense phase pneumatic conveying system under high pressure, and the experimental results show that the system repeatability is within ±4% over the gas superficial velocity range of 8.63-18.62 m/s for the particle concentration range 0.067-0.130 m3/m3.
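
The peak-detection step described above can be sketched as follows: compute the power spectrum of the sensor output, smooth it, and read off the peak frequency, which maps to velocity through the sensor geometry. The 3-point smoothing kernel (standing in for the paper's wavelet-based smoothing), the electrode pitch value, and the linear relation v = f_max * p are illustrative assumptions; the paper derives the exact sensor-specific relationship.

```python
import cmath
import math

def periodogram(x, fs):
    """Power at DFT bins k = 1 .. n/2 - 1 (DC excluded)."""
    n = len(x)
    freqs, power = [], []
    for k in range(1, n // 2):
        s = sum(x[m] * cmath.exp(-2j * math.pi * k * m / n) for m in range(n))
        freqs.append(k * fs / n)
        power.append(abs(s) ** 2)
    return freqs, power

def smooth(p):
    """Simple 3-point kernel standing in for wavelet-based smoothing."""
    q = p[:]
    for i in range(1, len(p) - 1):
        q[i] = 0.25 * p[i - 1] + 0.5 * p[i] + 0.25 * p[i + 1]
    return q

def peak_frequency(x, fs):
    freqs, power = periodogram(x, fs)
    power = smooth(power)
    return freqs[max(range(len(power)), key=power.__getitem__)]

# Synthetic sensor output: dominant 50 Hz component plus a weak 120 Hz one.
fs, n = 1000.0, 200
x = [math.sin(2 * math.pi * 50 * m / fs) + 0.2 * math.sin(2 * math.pi * 120 * m / fs)
     for m in range(n)]
f_max = peak_frequency(x, fs)

pitch = 0.01                # hypothetical effective spatial period of the electrodes (m)
velocity = f_max * pitch    # assumed velocity-frequency relation v = f_max * p
```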

  5. An efficient graph theory based method to identify every minimal reaction set in a metabolic network

    PubMed Central

    2014-01-01

    Background Development of cells with minimal metabolic functionality is gaining importance due to their efficiency in producing chemicals and fuels. Existing computational methods to identify minimal reaction sets in metabolic networks are computationally expensive. Further, they identify only one of the several possible minimal reaction sets. Results In this paper, we propose an efficient graph theory based recursive optimization approach to identify all minimal reaction sets. Graph theoretical insights offer systematic methods to not only reduce the number of variables in mathematical programming and increase its computational efficiency, but also provide efficient ways to find multiple optimal solutions. The efficacy of the proposed approach is demonstrated using case studies from Escherichia coli and Saccharomyces cerevisiae. In case study 1, the proposed method identified three minimal reaction sets, each containing 38 reactions, in the Escherichia coli central metabolic network with 77 reactions. Analysis of these three minimal reaction sets revealed that one of them is more suitable for developing a minimal metabolism cell than the other two, due to its practically achievable internal flux distribution. In case study 2, the proposed method identified 256 minimal reaction sets from the Saccharomyces cerevisiae genome scale metabolic network with 620 reactions. The proposed method required only 4.5 hours to identify all 256 minimal reaction sets and showed a significant reduction (approximately 80%) in solution time compared to existing methods. Conclusions Identification of all minimal reaction sets in metabolic networks is essential since different minimal reaction sets have different properties that affect bioprocess development. The proposed method correctly identified all minimal reaction sets in both case studies, and it is computationally efficient compared to other methods for finding minimal reaction sets.

  6. Information-theory-based solution of the inverse problem in classical statistical mechanics.

    PubMed

    D'Alessandro, Marco; Cilloco, Francesco

    2010-08-01

    We present a procedure for the determination of the interaction potential from the knowledge of the radial pair distribution function. The method, realized inside an inverse Monte Carlo simulation scheme, is based on the application of the maximum entropy principle of information theory and the interaction potential emerges as the asymptotic expression of the transition probability. Results obtained for high density monoatomic fluids are very satisfactory and provide an accurate extraction of the potential, despite a modest computational effort.
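
As a rough illustration of this kind of inverse scheme, the sketch below applies one iterative-Boltzmann-style correction: the potential is raised where the simulated g(r) exceeds the target and lowered where it falls short. This standard update is a stand-in; the paper's method instead extracts the potential from the asymptotic transition probability under the maximum entropy principle.

```python
import math

def refine_potential(u, g_sim, g_target, kT=1.0):
    """One iterative-Boltzmann-style correction toward the target g(r).
    (Illustrative stand-in for the paper's maximum-entropy transition rule.)"""
    return [ui + kT * math.log(gs / gt)
            for ui, gs, gt in zip(u, g_sim, g_target)]

# Toy radial grid: if the simulated g(r) overshoots the target, the potential
# becomes more repulsive there, and vice versa.
u0       = [0.0, 0.0, 0.0]
g_target = [0.2, 1.5, 1.0]
g_sim    = [0.4, 1.0, 1.0]
u1 = refine_potential(u0, g_sim, g_target)
```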

  7. Discovering Pair-Wise Genetic Interactions: An Information Theory-Based Approach

    PubMed Central

    Ignac, Tomasz M.; Skupin, Alexander; Sakhanenko, Nikita A.; Galas, David J.

    2014-01-01

    Phenotypic variation, including that which underlies health and disease in humans, results in part from multiple interactions among both genetic variation and environmental factors. While diseases or phenotypes caused by single gene variants can be identified by established association methods and family-based approaches, complex phenotypic traits resulting from multi-gene interactions remain very difficult to characterize. Here we describe a new method based on information theory, and demonstrate how it improves on previous approaches to identifying genetic interactions, including both synthetic and modifier kinds of interactions. We apply our measure, called interaction distance, to previously analyzed data sets of yeast sporulation efficiency, lipid-related mouse data and several human disease models to characterize the method. We show how the interaction distance can reveal novel gene interaction candidates in experimental and simulated data sets, and outperforms other measures in several circumstances. The method also allows us to optimize case/control sample composition for clinical studies. PMID:24670935
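
A minimal sketch of the kind of information-theoretic interaction measure discussed above: mutual information computed from empirical counts, plus a simple synergy term measuring how much a pair of loci predicts the phenotype beyond its parts. This is a generic illustration, not the paper's specific "interaction distance".

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of an empirical distribution."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def mutual_info(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def synergy(x1, x2, y):
    """How much the pair predicts y beyond its parts (one simple
    interaction measure; not the paper's exact 'interaction distance')."""
    joint = list(zip(x1, x2))
    return mutual_info(joint, y) - mutual_info(x1, y) - mutual_info(x2, y)

# XOR phenotype: neither locus is informative alone, the pair fully determines y.
x1 = [0, 0, 1, 1]
x2 = [0, 1, 0, 1]
y  = [a ^ b for a, b in zip(x1, x2)]
```

On this toy data each single-locus mutual information is zero while the pair carries one full bit, the classic purely synthetic interaction.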

  8. Managing for resilience: an information theory-based approach to assessing ecosystems

    EPA Science Inventory

    Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple mod...

  9. Prediction of mutant mRNA splice isoforms by information theory-based exon definition.

    PubMed

    Mucaki, Eliseos J; Shirley, Ben C; Rogan, Peter K

    2013-04-01

    Mutations that affect mRNA splicing often produce multiple mRNA isoforms, resulting in complex molecular phenotypes. Definition of an exon and its inclusion in mature mRNA relies on joint recognition of both acceptor and donor splice sites. This study predicts cryptic and exon-skipping isoforms in mRNA produced by splicing mutations from the combined information contents (R(i), which measures binding-site strength, in bits) and distribution of the splice sites defining these exons. The total information content of an exon (R(i),total) is the sum of the R(i) values of its acceptor and donor splice sites, adjusted for the self-information of the distance separating these sites, that is, the gap surprisal. Differences between total information contents of an exon (ΔR(i,total)) are predictive of the relative abundance of these exons in distinct processed mRNAs. Constraints on splice site and exon selection are used to eliminate nonconforming and poorly expressed isoforms. Molecular phenotypes are computed by the Automated Splice Site and Exon Definition Analysis (http://splice.uwo.ca) server. Predictions of splicing mutations were highly concordant (85.2%; n = 61) with published expression data. In silico exon definition analysis will contribute to streamlining assessment of abnormal and normal splice isoforms resulting from mutations. © 2013 Wiley Periodicals, Inc.
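
The exon-definition arithmetic described above can be sketched directly: R_i,total is the acceptor R_i plus the donor R_i minus the gap surprisal of their separation. The two-position weight matrices and the exon-length distribution below are toy values for illustration, not real splice-site models.

```python
import math

def site_ri(weights, seq):
    """R_i of a site: sum of positional weights (bits) over its bases."""
    return sum(w[b] for w, b in zip(weights, seq))

def gap_surprisal(length, length_probs):
    """Self-information (bits) of the acceptor-donor separation."""
    return -math.log2(length_probs[length])

def exon_ri_total(acc_w, acc_seq, don_w, don_seq, length, length_probs):
    return (site_ri(acc_w, acc_seq) + site_ri(don_w, don_seq)
            - gap_surprisal(length, length_probs))

# Toy two-position matrices (bits per base) and a toy exon-length distribution.
acc_w = [{"A": 1.0, "C": -1.0, "G": -1.0, "T": 0.5},
         {"A": 0.2, "C": 0.1, "G": 1.5, "T": -2.0}]
don_w = [{"A": -0.5, "C": 0.3, "G": 2.0, "T": -1.0},
         {"A": 0.8, "C": -0.2, "G": 0.1, "T": 1.1}]
length_probs = {100: 0.25, 120: 0.5, 150: 0.25}

ri_wt  = exon_ri_total(acc_w, "AG", don_w, "GT", 120, length_probs)
ri_mut = exon_ri_total(acc_w, "AG", don_w, "AT", 120, length_probs)  # weakened donor
```

Under the interpretation of information differences, a ΔR_i,total of 2.5 bits between two isoform-defining exons would predict roughly a 2^2.5 ≈ 5.7-fold difference in their relative abundance.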

  10. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    NASA Astrophysics Data System (ADS)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparison via alignment is widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences of non-uniform length or with significant shuffling of identical substrings. An alternative approach is one based on information theory distances. Biological information content is stored in very long strings over only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each has a different origin and expression. Our comparison reduces the different concepts to a unique formalism, making it possible to construct a phylogenetic tree for each of them. The trees produced via these metrics are compared to those widely accepted as biologically validated. In general, the results provide further evidence of the reliability of the alignment-free distance models, and we observe that one of the metrics appears to be more robust than the other three. We believe this result can be the object of further research. Many of the experimental results, graphics and tables are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO
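
As a concrete example of an alignment-free, information-theoretic dissimilarity, the sketch below implements the normalized compression distance (NCD), which approximates an information distance via a real compressor. NCD is illustrative of this family of measures; it is not necessarily one of the four mutual-information-based distances compared in the paper.

```python
import zlib

def c(s: bytes) -> int:
    """Compressed length as a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: small for similar strings."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

a     = b"ACGT" * 200                 # highly repetitive sequence
b_seq = b"AATTCCGGACGTTGCA" * 50      # different repeat structure, same length
d_same = ncd(a, a)
d_diff = ncd(a, b_seq)
```

Note this comparison works regardless of sequence length or substring shuffling, which is exactly the regime where alignment-based comparison struggles.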

  11. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    USDA-ARS?s Scientific Manuscript database

    Understanding streamflow patterns in space and time is important to improve the flood and drought forecasting, water resources management, and predictions of ecological changes. The objectives of this work were (a) to characterize the spatial and temporal patterns of streamflow using information the...

  12. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    DTIC Science & Technology

    2002-08-01

    the measurement noise, as well as the physical model of the forward scattered electric field. The Bayesian algorithms for the Uncertain Permittivity...received at multiple sensors. In this research project a tissue-model-based signal-detection theory approach for the detection of mammary tumors in the...oriented information processors.

  13. Multivariate drought index: An information theory based approach for integrated drought assessment

    NASA Astrophysics Data System (ADS)

    Rajsekhar, Deepthi; Singh, Vijay. P.; Mishra, Ashok. K.

    2015-07-01

    Most existing drought indices are based on a single variable (e.g., precipitation) or a combination of two variables (e.g., precipitation and streamflow). This may not be sufficient for reliable quantification of the existing drought condition. A region might be experiencing only a single type of drought at times, but multiple drought types affecting a region is quite common too. For a comprehensive representation, it is better to consider all the variables that lead to different physical forms of drought, such as meteorological, hydrological, and agricultural droughts. Therefore, we propose a multivariate drought index (MDI) that utilizes information from hydroclimatic variables, including precipitation, runoff, evapotranspiration and soil moisture as indicator variables, thus accounting for all the physical forms of drought. Entropy theory was utilized to develop the proposed index, leading to the smallest set of features that maximally preserves the information of the input data set. MDI was then compared with the Palmer drought severity index (PDSI) for all climate regions within Texas for the period 1950-2012, with particular attention to the two major drought occurrences in Texas, viz. the droughts of 1950-1957 and 2010-2011. The proposed MDI was found to represent drought conditions well, due to its multivariate, multiscalar, and nonlinear properties. To help the user choose the right time scale for further analysis, entropy maps of MDI at different time scales were used as a guideline: the MDI time scale with the highest entropy value may be chosen, since a higher entropy indicates a higher information content.

  14. Information-theory-based snake adapted to multiregion objects with different noise models

    NASA Astrophysics Data System (ADS)

    Galland, Frédéric; Réfrégier, Philippe

    2004-07-01

    We propose a segmentation technique adapted to objects composed of several regions with gray-level fluctuations described by different probability laws. This approach is based on information theory techniques and leads to a multiregion polygonal snake driven by the minimization of a criterion without any parameters to be tuned by the user. We demonstrate the improvements obtained with this approach as well as its low computational cost. This approach is compatible with applications such as object recognition and object tracking with nonrigid deformation in images perturbed by different types of optical noise.

  15. What gets recycled: an information theory based model for product recycling.

    PubMed

    Dahmus, Jeffrey B; Gutowski, Timothy G

    2007-11-01

    This work focuses on developing a concise representation of the material recycling potential for products at end of life. To do this, we propose a model similar to the "Sherwood Plot", but for products rather than for dilute mixtures. The difference is reflected in the material composition and the processing systems used for the two different applications. Cost estimates for product recycling systems are developed using Shannon's information theory. The resulting model is able to resolve the material recycling potential for a wide range of end-of-life products with vastly different material compositions and recycling rates in the U.S. Preliminary data on historical trends in product design suggest a significant shift toward less recyclable products.
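
The information-theoretic core of such a model can be sketched as the Shannon mixing entropy of a product's material composition: products whose mass is spread over many materials carry more "separation information" and are harder to recycle. The compositions below are hypothetical, and this entropy is only one ingredient of the paper's cost model, not the model itself.

```python
import math

def mixing_entropy_bits(mass_fractions):
    """Shannon entropy H = -sum(c_i * log2(c_i)) of a product's material mix;
    higher H suggests a costlier separation task at end of life."""
    total = sum(mass_fractions.values())
    return -sum((m / total) * math.log2(m / total)
                for m in mass_fractions.values() if m > 0)

# Hypothetical compositions (kg): a steel-dominated appliance vs. an
# electronics product with several materials in comparable amounts.
appliance   = {"steel": 45.0, "copper": 3.0, "plastic": 2.0}
electronics = {"steel": 1.0, "copper": 1.0, "plastic": 1.0, "glass": 1.0}
h_simple  = mixing_entropy_bits(appliance)
h_complex = mixing_entropy_bits(electronics)
```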

  16. An information theory based framework for the measurement of population health.

    PubMed

    Nesson, Erik T; Robinson, Joshua J

    2015-04-01

    This paper proposes a new framework for the measurement of population health and the ranking of the health of different geographies. Since population health is a latent variable, studies which measure and rank the health of different geographies must aggregate observable health attributes into one summary measure. We show that the methods used in nearly all the literature to date implicitly assume that all attributes are infinitely substitutable. Our method, based on the measurement of multidimensional welfare and inequality, minimizes the entropic distance between the summary measure of population health and the distribution of the underlying attributes. This summary function coincides with the constant elasticity of substitution and Cobb-Douglas production functions and naturally allows different assumptions regarding attribute substitutability or complementarity. To compare methodologies, we examine a well-known ranking of the population health of U.S. states, America's Health Rankings. We find that states' rankings are somewhat sensitive to changes in the weight given to each attribute, but very sensitive to changes in aggregation methodology. Our results have broad implications for well-known health rankings such as the 2000 World Health Report, as well as other measurements of population and individual health levels and the measurement and decomposition of health inequality.

  17. A gas-kinetic theory based multidimensional high-order method for the compressible Navier-Stokes solutions

    NASA Astrophysics Data System (ADS)

    Ren, Xiaodong; Xu, Kun; Shyy, Wei

    2017-08-01

    This paper presents a gas-kinetic theory based multidimensional high-order method for compressible Navier-Stokes solutions. In our previous study, a spatially and temporally dependent third-order flux scheme with the use of a third-order gas distribution function was employed. However, the third-order flux scheme is quite complicated and less robust than the second-order scheme. In order to reduce its complexity and improve its robustness, the second-order flux scheme is adopted instead in this paper, while the temporal order of the method is maintained by using a two-stage temporal discretization. In addition, its CPU cost is lower than that of the previous scheme. Several test cases in two and three dimensions, covering high Mach number compressible flows and low speed high Reynolds number laminar flows, are presented to demonstrate the capability of the method.

  18. A second-order accurate kinetic-theory-based method for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, Suresh M.

    1986-01-01

    An upwind method for the numerical solution of the Euler equations is presented. This method, called the kinetic numerical method (KNM), is based on the fact that the Euler equations are moments of the Boltzmann equation of the kinetic theory of gases when the distribution function is Maxwellian. The KNM consists of two phases, the convection phase and the collision phase. The method is unconditionally stable and explicit. It is highly vectorizable and can be easily made total variation diminishing for the distribution function by a suitable choice of the interpolation strategy. The method is applied to a one-dimensional shock-propagation problem and to a two-dimensional shock-reflection problem.

  19. Perturbative method for the derivation of quantum kinetic theory based on closed-time-path formalism.

    PubMed

    Koide, Jun

    2002-02-01

    Within the closed-time-path formalism, a perturbative method is presented, which reduces the microscopic field theory to the quantum kinetic theory. In order to make this reduction, the expectation value of a physical quantity must be calculated under the condition that the Wigner distribution function is fixed, because it is the independent dynamical variable in the quantum kinetic theory. It is shown that when a nonequilibrium Green function in the form of the generalized Kadanoff-Baym ansatz is utilized, this condition appears as a cancellation of a certain part of contributions in the diagrammatic expression of the expectation value. Together with the quantum kinetic equation, which can be derived in the closed-time-path formalism, this method provides a basis for the kinetic-theoretical description.

  20. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and its accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on the gas-kinetic theory is presented. The flux construction strategy may shed some light on the possible modification of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for a non-strictly hyperbolic system.

  1. Practical application of game theory based production flow planning method in virtual manufacturing networks

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2016-08-01

    Modern enterprises have to react quickly to dynamic changes in the market caused by changing customer requirements and expectations. Production flow planning and control is one of the key areas of production management that must continuously evolve through the search for new methods and tools for increasing the efficiency of manufacturing systems. These aspects are closely connected with the ability to implement the concepts of Virtual Enterprises (VE) and Virtual Manufacturing Networks (VMN), in which an integrated infrastructure of flexible resources is created. In the proposed approach, which is based on game theory and its treatment of strategic situations, the role of players is performed by objects associated with objective functions, allowing multiobjective production flow planning problems to be solved. For defined production system and production order models, ways of solving the production route planning problem in a VMN are presented through computational examples for different variants of production flow, together with possible decision strategies and an analysis of the calculation results.

  2. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes

    SciTech Connect

    Escudero, Daniel; Thiel, Walter

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF6 complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO4(-), Cr(CO)6, [Fe(CN)6](4-), four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  3. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes.

    PubMed

    Escudero, Daniel; Thiel, Walter

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF6 complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO4(-), Cr(CO)6, [Fe(CN)6](4-), four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  4. Information theory-based scoring function for the structure-based prediction of protein-ligand binding affinity.

    PubMed

    Kulharia, Mahesh; Goody, Roger S; Jackson, Richard M

    2008-10-01

    The development and validation of a new knowledge-based scoring function (SIScoreJE) to predict the binding energy between proteins and ligands is presented. SIScoreJE efficiently predicts the binding energy between a small molecule and its protein receptor. Protein-ligand atomic contact information was derived from a Non-Redundant Data set (NRD) of over 3000 X-ray crystal structures of protein-ligand complexes. This information was classified into individual "atom contact pairs" (ACP), which are used to calculate the atomic contact preferences. In addition to the two schemes generated in this study, we have assessed a number of other common atom-type classification schemes. The preferences were calculated using an information-theoretic relationship of joint entropy. Among 18 different atom-type classification schemes, "ScoreJE Atom Type set2" (SATs2) was found to be the most suitable for our approach. To test the sensitivity of the method to the inclusion of solvent, Single-body Solvation Potentials (SSP) were also derived from the atomic contacts between the protein atom types and water molecules modeled using AQUARIUS2. Validation was carried out using an evaluation data set of 100 protein-ligand complexes with known binding energies to test the ability of the scoring functions to reproduce known binding affinities. In summary, it was found that the combined SSP/ScoreJE (SIScoreJE) performed significantly better than ScoreJE alone, and both SIScoreJE and ScoreJE performed better than GOLD::GoldScore, GOLD::ChemScore, and XScore.
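
A stripped-down sketch of a knowledge-based contact preference: compare observed atom-contact-pair frequencies against the independence expectation, then score a pose by summing the log-odds preferences of its contacts. The atom-type labels and contact counts below are invented, and SIScoreJE's actual weighting uses a joint-entropy relationship rather than this simple log-odds form.

```python
import math
from collections import Counter

def contact_preferences(contacts):
    """Log-odds preference for each (protein, ligand) atom-type pair:
    observed pair frequency vs. the product of marginal frequencies."""
    pair_n = Counter(contacts)
    prot_n = Counter(p for p, _ in contacts)
    lig_n  = Counter(l for _, l in contacts)
    n = len(contacts)
    return {pair: math.log((cnt / n) /
                           ((prot_n[pair[0]] / n) * (lig_n[pair[1]] / n)))
            for pair, cnt in pair_n.items()}

def score(pose_contacts, prefs):
    """Sum of preferences over a pose's contacts; higher is more favorable."""
    return sum(prefs.get(c, 0.0) for c in pose_contacts)

# Toy "database" of (protein_atom_type, ligand_atom_type) contacts.
contacts = ([("N.don", "O.acc")] * 6 + [("C.ar", "C.ar")] * 3
            + [("N.don", "C.ar")] * 1)
prefs = contact_preferences(contacts)
pose_score = score([("N.don", "O.acc"), ("C.ar", "C.ar")], prefs)
```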

  5. Battling the challenges of training nurses to use information systems through theory-based training material design.

    PubMed

    Galani, Malatsi; Yu, Ping; Paas, Fred; Chandler, Paul

    2014-01-01

    The attempts to train nurses to effectively use information systems have had mixed results. One problem is that training materials are not adequately designed to guide trainees to gradually learn to use a system without experiencing a heavy cognitive load. This is because training design often does not take into consideration a learner's cognitive ability to absorb new information in a short training period. Given the high cost and difficulty of organising training in healthcare organisations, there is an urgent need for information system trainers to be aware of how cognitive or information overload affects a trainee's capability to acquire new knowledge and skills, and what instructional techniques can be used to facilitate effective learning. This paper introduces the concept of cognitive load and how it affects nurses when learning to use a new health information system. This is followed by relevant strategies for instructional design, underpinned by the principles of cognitive load theory, which may be helpful for the development of effective instructional materials and activities for training nurses to use information systems.

  6. Hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis method for mid-frequency analysis of built-up systems with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan

    2017-09-01

    Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA) and dealt with using evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
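
The first-order perturbation idea behind the bound computation can be sketched generically: expand the response around the interval midpoint and bound it by the weighted sum of gradient magnitudes and interval radii; splitting a focal element into sub-intervals simply applies this to each smaller box. The toy response below is a stand-in, not the FE/SEA mid-frequency response itself.

```python
def interval_bounds(f, grad, x0, radii):
    """First-order perturbation bounds of f over the box x0 +/- radii:
    [f(x0) - sum(|df/dxi| * ri), f(x0) + sum(|df/dxi| * ri)]."""
    spread = sum(abs(g) * r for g, r in zip(grad(x0), radii))
    y0 = f(x0)
    return y0 - spread, y0 + spread

# Toy response y = x1 * x2 over a small focal element centered at (2, 3).
f = lambda x: x[0] * x[1]
grad = lambda x: [x[1], x[0]]
lo, hi = interval_bounds(f, grad, [2.0, 3.0], [0.1, 0.2])
```

The first-order expansion is only accurate for small radii, which is exactly why splitting wide focal elements into sub-intervals improves the bounds.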

  7. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling.

    PubMed

    Koller, Ingrid; Levenson, Michael R; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis.

  8. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    PubMed Central

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  9. Nonlinear gyrokinetic theory based on a new method and computation of the guiding-center orbit in tokamaks

    SciTech Connect

    Xu, Yingfeng; Dai, Zongliang; Wang, Shaojie

    2014-04-15

    The nonlinear gyrokinetic theory in the tokamak configuration based on the two-step transform is developed; in the first step, we transform the magnetic potential perturbation to the Hamiltonian part, and in the second step, we transform away the gyroangle-dependent part of the perturbed Hamiltonian. Then the I-transform method is used to decoupled the perturbation part of the motion from the unperturbed motion. The application of the I-transform method to the computation of the guiding-center orbit and the guiding-center distribution function in tokamaks is presented. It is demonstrated that the I-transform method of the orbit computation which involves integrating only along the unperturbed orbit agrees with the conventional method which integrates along the full orbit. A numerical code based on the I-transform method is developed and two numerical examples are given to verify the new method.

  10. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety.

    PubMed

    Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith

    2013-01-01

Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the promoting action on research implementation of health services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies based upon the analysis of over 50 pre-existing interviews. The evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge-use outcomes of our KTE efforts.

  11. A third-generation density-functional-theory-based method for calculating canonical molecular orbitals of large molecules.

    PubMed

    Hirano, Toshiyuki; Sato, Fumitoshi

    2014-07-28

    We used grid-free modified Cholesky decomposition (CD) to develop a density-functional-theory (DFT)-based method for calculating the canonical molecular orbitals (CMOs) of large molecules. Our method can be used to calculate standard CMOs, analytically compute exchange-correlation terms, and maximise the capacity of next-generation supercomputers. Cholesky vectors were first analytically downscaled using low-rank pivoted CD and CD with adaptive metric (CDAM). The obtained Cholesky vectors were distributed and stored on each computer node in a parallel computer, and the Coulomb, Fock exchange, and pure exchange-correlation terms were calculated by multiplying the Cholesky vectors without evaluating molecular integrals in self-consistent field iterations. Our method enables DFT and massively distributed memory parallel computers to be used in order to very efficiently calculate the CMOs of large molecules.
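    The integral-free Fock build described above can be sketched in a few lines. In this toy example the Cholesky vectors and density matrix are random stand-ins (not real quantum-chemical quantities), but the contractions show how the Coulomb and exchange terms follow from Cholesky vectors alone, without evaluating four-index molecular integrals:

    ```python
    import numpy as np

    # Toy sketch: Cholesky vectors B[L, p, q] factorize the two-electron
    # integrals as (pq|rs) ≈ sum_L B[L,p,q] * B[L,r,s], so Coulomb and
    # exchange matrices are contractions of B with the density matrix D.
    rng = np.random.default_rng(0)
    n, nchol = 6, 10                          # basis size, number of Cholesky vectors

    B = rng.standard_normal((nchol, n, n))
    B = 0.5 * (B + B.transpose(0, 2, 1))      # each vector symmetric in (p, q)
    D = rng.standard_normal((n, n))
    D = D @ D.T                               # symmetric density-like matrix

    # Coulomb: J_pq = sum_L B[L,p,q] * (sum_rs B[L,r,s] D_rs)
    w = np.einsum('Lrs,rs->L', B, D)
    J = np.einsum('Lpq,L->pq', B, w)

    # Exchange: K_pq = sum_L sum_rs B[L,p,r] D_rs B[L,q,s]
    K = np.einsum('Lpr,rs,Lqs->pq', B, D, B)

    # Check against an explicit four-index contraction of the reconstructed ERIs.
    eri = np.einsum('Lpq,Lrs->pqrs', B, B)    # (pq|rs)
    print(np.allclose(J, np.einsum('pqrs,rs->pq', eri, D)))  # True
    print(np.allclose(K, np.einsum('prqs,rs->pq', eri, D)))  # True
    ```

    The design point mirrored here is that the three-index vectors `B` can be distributed over nodes and contracted locally, whereas the four-index tensor `eri` (built only for the check) is what the method avoids storing.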

  12. Fuzzy theory based control method for an in-pipe robot to move in variable resistance environment

    NASA Astrophysics Data System (ADS)

    Li, Te; Ma, Shugen; Li, Bin; Wang, Minghui; Wang, Yuechao

    2015-11-01

Most existing screw-drive in-pipe robots cannot actively adjust their maximum traction capacity, which limits their adaptability to a wide range of variable environment resistance, especially in curved pipes. To solve this problem, a screw-drive in-pipe robot based on an adaptive linkage mechanism is proposed. The differential property of the adaptive linkage mechanism allows the robot to move without motion interference in straight and variously curved pipes by self-adaptively adjusting the inclining angles of the rollers. The maximum traction capacity of the robot can be changed by actively adjusting these inclining angles. To improve adaptability to variable resistance, a torque control method based on a fuzzy controller is proposed. For variable environment resistance, the proposed control method not only ensures sufficient traction force but also limits the output torque to a feasible region. In simulations, the robot with the proposed control method is compared to the robot with fixed roller inclining angles. The results show that the combination of the torque control method and the proposed robot achieves better adaptability to variable resistance in straight and curved pipes.
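    The control idea can be caricatured with a minimal Mamdani-style fuzzy rule base. Everything below is an illustrative assumption (membership functions, rule table, output values), not the controller published in the paper: it maps a normalized traction error to a change in roller inclination angle.

    ```python
    import numpy as np

    # Hypothetical sketch: fuzzify a normalized traction error (demanded minus
    # available traction), fire three rules, and defuzzify to an inclination-
    # angle adjustment. All numbers are illustrative.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on the support [a, c]."""
        return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def fuzzy_angle_adjustment(error):
        """error in roughly [-1.5, 1.5]; returns inclination-angle change (deg)."""
        strengths = np.array([
            tri(error, -2.0, -1.0, 0.0),   # negative: traction surplus
            tri(error, -1.0,  0.0, 1.0),   # zero: traction adequate
            tri(error,  0.0,  1.0, 2.0),   # positive: traction deficit
        ])
        # Singleton consequents (deg): surplus -> flatten rollers, deficit -> steepen.
        outputs = np.array([-5.0, 0.0, 5.0])
        # Weighted-average defuzzification over the fired rules.
        return float(strengths @ outputs / max(strengths.sum(), 1e-9))

    print(fuzzy_angle_adjustment(0.0))    # 0.0: hold current inclination
    print(fuzzy_angle_adjustment(0.5))    # 2.5: steepen to raise traction capacity
    ```

    Overlapping memberships make the output vary smoothly with the error, which is the property that lets a fuzzy controller bound torque while tracking a changing resistance.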

  13. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Zahariev, Federico; Schmidt, Michael W; Kitaura, Kazuo; Gordon, Mark S; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.
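    As a hedged illustration of what an energy Hessian delivers (using a cheap finite-difference second derivative of a toy Morse bond in place of the analytic FMO-DFT machinery): at a minimum, the second derivative of the energy is the force constant, and mass-weighting it yields a harmonic vibrational frequency, the ingredient of IR/Raman spectra and vibrational free energies. All parameters and units are illustrative.

    ```python
    import numpy as np

    # Toy Morse bond whose curvature at the minimum equals k_e by construction.
    D_e, k_e, r_e = 100.0, 500.0, 1.0      # well depth, force constant, eq. bond length

    def energy(r):
        """Morse potential with curvature k_e at r = r_e."""
        a = np.sqrt(k_e / (2 * D_e))
        return D_e * (1 - np.exp(-a * (r - r_e))) ** 2

    def second_derivative(fun, x0, h=1e-4):
        """Central finite-difference second derivative."""
        return (fun(x0 + h) - 2 * fun(x0) + fun(x0 - h)) / h**2

    hess = second_derivative(energy, r_e)   # ≈ k_e = 500 at the minimum
    mu = 0.5                                # toy reduced mass
    omega = np.sqrt(hess / mu)              # harmonic angular frequency ≈ 31.6
    ```

    Analytic Hessians, as in the paper, replace this finite-difference step with exact derivative expressions, which is what makes Hessians of large fragmented systems affordable and numerically stable.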

  14. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Gordon, Mark S.; Kitaura, Kazuo; Nakamura, Shinichiro

    2015-03-28

Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  15. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Kitaura, Kazuo; Gordon, Mark S.; Nakamura, Shinichiro

    2015-03-01

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  16. A theory-based method for the evaluation of individual quality of life: the SEIQoL.

    PubMed

    Joyce, C R B; Hickey, A; McGee, H M; O'Boyle, C A

    2003-05-01

Of the thousands of reports on methods for evaluating quality of life (QoL) published since medical interest in the subject began nearly 40 years ago, few are based upon theory. This paper, prepared in response to a request to furnish an exception (Meadows KA. Introduction to an Advanced Seminar: Assessing Health-Related Quality of Life. What can the Cognitive Sciences Contribute? Hull University, October 9, 2000), describes the origins of the Schedule for the Evaluation of Individual Quality of Life (SEIQoL). The SEIQoL derives its cognitive aspects from Egon Brunswik's theoretical studies of perception, their extension to Social Judgment Theory (SJT) by Kenneth Hammond, and the application of these ideas to QoL by the present authors and their colleagues.

  17. Using community participation to assess acceptability of "Contra Caries", a theory-based, promotora-led oral health education program for rural Latino parents: a mixed methods study.

    PubMed

    Hoeft, Kristin S; Rios, Sarah M; Pantoja Guzman, Estela; Barker, Judith C

    2015-09-03

Latino children experience more prevalent and severe tooth decay than non-Hispanic white and non-Hispanic black children, yet few theory-based, evaluated, and culturally appropriate interventions target parents of this vulnerable population. To fill this gap, the Contra Caries Oral Health Education Program, a theory-based, promotora-led education program for low-income, Spanish-speaking parents of children aged 1-5 years, was developed. This article describes qualitative findings on the acceptability of curriculum content and activities, presents the process of refining the curriculum by engaging the target population and promotoras, and presents results from the evaluation assessing the acceptability of the curriculum once implemented. Focus groups were conducted with low-income Spanish-speaking parents of children aged 1-5 years living in a city in an agricultural area of California. Sessions were digitally recorded, translated and transcribed, and checked for accuracy, and the resulting data were thematically coded and analyzed using a social constructionist approach. The Contra Caries Oral Health Education Program was then implemented with a separate but similar sample; after completing the program, participants were administered surveys about the acceptability and favorite activities of the education program. Data were entered into a database and checked for accuracy; open-ended responses were categorized and responses to closed-ended questions counted. Twelve focus groups were conducted (N = 51), 105 parents attended the Contra Caries Oral Health Education Program, and 83 parents filled out surveys. Complete attendance and retention were high (89% and 90%, respectively). The study found that parents consider their children's oral health a high priority: they were not only interested in, but actually attended, classes focused on increasing their knowledge and skills with respect to early childhood oral health. The Contra Caries content and format was perceived as

  18. A comparison of item response theory-based methods for examining differential item functioning in object naming test by language of assessment among older Latinos

    PubMed Central

    Yang, Frances M.; Heslin, Kevin C.; Mehta, Kala M.; Yang, Cheng-Wu; Ocepek-Welikson, Katja; Kleinman, Marjorie; Morales, Leo S.; Hays, Ron D.; Stewart, Anita L.; Mungas, Dan; Jones, Richard N.; Teresi, Jeanne A.

    2012-01-01

Object naming tests are commonly included in neuropsychological test batteries. Differential item functioning (DIF) in these tests due to cultural and language differences may compromise the validity of cognitive measures in diverse populations. We evaluated 26 object naming items for DIF due to Spanish and English language translations among Latinos (n=1,159; mean age 70.5 years, SD = 7.2), using the following four item response theory-based approaches: Mplus/Multiple Indicator, Multiple Causes (Mplus/MIMIC; Muthén & Muthén, 1998–2011), Item Response Theory Likelihood Ratio Differential Item Functioning (IRTLRDIF/MULTILOG; Thissen, 1991, 2001), difwithpar/Parscale (Crane, Gibbons, Jolley, & van Belle, 2006; Muraki & Bock, 2003), and Differential Functioning of Items and Tests/MULTILOG (DFIT/MULTILOG; Flowers, Oshima, & Raju, 1999; Thissen, 1991). Overall, there was moderate to near-perfect agreement across methods. Fourteen items were found to exhibit DIF; five were flagged consistently across all methods and were more likely to be answered correctly by individuals tested in Spanish after controlling for overall ability. PMID:23471423
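    The logic shared by these approaches — comparing item performance between groups after matching on overall ability — can be illustrated with a classical (non-IRT) Mantel-Haenszel DIF check on simulated data. The data, the 0.8 logit shift that plants DIF, and the ability-quartile stratification are all assumptions for illustration, not quantities from the study:

    ```python
    import numpy as np

    # Simulate an item that is easier for the focal group at equal ability.
    rng = np.random.default_rng(1)
    n = 2000
    ability = rng.standard_normal(n)
    group = rng.integers(0, 2, n)                    # 0 = reference, 1 = focal
    p = 1 / (1 + np.exp(-(ability + 0.8 * group)))   # DIF planted: +0.8 logits
    item = (rng.random(n) < p).astype(int)
    # Stand-in for a matched total score: ability quartile (0..3).
    stratum = (ability > np.quantile(ability, [0.25, 0.5, 0.75]).reshape(-1, 1)).sum(0)

    # Mantel-Haenszel common odds ratio across score strata.
    num = den = 0.0
    for s in np.unique(stratum):
        m = stratum == s
        a = np.sum((group[m] == 1) & (item[m] == 1))   # focal correct
        b = np.sum((group[m] == 1) & (item[m] == 0))   # focal incorrect
        c = np.sum((group[m] == 0) & (item[m] == 1))   # reference correct
        d = np.sum((group[m] == 0) & (item[m] == 0))   # reference incorrect
        t = m.sum()
        num += a * d / t
        den += b * c / t

    mh_odds_ratio = num / den     # > 1: the item favors the focal group
    print(mh_odds_ratio > 1.0)    # True for this planted-DIF item
    ```

    IRT-based methods such as those compared in the record refine this by matching on a latent ability estimate rather than a raw score stratum.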

  19. NbIT - A New Information Theory-Based Analysis of Allosteric Mechanisms Reveals Residues that Underlie Function in the Leucine Transporter LeuT

    PubMed Central

    LeVine, Michael V.; Weinstein, Harel

    2014-01-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005

  20. NbIT--a new information theory-based analysis of allosteric mechanisms reveals residues that underlie function in the leucine transporter LeuT.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2014-05-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems.
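    The configurational-entropy idea at the core of NbIT can be caricatured in a few lines: under a Gaussian approximation, the mutual information between two fluctuating coordinates reduces to a function of their correlation. The trajectory below is synthetic and the one-dimensional "residues" are illustrative assumptions; nothing here reproduces the NbIT implementation.

    ```python
    import numpy as np

    # Synthetic "trajectory": residues A and B share a common mode (coupled);
    # residue C fluctuates independently (uncoupled).
    rng = np.random.default_rng(2)
    n_frames = 5000
    shared = rng.standard_normal(n_frames)             # common "allosteric" mode
    x = shared + 0.5 * rng.standard_normal(n_frames)   # residue A fluctuation
    y = shared + 0.5 * rng.standard_normal(n_frames)   # residue B fluctuation
    z = rng.standard_normal(n_frames)                  # uncoupled residue C

    def gaussian_mi(u, v):
        """Mutual information (nats) of jointly Gaussian u, v: -0.5 ln(1 - rho^2)."""
        rho = np.corrcoef(u, v)[0, 1]
        return -0.5 * np.log(1.0 - rho ** 2)

    print(gaussian_mi(x, y))  # coupled pair: substantial shared information
    print(gaussian_mi(x, z))  # uncoupled pair: near zero
    ```

    NbIT generalizes this pairwise picture to N-body measures over full covariance matrices of microdomains, which is what lets it distinguish channels from coordinators.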

  1. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  2. Information storage media and method

    DOEpatents

    Miller, Steven D.; Endres, George W.

    1999-01-01

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  3. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.

  4. Theory-based interventions for contraception.

    PubMed

    Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen, Mario; Stockton, Laurie L

    2013-08-07

The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. Our objective was to review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice; encourage contraceptive use; or promote adherence to, or continuation of, a contraceptive regimen. Through June 2013, we searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, ClinicalTrials.gov, and ICTRP). Previous searches also included EMBASE. For the initial review, we wrote to investigators to find other trials. Eligible trials tested a theory-based intervention for improving contraceptive use; we excluded trials focused on high-risk groups and on preventing sexually transmitted infections or HIV. Interventions addressed the use of one or more contraceptive methods, and the reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice or use, and contraceptive adherence or continuation. The primary author evaluated abstracts for eligibility, and two authors extracted data from included studies. For dichotomous outcomes, the Mantel-Haenszel odds ratio (OR) with 95% CI was calculated using a fixed-effect model. Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling; most reports did not provide the information needed to calculate the effective sample size, so we presented the results as reported by the investigators. No meta-analysis was conducted due to differences in interventions and outcome measures. We included three new trials for a

  5. Derivation of a measure of systolic blood pressure mutability: a novel information theory-based metric from ambulatory blood pressure tests.

    PubMed

    Contreras, Danitza J; Vogel, Eugenio E; Saravia, Gonzalo; Stockins, Benjamin

    2016-03-01

We provide ambulatory blood pressure (BP) exams with information theory-based tools that quantify fluctuations, thus increasing the capture of dynamic test components. Data from 515 ambulatory 24-hour BP exams were considered. Average age was 54 years, 54% were women, and 53% were under BP treatment. The average systolic pressure (SP) was 127 ± 8 mm Hg. A data compressor (wlzip) designed to recognize meaningful information is invoked to measure mutability, a form of dynamical variability. Patients with the same average SP can have different mutability values, reflecting differences in dynamical variability. In unadjusted linear regression models, mutability had a low association with mean systolic BP (R(2) = 0.056; P < .000001) but a larger association with the SP deviation (R(2) = 0.761; P < .001). Wlzip allows detection of levels of variability in SP that could be hazardous. This new indicator can easily be added to 24-hour BP monitors, improving the information available for diagnosis. Copyright © 2016 American Society of Hypertension. Published by Elsevier Inc. All rights reserved.
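    The compression-based idea can be sketched with a generic compressor. Since wlzip itself is not publicly packaged, the snippet below substitutes zlib as an illustrative stand-in and simulates two BP series with the same mean and SD but different dynamics; the quantization step and units are assumptions:

    ```python
    import zlib
    import numpy as np

    rng = np.random.default_rng(3)

    def mutability(series, step=2):
        """Compressed bytes per sample of the quantized increment sequence."""
        inc = np.round(np.diff(series) / step).astype(int)
        text = ','.join(map(str, inc)).encode()
        return len(zlib.compress(text, 9)) / len(inc)

    t = np.arange(1440)                                # minute-by-minute, 24 h
    smooth = 127 + 8 * np.sin(2 * np.pi * t / 1440)    # same mean/SD, slow drift
    erratic = 127 + 8 * rng.standard_normal(1440)      # same mean/SD, erratic

    # Both series have mean SP 127 and SD 8, but the erratic one is far less
    # compressible, i.e., has higher "mutability".
    print(mutability(smooth) < mutability(erratic))    # True
    ```

    This mirrors the paper's point: two exams with identical mean and SD can differ sharply in dynamical variability, and a compressor exposes that difference.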

  6. Readout method for stored information

    NASA Technical Reports Server (NTRS)

    Lewicki, G. W.

    1976-01-01

Readout technique increases density of stored information for projection onto facsimile reproduction. Data stored in line structures are scanned at a 90-degree angle over an area larger than the recorded format to ensure complete recovery of the information.

  7. Camp NERF: methods of a theory-based nutrition education recreation and fitness program aimed at preventing unhealthy weight gain in underserved elementary children during summer months.

    PubMed

    Hopkins, Laura C; Fristad, Mary; Goodway, Jacqueline D; Eneli, Ihuoma; Holloman, Chris; Kennel, Julie A; Melnyk, Bernadette; Gunther, Carolyn

    2016-10-26

The number of obese children in the US remains high, which is problematic due to the mental, physical, and academic effects of obesity on child health. Data indicate that school-age children, particularly underserved children, experience unhealthy gains in BMI at a rate nearly twice as fast during the summer months. Few efforts have been directed at implementing evidence-based programming to prevent excess weight gain during the summer recess. Camp NERF is an 8-week, multi-component (nutrition, physical activity, and mental health), theory-based program for underserved school-age children in grades K-5, coupled with the USDA Summer Food Service Program. Twelve eligible elementary school sites will be randomized to one of three programming groups: 1) Active Control (no nutrition, physical activity, or mental health content); 2) Standard Care (nutrition and physical activity); or 3) Enhanced Care (nutrition, physical activity, and mental health) programming. Anthropometric, behavioral, and psychosocial data will be collected from child-caregiver dyads pre- and post-intervention. Site-specific characteristics and process evaluation measures will also be collected. This is the first evidence-based intervention to address the issue of weight gain during the summer months among underserved, school-aged children. Results from this study will provide researchers, practitioners, and public health professionals with insight on evidence-based programming to aid in childhood obesity prevention during this particular window of risk. NCT02908230/09-19-2016.

  8. Information technology equipment cooling method

    DOEpatents

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  9. Theory-based explanation as intervention.

    PubMed

    Weisman, Kara; Markman, Ellen M

    2017-01-17

Cogent explanations are an indispensable means of providing new information and an essential component of effective education. Beyond this, we argue that there is tremendous untapped potential in using explanations to motivate behavior change. In this article we focus on health interventions. We review four case studies that used carefully tailored explanations to address gaps and misconceptions in people's intuitive theories, providing participants with a conceptual framework for understanding how and why some recommended behavior is an effective way of achieving a health goal. These case studies targeted a variety of health-promoting behaviors: (1) children washing their hands to prevent viral epidemics; (2) parents vaccinating their children to stem the resurgence of infectious diseases; (3) adults completing the full course of an antibiotic prescription to reduce antibiotic resistance; and (4) children eating a variety of healthy foods to improve unhealthy diets. Simply telling people to engage in these behaviors has been largely ineffective; if anything, concern about these issues is mounting. But in each case, teaching participants coherent explanatory frameworks for understanding health recommendations has shown great promise, with such theory-based explanations outperforming state-of-the-art interventions from national health authorities. We contrast theory-based explanations both with simply listing facts, information, and advice and with providing a full-blown educational curriculum, and argue for providing the minimum amount of information required to understand the causal link between a target behavior and a health outcome. We argue that such theory-based explanations lend people the motivation and confidence to act on their new understanding.

  10. EFFECTIVENESS OF INFORMATION RETRIEVAL METHODS.

    ERIC Educational Resources Information Center

    SWETS, JOHN A.

    RESULTS OF FIFTY DIFFERENT RETRIEVAL METHODS AS APPLIED IN THREE EXPERIMENTAL RETRIEVAL SYSTEMS WERE SUBJECTED TO AN ANALYSIS SUGGESTED BY STATISTICAL DECISION THEORY. THE ANALYSIS USES A PREVIOUSLY-PROPOSED MEASURE OF EFFECTIVENESS AND DEMONSTRATES ITS SEVERAL PROPERTIES. SOME OF THESE PROPERTIES ARE--(1) IT ENABLES THE RETRIEVAL SYSTEM TO…

  11. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

Like other areas of health research, public health has seen increasing use of qualitative methods to study problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  12. Levels of Reconstruction as Complementarity in Mixed Methods Research: A Social Theory-Based Conceptual Framework for Integrating Qualitative and Quantitative Research

    PubMed Central

    Carroll, Linda J.; Rothe, J. Peter

    2010-01-01

Like other areas of health research, public health has seen increasing use of qualitative methods to study problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson’s metaphysical work on the ‘ways of knowing’. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions. PMID:20948937

  13. Probability theory-based SNP association study method for identifying susceptibility loci and genetic disease models in human case-control data.

    PubMed

    Yuan, Xiguo; Zhang, Junying; Wang, Yue

    2010-12-01

One of the most challenging points in studying human common complex diseases is to search for both strong and weak susceptibility single-nucleotide polymorphisms (SNPs) and to identify forms of genetic disease models. A number of methods have been proposed for this purpose, but many have not been validated on diverse genome datasets, so their abilities in real practice are unclear. In this paper, we present a novel SNP association study method based on probability theory, called ProbSNP. The method first detects SNPs by evaluating their joint probabilities in combination with disease status, selecting those with the lowest joint probabilities as susceptibility candidates, and then identifies forms of genetic disease models by testing multiple-locus interactions among the selected SNPs. The joint probabilities of combined SNPs are estimated by establishing Gaussian probability density functions whose parameters (mean and standard deviation) are evaluated from allele and haplotype frequencies. Finally, we test and validate the method on various genome datasets. ProbSNP shows remarkable success in applications to both simulated genome data and real genome-wide data.
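    A loose illustration of the underlying idea (not the published ProbSNP algorithm): score each SNP by how improbable its case allele frequency is under a Gaussian approximation to the control-frequency sampling distribution, and flag the least probable SNPs as candidate susceptibility loci. The genotypes, effect sizes, and sample sizes below are all simulated assumptions.

    ```python
    import numpy as np

    # Simulate genotypes (0/1/2 minor-allele counts) with three susceptibility
    # SNPs planted at indices 0-2 via a case-frequency shift.
    rng = np.random.default_rng(4)
    n_snps, n_cases, n_controls = 100, 1000, 1000
    maf = rng.uniform(0.1, 0.5, n_snps)           # control minor-allele frequencies
    effect = np.zeros(n_snps)
    effect[:3] = 0.15                             # frequency shift in cases

    cases = rng.binomial(2, np.clip(maf + effect, 0, 1), (n_cases, n_snps))
    controls = rng.binomial(2, maf, (n_controls, n_snps))

    f_case = cases.mean(axis=0) / 2               # estimated allele frequencies
    f_ctrl = controls.mean(axis=0) / 2
    # Gaussian approximation: sd of a frequency estimated from 2N alleles.
    sd = np.sqrt(f_ctrl * (1 - f_ctrl) / (2 * n_cases))
    z = (f_case - f_ctrl) / sd                    # large |z| <=> low joint probability

    top = np.argsort(-np.abs(z))[:3]
    print(sorted(int(i) for i in top))            # [0, 1, 2]: planted SNPs recovered
    ```

    ProbSNP additionally models haplotype frequencies and tests multi-locus interactions among the flagged SNPs, which this single-locus sketch omits.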

  14. High-resolution wave-theory-based ultrasound reflection imaging using the split-step Fourier and globally optimized Fourier finite-difference methods

    DOEpatents

    Huang, Lianjie

    2013-10-29

    Methods for enhancing ultrasonic reflection imaging are taught utilizing a split-step Fourier propagator in which the reconstruction is based on recursive inward continuation of ultrasonic wavefields in the frequency-space and frequency-wave number domains. The inward continuation within each extrapolation interval consists of two steps. In the first step, a phase-shift term is applied to the data in the frequency-wave number domain for propagation in a reference medium. The second step consists of applying another phase-shift term to data in the frequency-space domain to approximately compensate for ultrasonic scattering effects of heterogeneities within the tissue being imaged (e.g., breast tissue). Results from various data input to the method indicate significant improvements are provided in both image quality and resolution.
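
    The two-step inward continuation described above (a phase shift in the frequency-wavenumber domain for a reference medium, then a phase correction in the frequency-space domain for heterogeneities) can be sketched for one extrapolation interval. This is a generic split-step Fourier step under simplifying assumptions, not the patented implementation:

```python
import numpy as np

def split_step_extrapolate(wavefield, dx, dz, freq, c_ref, c_model):
    """One inward-continuation step of a split-step Fourier propagator.

    wavefield : complex 1-D array, monochromatic field at depth z
    c_ref     : reference sound speed (scalar)
    c_model   : local sound speeds at this depth (array like wavefield)
    """
    omega = 2.0 * np.pi * freq
    kx = 2.0 * np.pi * np.fft.fftfreq(wavefield.size, d=dx)
    # Step 1: phase shift in the frequency-wavenumber domain (reference medium)
    kz2 = (omega / c_ref) ** 2 - kx ** 2
    kz = np.sqrt(np.maximum(kz2, 0.0).astype(complex))
    field_x = np.fft.ifft(np.fft.fft(wavefield) * np.exp(1j * kz * dz))
    # Step 2: phase correction in the frequency-space domain to compensate
    # approximately for scattering by heterogeneities (slowness perturbation)
    correction = np.exp(1j * omega * (1.0 / c_model - 1.0 / c_ref) * dz)
    return field_x * correction
```

Imaging would apply this recursively over depth intervals for each frequency; evanescent components are simply suppressed here for brevity.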

  15. Unrestricted density functional theory based on the fragment molecular orbital method for the ground and excited state calculations of large systems

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Yokojima, Satoshi; Kitaura, Kazuo; Sakurai, Minoru; Nakamura, Shinichiro

    2014-04-14

    We extended the fragment molecular orbital (FMO) method interfaced with density functional theory (DFT) into spin unrestricted formalism (UDFT) and developed energy gradients for the ground state and single point excited state energies based on time-dependent DFT. The accuracy of FMO is evaluated in comparison to the full calculations without fragmentation. Electronic excitations in solvated organic radicals and in the blue copper protein, plastocyanin (PDB code: 1BXV), are reported. The contributions of solvent molecules to the electronic excitations are analyzed in terms of the fragment polarization and quantum effects such as interfragment charge transfer.

  16. Benchmarking Density Functional Theory Based Methods To Model NiOOH Material Properties: Hubbard and van der Waals Corrections vs Hybrid Functionals.

    PubMed

    Zaffran, Jeremie; Caspary Toroker, Maytal

    2016-08-09

    NiOOH has recently been used to catalyze water oxidation by way of electrochemical water splitting. Few experimental data are available to rationalize the successful catalytic capability of NiOOH, so theory has a distinctive role in studying its properties. However, the unique layered structure of NiOOH is associated with the presence of essential dispersion forces within the lattice. Hence, the choice of an appropriate exchange-correlation functional within Density Functional Theory (DFT) is not straightforward. In this work, we show that standard DFT is sufficient to evaluate the geometry, but DFT+U and hybrid functionals are required to calculate the oxidation states. Notably, the benefit of DFT with van der Waals correction is marginal. Furthermore, only hybrid functionals succeed in opening a bandgap, and such methods are necessary to study the NiOOH electronic structure. We expect this work to give guidelines to theoreticians dealing with this material and to present a rational approach to choosing the DFT method of calculation.

  17. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: higher order theory based on the Bethe-Peierls and path probability method approximations.

    PubMed

    Edison, John R; Monson, Peter A

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  18. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  19. Discovering the Optimal Route for Alane Synthesis on Ti doped Al Surfaces Using Density Functional Theory Based Kinetic Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Karim, Altaf; Muckerman, James T.

    2010-03-01

    Issues such as the catalytic dissociation of hydrogen and the mobility of alane species on Ti-doped Al surfaces are major challenges in the synthesis of aluminum hydride. Our recently developed modeling framework (DFT-based KMC simulation) enabled us to study the steady-state conditions of dissociative adsorption of hydrogen, its diffusion, and its reaction with Al adatoms leading to the formation of alane species on Ti-doped Al surfaces. Our studies show that doping Ti atoms into the top layer of Al surfaces significantly reduces the mobility of alane species, whereas doping Ti atoms beneath the top layer enhances it. The arrangement of dopant Ti atoms in different layers affects not only the diffusion barriers of alane species but also the hydrogen dissociation barriers when Ti-Ti pairs are arranged in different ways in the top layer. Using our theoretical methods, we identified several configurations of dopant Ti atoms having lower barriers for alane diffusion and hydrogen dissociation, and we determined the optimal values of Ti concentration, temperature, and pressure under which the rate of alane formation is maximized.
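
    In a DFT-based KMC framework of this kind, DFT barriers are converted to event rates and a stochastic clock advances between events. The generic rejection-free KMC step (the standard BKL/Gillespie scheme, not the authors' specific lattice model) can be sketched as:

```python
import math
import random

def kmc_step(rates, rng=random):
    """One step of a rejection-free kinetic Monte Carlo algorithm.

    rates : rate constants for all currently possible events (e.g. hydrogen
            dissociation, alane diffusion), typically computed from DFT
            barriers via k = nu * exp(-E_barrier / (kB * T)).
    Returns (chosen_event_index, time_increment).
    """
    total = sum(rates)
    # Pick an event with probability proportional to its rate
    target = rng.random() * total
    cumulative = 0.0
    for i, rate in enumerate(rates):
        cumulative += rate
        if cumulative >= target:
            chosen = i
            break
    # Advance the clock by an exponentially distributed waiting time
    dt = -math.log(rng.random()) / total
    return chosen, dt
```

A simulation loop would execute the chosen event, update the rate catalogue for the new configuration, and repeat until steady-state quantities (such as the alane formation rate) converge.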

  20. Theory-based interventions for contraception.

    PubMed

    Lopez, Laureen M; Grey, Thomas W; Chen, Mario; Tolley, Elizabeth E; Stockton, Laurie L

    2016-11-23

    The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, many educational interventions addressing contraception have no explicit theoretical base. Our objective was to review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice and encourage or improve contraceptive use. To 1 November 2016, we searched for trials that tested a theory-based intervention for improving contraceptive use in PubMed, CENTRAL, POPLINE, Web of Science, ClinicalTrials.gov, and ICTRP. For the initial review, we wrote to investigators to find other trials. Included trials tested a theory-based intervention for improving contraceptive use. Interventions addressed the use of one or more methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy and contraceptive choice or use. We assessed titles and abstracts identified during the searches. One author extracted and entered the data into Review Manager; a second author verified accuracy. We examined studies for methodological quality. For unadjusted dichotomous outcomes, we calculated the Mantel-Haenszel odds ratio (OR) with 95% confidence interval (CI). Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size, so we present the results as reported by the investigators. We did not conduct meta-analysis due to varied interventions and outcome measures. We included 10 new trials for a total of 25. Five were conducted outside the USA. Fifteen trials randomly assigned individuals and 10 randomized clusters.
This section focuses on nine trials with high or
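
    For a single 2x2 table, the Mantel-Haenszel odds ratio used in the methods above reduces to the classic cross-product ratio. A minimal sketch with a Woolf-type 95% confidence interval (an illustration of the statistic, not the review's Review Manager computation):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and approximate 95% CI for one 2x2 table.

    a, b : events / non-events in the intervention arm
    c, d : events / non-events in the control arm
    """
    or_ = (a * d) / (b * c)                     # cross-product odds ratio
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's log-OR standard error
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)
```

With several strata or trials, the Mantel-Haenszel estimator would instead weight the per-table cross-products; this single-table form is the degenerate case.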

  1. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.
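
    The claimed pipeline (collect from mapped sources, convert to XML, search by query term, display results as a similarity tree) can be loosely sketched for its middle stages. The XML schema and function names below are illustrative inventions, not taken from the patent:

```python
import xml.etree.ElementTree as ET

def to_xml(doc_id, text):
    """Convert a collected record from its storage format into a simple
    XML document (hypothetical schema for illustration)."""
    root = ET.Element("document", id=str(doc_id))
    ET.SubElement(root, "body").text = text
    return ET.tostring(root, encoding="unicode")

def search(xml_docs, term):
    """Return the ids of XML documents whose body contains the query term."""
    hits = []
    for xml_doc in xml_docs:
        root = ET.fromstring(xml_doc)
        if term.lower() in (root.findtext("body") or "").lower():
            hits.append(root.get("id"))
    return hits
```

The patent's final stage would then cluster the hits by pairwise similarity and render them as linked nodes of a tree, which is omitted here.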

  2. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Elmore, Mark Thomas [Oak Ridge, TN; Reed, Joel Wesley [Knoxville, TN; Treadwell, Jim N; Samatova, Nagiza Faridovna [Oak Ridge, TN

    2008-01-01

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  3. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Elmore, Mark Thomas [Oak Ridge, TN; Reed, Joel Wesley [Knoxville, TN; Treadwell, Jim N [Louisville, TN; Samatova, Nagiza Faridovna [Oak Ridge, TN

    2008-01-01

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  4. Research on polarization imaging information parsing method

    NASA Astrophysics Data System (ADS)

    Yuan, Hongwu; Zhou, Pucheng; Wang, Xiaolong

    2016-11-01

    Polarization information parsing plays an important role in polarization imaging detection. This paper focuses on methods for parsing polarization information. First, the general process of polarization information parsing is outlined, mainly comprising polarization image preprocessing, calculation of multiple polarization parameters, polarization image fusion, and polarization image tracking. Research achievements for each stage are then presented. For preprocessing, a polarization image registration method based on maximum mutual information is designed; experiments show that it improves registration precision and satisfies the needs of polarization information parsing. For parameter calculation, an omnidirectional polarization inversion model is built, from which a variety of polarization parameter images are obtained with markedly improved inversion precision. For image fusion, an adaptive optimal fusion method for multiple polarization parameters is given using fuzzy integrals and sparse representation, and target detection in complex scenes is completed with a clustering-based image segmentation algorithm built on fractal characteristics. For tracking, a mean-shift (average displacement) polarization-feature-assisted particle filtering fusion tracking algorithm is put forward to achieve smooth tracking of moving targets. Finally, the polarization information parsing method is applied to the polarization imaging detection of typical targets such as camouflage targets, fog, and latent fingerprints.
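
    The "multiple polarization parameters" stage commonly starts from intensity images taken through analyzers at 0, 45, 90, and 135 degrees. The textbook Stokes-based formulation is sketched below; it is a standard parameterization, not necessarily the paper's omnidirectional inversion model:

```python
import numpy as np

def polarization_parameters(i0, i45, i90, i135):
    """Compute standard linear-polarization parameter images from four
    analyzer orientations. Returns (S0, DoLP, AoP)."""
    s0 = i0 + i90                        # total intensity (Stokes S0)
    s1 = i0 - i90                        # Stokes S1
    s2 = i45 - i135                      # Stokes S2
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)  # degree of linear pol.
    aop = 0.5 * np.arctan2(s2, s1)       # angle of polarization
    return s0, dolp, aop
```

Downstream fusion and tracking stages would then operate on the S0, DoLP, and AoP images rather than on raw intensities.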

  5. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  6. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  7. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  8. Research Investigation of Information Access Methods

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  9. Research Investigation of Information Access Methods

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  10. Switching theory-based steganographic system for JPEG images

    NASA Astrophysics Data System (ADS)

    Cherukuri, Ravindranath C.; Agaian, Sos S.

    2007-04-01

    Cellular communications constitute a significant portion of the global telecommunications market, so the need for secure communication over mobile platforms has increased exponentially. Steganography, the art of hiding critical data in an innocuous signal, answers this need. JPEG is one of the most commonly used formats for storing and transmitting images on the web, and pictures captured with mobile cameras are mostly in JPEG format. In this article, we introduce a switching theory based steganographic system for JPEG images that is applicable to both mobile and computer platforms. The proposed algorithm exploits the fact that the energy distribution among the quantized AC coefficients varies from block to block and coefficient to coefficient. Existing approaches are effective with a subset of these coefficients but show their ineffectiveness when employed over all of them. We therefore propose an approach that treats each set of AC coefficients within a different framework, enhancing the performance of the method. The proposed system offers high capacity and embedding efficiency simultaneously while withstanding simple statistical attacks. In addition, the embedded information can be retrieved without prior knowledge of the cover image. Based on simulation results, the proposed method demonstrates an improved embedding capacity over existing algorithms while maintaining a high embedding efficiency and preserving the statistics of the JPEG image after hiding information.
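
    As a baseline for what embedding in quantized AC coefficients looks like, here is a common LSB-style scheme on nonzero coefficients. This is a simple reference point, not the article's switching theory based method, which adapts its framework per coefficient set:

```python
def embed_bits(ac_coeffs, bits):
    """Embed message bits into the LSBs of nonzero quantized AC coefficients.
    Zeros are skipped to preserve JPEG run-length statistics."""
    out, bit_iter = list(ac_coeffs), iter(bits)
    for i, coeff in enumerate(out):
        if coeff == 0:
            continue
        try:
            bit = next(bit_iter)
        except StopIteration:
            break  # message fully embedded
        sign = 1 if coeff > 0 else -1
        mag = abs(coeff)
        # Note: a magnitude-1 coefficient could collapse to 0 here;
        # practical schemes handle that case to keep extraction in sync.
        out[i] = sign * ((mag & ~1) | bit)
    return out

def extract_bits(ac_coeffs, n_bits):
    """Recover bits from nonzero coefficients, without the cover image."""
    bits = [abs(c) & 1 for c in ac_coeffs if c != 0]
    return bits[:n_bits]
```

Blind extraction (no cover image needed), as claimed in the abstract, works here because the decoder only needs the same zero-skipping rule as the encoder.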

  11. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Ingber, Donald E. (Inventor); Huang, Sui (Inventor); Eichler, Gabriel (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.

  12. Communication style and exercise compliance in physiotherapy (CONNECT). A cluster randomized controlled trial to test a theory-based intervention to increase chronic low back pain patients’ adherence to physiotherapists’ recommendations: study rationale, design, and methods

    PubMed Central

    2012-01-01

    if their physiotherapist has received the communication skills training. Outcome assessors will also be blinded. We will use linear mixed modeling to test between arm differences both in the mean levels and the rates of change of the outcome variables. We will employ structural equation modeling to examine the process of change, including hypothesized mediation effects. Discussion This trial will be the first to test the effect of a self-determination theory-based communication skills training program for physiotherapists on their low back pain patients’ adherence to rehabilitation recommendations. Trial Registration Current Controlled Trials ISRCTN63723433 PMID:22703639

  13. A Method for Analyzing Volunteered Geographic Information ...

    EPA Pesticide Factsheets

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions. These dimensions include the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and finally, the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially-explicit data to be used in multiple settings, including the City of Duluth’s Comprehensive Planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA’s Office of Research and Development. This poster will demonstrate a method for translating values expressed in social media photos into ecosystem services and spatially

  14. A Method for Analyzing Volunteered Geographic Information ...

    EPA Pesticide Factsheets

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions. These dimensions include the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and finally, the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially-explicit data to be used in multiple settings, including the City of Duluth’s Comprehensive Planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA’s Office of Research and Development. This poster will demonstrate a method for translating values expressed in social media photos into ecosystem services and spatially

  15. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  16. [Information method of EEG analysis in anesthesiology].

    PubMed

    Likhvantsev, V V; Subbotin, V V; Petrov, O V; Sitnikov, A V; Kazanikova, A N; Zhuravel', S V

    2003-01-01

    An information-based concept of nociceptive impulses is proposed. A device for quantitative analysis of the information arriving in the CNS has been constructed and tested. It is demonstrated that monitoring the information load of the EEG may be used to evaluate the adequacy of anesthesia. Values of this index between 40 and 50% correspond to effective protection.

  17. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  18. Advanced Feedback Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1985-01-01

    In this study, automatic feedback techniques are applied to Boolean query statements in online information retrieval to generate improved query statements based on information contained in previously retrieved documents. Feedback operations are carried out using conventional Boolean logic and extended logic. Experimental output is included to…
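
    The cited study applies feedback to Boolean query statements, but the feedback principle is most often illustrated with the vector-space Rocchio formula from Salton's broader line of work. The sketch below is that well-known Rocchio formulation, shown only as an illustration of reweighting a query from previously retrieved documents, not the Boolean/extended-logic method of this study:

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio relevance feedback: move the query toward the centroid of
    relevant documents and away from the nonrelevant centroid.
    All vectors are dicts mapping term -> weight."""
    terms = set(query)
    for doc in relevant + nonrelevant:
        terms |= set(doc)
    new_query = {}
    for t in terms:
        pos = sum(d.get(t, 0.0) for d in relevant) / max(len(relevant), 1)
        neg = sum(d.get(t, 0.0) for d in nonrelevant) / max(len(nonrelevant), 1)
        w = alpha * query.get(t, 0.0) + beta * pos - gamma * neg
        new_query[t] = max(w, 0.0)  # negative weights are commonly clipped
    return new_query
```

Terms that appear in relevant documents gain weight in the revised query, which is the effect the Boolean feedback operations in the study aim for via query restructuring.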

  19. Protection method for an optical information carrier

    NASA Astrophysics Data System (ADS)

    Pitsyuga, Vitaly V.; Kolesnikov, Michael Y.; Kosyak, Igor V.

    1997-02-01

    Protecting information on personal carriers (for example, cards) from unauthorized access (UA) is now a very important problem, in connection with the wide introduction of automatic information-processing systems in different spheres of human activity: financial, medical, and information services, access to restricted units, and so on. It is proposed to use the physical parameters of a special coating region (a so-called restricted zone) for information protection on optical carriers (laser cards). The restricted zone is formed on the surface of the recording coating of a laser card. The unique information about each laser card, used to create a protective passport against UA, is obtained by reading out the defect parameters.

  20. Information Work Analysis: An Approach to Research on Information Interactions and Information Behaviour in Context

    ERIC Educational Resources Information Center

    Huvila, Isto

    2008-01-01

    Introduction: A work roles and role theory-based approach to conceptualise human information activity, denoted information work analysis is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…

  1. Governance Methods Used in Externalizing Information Technology

    ERIC Educational Resources Information Center

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  2. Methods of Eliciting Information from Experts

    DTIC Science & Technology

    1987-10-01

    Itzhak Perlman) or a concertmeister in an orchestra, or simply one of its violinists. Differences in amount of expertise may supply different...would lead to becoming a world class violinist. Underlying Assumptions In attempting to elicit information from experts, one makes a number of

  3. Governance Methods Used in Externalizing Information Technology

    ERIC Educational Resources Information Center

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  4. Vienna development method: An informal production

    SciTech Connect

    Petrenko, A.K.

    1992-09-01

    This article presents a brief description of the Vienna Development Method (VDM). The article contains the fundamental notions, methodological concepts of VDM, and simple examples. The description of the development method covers the phases of stepwise specification and programming. 2 refs., 2 figs.

  5. On Methods for Higher Order Information Fusion

    DTIC Science & Technology

    2005-02-01

    situation in which our information about reasonableness is pointed and captured by a fuzzy subset, a mapping R: X → T, and thus has a possibilistic nature...grading of this idea. If B is a fuzzy subset, then one meaning of the proposition V1 is B is that for any x ∈ X1, B(x) indicates the possibility that V1...of uncertainties.

  6. Cable Television: A Method for Delivering Information.

    ERIC Educational Resources Information Center

    Nebraska Univ., Lincoln. Cooperative Extension Service.

    This report presents the recommendations of a committee that was formed to explore the possibility of using cable television networks as a method of delivering extension education programs to urban audiences. After developing and testing a pilot project that used cable television as a mode to disseminate horticulture and 4-H leader training…

  7. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multidimensional scale to measure twelve-step program affiliation. The process included interviewing fourteen addicted persons while in twelve-step focused treatment about specific pros (things they like or would miss out on by not being…

  8. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  10. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System, Vol. 7, revised 2010-10-01. Section 2905.101, Methods of disseminating information (Department of Labor; Acquisition Planning; Publicizing Contract Actions; Dissemination of Information). Contracting officers...

  11. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    ERIC Educational Resources Information Center

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  13. A Systematic Review of Rural, Theory-based Physical Activity Interventions.

    PubMed

    Walsh, Shana M; Meyer, M Renée Umstattd; Gamble, Abigail; Patterson, Megan S; Moore, Justin B

    2017-05-01

    This systematic review synthesized the scientific literature on theory-based physical activity (PA) interventions in rural populations. PubMed, PsycINFO, and Web of Science databases were searched to identify studies with a rural study sample, PA as a primary outcome, use of a behavioral theory or model, randomized or quasi-experimental research design, and application at the primary and/or secondary level of prevention. Thirty-one studies met our inclusion criteria. The Social Cognitive Theory (N = 14) and Transtheoretical Model (N = 10) were the most frequently identified theories; however, most intervention studies were informed by theory but lacked higher-level theoretical application and testing. Interventions largely took place in schools (N = 10) and with female-only samples (N = 8). Findings demonstrated that theory-based PA interventions are mostly successful at increasing PA in rural populations but require improvement. Future studies should incorporate higher levels of theoretical application, and should explore adapting or developing rural-specific theories. Study designs should employ more rigorous research methods to decrease bias and increase validity of findings. Follow-up assessments to determine behavioral maintenance and/or intervention sustainability are warranted. Finally, funding agencies and journals are encouraged to adopt rural-urban commuting area codes as the standard for defining rural.

  14. Communication style and exercise compliance in physiotherapy (CONNECT): a cluster randomized controlled trial to test a theory-based intervention to increase chronic low back pain patients' adherence to physiotherapists' recommendations: study rationale, design, and methods.

    PubMed

    Lonsdale, Chris; Hall, Amanda M; Williams, Geoffrey C; McDonough, Suzanne M; Ntoumanis, Nikos; Murray, Aileen; Hurley, Deirdre A

    2012-06-15

    …received the communication skills training. Outcome assessors will also be blinded. We will use linear mixed modeling to test between-arm differences in both the mean levels and the rates of change of the outcome variables. We will employ structural equation modeling to examine the process of change, including hypothesized mediation effects. This trial will be the first to test the effect of a self-determination theory-based communication skills training program for physiotherapists on their low back pain patients' adherence to rehabilitation recommendations.

  15. Versatile Formal Methods Applied to Quantum Information.

    SciTech Connect

    Witzel, Wayne; Rudinger, Kenneth Michael; Sarovar, Mohan

    2015-11-01

    Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users the ability to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach, which allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.

  16. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website

    PubMed Central

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O’Riordan, Tim; White, Peter; Yardley, Lucy

    2016-01-01

    Background: According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. Objective: We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Methods: Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative ‘think aloud’ study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. Results: The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients’ stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants’ experiences of using the website. Conclusions: We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials…

  17. Application of geo-information science methods in ecotourism exploitation

    NASA Astrophysics Data System (ADS)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    Application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resources reconnaissance, data management, environment monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data. The application of 3S methods is helpful to sustainable development in tourism. Various assignments are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, and handling of mass data, as well as tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  18. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  20. Axiomatic Evaluation Method and Content Structure for Information Appliances

    ERIC Educational Resources Information Center

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information is needed to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  2. Database design using NIAM (Nijssen Information Analysis Method) modeling

    SciTech Connect

    Stevens, N.H.

    1989-01-01

    The Nijssen Information Analysis Method (NIAM) is an information modeling technique based on semantics and founded in set theory. A NIAM information model is a graphical representation of the information requirements for some universe of discourse. Information models facilitate data integration and communication within an organization about data semantics. An information model is sometimes referred to as the semantic model or the conceptual schema. It helps in the logical and physical design and implementation of databases. NIAM information modeling is used at Sandia National Laboratories to design and implement relational databases containing engineering information which meet the users' information requirements. The paper focuses on the design of one database which satisfied the data needs of four disjoint but closely related applications. The applications as they existed before did not talk to each other even though they stored much of the same data redundantly. NIAM was used to determine the information requirements and design the integrated database. 6 refs., 7 figs.
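    The fact-based modelling idea described in this record can be sketched in miniature: each elementary fact type in the model becomes a narrow relational table that several applications share instead of storing the data redundantly. This is a toy illustration only, not Sandia's actual schema; the table and column names are invented.

```python
import sqlite3

# Hypothetical schema: each table stores one elementary fact type,
# so otherwise disjoint applications can share it without redundant copies.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE part      (part_id TEXT PRIMARY KEY, description TEXT NOT NULL);
    CREATE TABLE part_mass (part_id TEXT PRIMARY KEY REFERENCES part, mass_kg REAL NOT NULL);
""")
con.execute("INSERT INTO part VALUES ('P-100', 'mounting bracket')")
con.execute("INSERT INTO part_mass VALUES ('P-100', 0.42)")

# One application reads descriptions, another reads masses; both join on part_id.
row = con.execute(
    "SELECT p.description, m.mass_kg "
    "FROM part p JOIN part_mass m USING (part_id)"
).fetchone()
print(row)  # ('mounting bracket', 0.42)
```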

  3. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  4. 19 CFR 201.9 - Methods employed in obtaining information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties, Vol. 3, revised 2010-04-01. Section 201.9... APPLICATION, Initiation and Conduct of Investigations, § 201.9 Methods employed in obtaining information. In... agencies of the Government, through questionnaires and correspondence, through field work by members of...

  5. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection

    PubMed Central

    Aas, I. H. Monrad

    2014-01-01

    Introduction: Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. Methods: A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Results: Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview – unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants – as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Conclusions: Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these

  6. Remarks on the information entropy maximization method and extended thermodynamics

    NASA Astrophysics Data System (ADS)

    Eu, Byung Chan

    1998-04-01

    The information entropy maximization method was applied by Jou et al. [J. Phys. A 17, 2799 (1984)] to heat conduction in the past. Advancing this method one step further, Nettleton [J. Chem. Phys. 106, 10311 (1997)] combined it with a projection operator technique to derive a set of evolution equations for macroscopic variables from the Liouville equation for a simple liquid, and claimed that the method provides a statistical-mechanical basis for a theory of irreversible processes and, in particular, of extended thermodynamics that is consistent with the laws of thermodynamics. This line of development of the information entropy maximization method is analyzed from the viewpoint of the laws of thermodynamics in the present paper.
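    The entropy maximization underlying this record can be made concrete with a small numerical sketch (purely illustrative; the "energy levels" and target mean below are invented, not from the paper): maximizing Shannon entropy over a finite set of states subject to a fixed mean yields the Lagrange-multiplier solution p_i ∝ exp(-λ e_i), with λ fixed by the mean constraint.

```python
import math

def maxent_distribution(levels, mean_target, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `levels` with a prescribed mean.
    The Lagrange-multiplier solution is p_i proportional to exp(-lam * e_i);
    lam is found by bisection so that sum_i p_i * e_i = mean_target."""
    def mean_for(lam):
        w = [math.exp(-lam * e) for e in levels]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, levels)) / z
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean_target:  # the mean decreases as lam grows
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * e) for e in levels]
    z = sum(w)
    return [wi / z for wi in w]

# Four equally spaced states with the mean pinned at the midpoint:
# the entropy-maximizing distribution is then the uniform one (lam = 0).
p = maxent_distribution([0.0, 1.0, 2.0, 3.0], mean_target=1.5)
print([round(pi, 6) for pi in p])  # [0.25, 0.25, 0.25, 0.25]
```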

  7. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

    2014-09-01

    Due to the sparse multipath property of the channel, a channel estimation method based on a partial superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves the spectral efficiency but also reduces the influence on information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory-based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
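    The sparse reconstruction step of such estimators can be sketched with orthogonal matching pursuit, one common greedy recovery algorithm. This is a generic illustration of recovering a sparse channel from linear observations y ≈ A·h, not the authors' specific estimator; the matrix size and tap positions are invented.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse h from y ≈ A @ h."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Greedily pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.conj().T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the chosen support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    h = np.zeros(n, dtype=A.dtype)
    h[support] = coef
    return h

# Toy sparse "channel": 3 nonzero taps out of 128, observed through
# 64 random linear measurements (noiseless, for illustration).
rng = np.random.default_rng(0)
h_true = np.zeros(128)
h_true[[5, 37, 90]] = [1.0, -0.9, 0.8]
A = rng.standard_normal((64, 128)) / np.sqrt(64)
y = A @ h_true
h_est = omp(A, y, k=3)
print(sorted(np.flatnonzero(h_est).tolist()))  # indices of the recovered taps
```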

  8. Discourse and Practice in Information Literacy and Information Seeking: Gaps and Opportunities

    ERIC Educational Resources Information Center

    Julien, H.; Williamson, K.

    2010-01-01

    Introduction: This paper argues for increased research consideration of the conceptual overlap between information seeking and information literacy, and for scholarly attention to theory-based empirical research that has potential value to practitioners. Method: The paper reviews information seeking and information literacy research, and…

  9. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection.

    PubMed

    I H, Monrad Aas

    2014-11-01

    Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview - unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants - as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these influences is fundamental. 
Further development of…

  10. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronics products is centered on information processing in knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and of its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for the expression of product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
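    The function/structure mapping described above can be sketched as a small XML document parsed with the standard library. The element and attribute names here are invented for illustration; they are not the paper's schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: function elements, structure elements, and the
# mapping between them (names are illustrative, not from the paper).
doc = """
<product name="friction-roller">
  <functions>
    <function id="F1">transmit torque</function>
  </functions>
  <structures>
    <structure id="S1">roller pair</structure>
  </structures>
  <mappings>
    <map function="F1" structure="S1"/>
  </mappings>
</product>
"""

root = ET.fromstring(doc)
functions = {f.get("id"): f.text for f in root.iter("function")}
structures = {s.get("id"): s.text for s in root.iter("structure")}
for m in root.iter("map"):
    # Resolve each mapping to the function and structure it connects.
    print(f'{functions[m.get("function")]} -> {structures[m.get("structure")]}')
```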

  12. Opinion: Clarifying Two Controversies about Information Mapping's Method.

    ERIC Educational Resources Information Center

    Horn, Robert E.

    1992-01-01

    Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…

  13. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  15. Information theory in living systems, methods, applications, and challenges.

    PubMed

    Gatenby, Robert A; Frieden, B Roy

    2007-02-01

    Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium. This is despite constant buffeting by thermodynamic forces that, if unopposed, will inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information that permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow and utilization is critical for understanding first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatic and dynamical systems to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher-, Shannon-, and Kullback-Leibler informations are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analysis of the fitness contributions from specific segments of genetic information, as well as of the optimization process in which fitness is constrained by the substrate cost for its storage and utilization. Recent IT applications have recognized the possible role of nontraditional information storage structures including lipids and ion gradients as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease.
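    The Shannon-style measurement of information content in monomer strings mentioned above can be illustrated with a minimal sketch (the sequences below are invented examples): the per-symbol entropy is estimated from symbol frequencies.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Per-symbol Shannon entropy H = sum_i -p_i * log2(p_i), in bits."""
    counts = Counter(seq)
    n = len(seq)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("ACGT" * 10))  # uniform over 4 symbols: 2.0 bits
print(shannon_entropy("A" * 40))     # a constant string carries 0.0 bits
```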

  16. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems

    NASA Astrophysics Data System (ADS)

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-01

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  17. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems.

    PubMed

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-14

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  18. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems

    SciTech Connect

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-14

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  19. A Method of Integrated Description of Design Information for Reusability

    NASA Astrophysics Data System (ADS)

    Tsumaya, Akira; Nagae, Masao; Wakamatsu, Hidefumi; Shirase, Keiichi; Arai, Eiji

Much of product design is executed concurrently these days. Such concurrent design needs a method that allows designers to share and reuse various kinds of design information. However, complete understanding of design information among designers has been a difficult issue. In this paper, a design process model that makes use of designers’ intention is proposed. A method to combine the design process information and the design object information is also proposed. We introduce how to describe designers’ intention by providing some databases. The Keyword Database consists of ontological data related to design objects/activities. Designers select suitable keyword(s) from the Keyword Database and explain the reasons/ideas for their design activities by descriptions that use those keyword(s). We also developed an integrated design information management system architecture using this method of integrated description with designers’ intention. The system realizes connections between information related to the design process and information related to the design object through designers’ intention. Designers can thereby communicate with each other to understand how others make decisions in design. Designers can also re-use both design process information and design object information through the database management sub-system.

  20. A New Method for Conceptual Modelling of Information Systems

    NASA Astrophysics Data System (ADS)

    Gustas, Remigijus; Gustiene, Prima

Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, conventional information system analysis and design methods cover only a part of the modelling notations required for the engineering of service architectures. They do not provide effective support to maintain semantic integrity between business processes and data. Service orientation is a paradigm that can be applied for conceptual modelling of information systems. The concept of service is rather well understood in different domains, and it can be applied equally well for conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. A service-oriented method is used for semantic integration of the static and dynamic aspects of an information system.

  1. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  2. Interface methods for using intranet portal organizational memory information system.

    PubMed

    Ji, Yong Gu; Salvendy, Gavriel

    2004-12-01

In this paper, an intranet portal is considered as an information infrastructure (organizational memory information system, OMIS) supporting organizational learning. The properties and the hierarchical structure of information and knowledge in an intranet portal OMIS were identified as a problem for navigation tools of an intranet portal interface. The problem relates to navigation and retrieval functions of intranet portal OMIS and is expected to adversely affect user performance, satisfaction, and usefulness. To solve the problem, a conceptual model for navigation tools of an intranet portal interface was proposed and an experiment using a crossover design was conducted with 10 participants. In the experiment, a separate access method (tabbed tree tool) was compared to a unified access method (single tree tool). The results indicate that each information/knowledge repository for which a user has a different structural knowledge should be handled separately with a separate access to increase user satisfaction and the usefulness of the OMIS and to improve user performance in navigation.

  3. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  4. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  5. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  6. Adaptive windowed range-constrained Otsu method using local information

    NASA Astrophysics Data System (ADS)

    Zheng, Jia; Zhang, Dinghua; Huang, Kuidong; Sun, Yuanxi; Tang, Shaojie

    2016-01-01

An adaptive windowed range-constrained Otsu method using local information is proposed for improving the performance of image segmentation. First, the reason why traditional thresholding methods do not perform well in the segmentation of complicated images is analyzed, and the influences of global and local thresholding on image segmentation are compared. Second, we propose two methods that adaptively change the size of the local window according to local information, and analyze their characteristics. Specifically, the number of edge pixels in the local window of the binarized variance image is employed to adaptively change the local window size. Finally, the superiority of the proposed method over other methods, such as the range-constrained Otsu, the active contour model, the double Otsu, Bradley's, and the distance-regularized level set evolution, is demonstrated. Experiments validate that the proposed method keeps more details and achieves a much more satisfactory area overlap measure than the other conventional methods.
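The record above builds on Otsu's between-class-variance criterion, applied inside adaptively sized local windows. The adaptive windowed variant itself is not reproduced here, but the underlying global Otsu step it extends can be sketched as follows (a minimal NumPy implementation; the two-Gaussian test "image" is illustrative, not from the paper):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Classic Otsu: pick the threshold that maximizes between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                     # weight of class 0 for each cut point
    w1 = 1.0 - w0                         # weight of class 1
    mu = np.cumsum(p * centers)           # cumulative first moment
    mu_t = mu[-1]                         # total mean
    # Between-class variance: (mu_t*w0 - mu)^2 / (w0*w1)
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

# Synthetic bimodal intensities: a dark and a bright population.
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 12, 5000)])
t = otsu_threshold(img)   # lands between the two modes
```

The windowed variant of the paper would apply this same criterion per local window, with the window size driven by the edge-pixel count of the binarized variance image.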

  7. Density functional Theory Based Generalized Effective Fragment Potential Method (Postprint)

    DTIC Science & Technology

    2014-07-01

[The abstract for this record is not recoverable: the text is residue from a table of interaction energies for benchmark dimers, including (C6H6)2 stack, (Pyrazine)2, (Uracil)2 stack, Indole·benzene stack, and Adenine·Thymine stack. The method described uses localized orbitals and the D3 dispersion correction, with polarization terms reported for formic acid, uracil, 2-pyridoxine, 2-aminopyridine, thymine, indole, C6H6, and pyrazine.]

  8. Application of Mutual Information Methods in Time-Distance Helioseismology

    NASA Astrophysics Data System (ADS)

    Keys, Dustin; Kholikov, Shukur; Pevtsov, Alexei A.

    2015-03-01

    We apply a new technique, the mutual information (MI) from information theory, to time-distance helioseismology, and demonstrate that it can successfully reproduce several classic results based on the widely used cross-covariance method. MI quantifies the deviation of two random variables from complete independence and represents a more general method for detecting dependencies in time series than the cross-covariance function, which only detects linear relationships. We briefly describe the MI-based technique and discuss the results of applying MI to derive the solar differential profile, a travel-time deviation map for a sunspot, and a time-distance diagram from quiet-Sun measurements.
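As the abstract notes, MI detects dependencies that the linear cross-covariance misses. A minimal histogram-based MI estimator illustrates the contrast (the quadratic toy series below is illustrative, not helioseismic data):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate MI (in nats) between two series via a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1)                 # marginal of x
    py = pxy.sum(axis=0)                 # marginal of y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 4000)
x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
y = x ** 2 + 0.1 * rng.standard_normal(t.size)   # nonlinear dependence on x

corr = abs(np.corrcoef(x, y)[0, 1])   # linear measure: close to zero
mi = mutual_information(x, y)         # MI: clearly positive
```

Because `y` depends on `x` only through an even function, the correlation is near zero while the MI is large, which is exactly the kind of dependence the cross-covariance method cannot see.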

  9. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and report results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between stock markets differs across time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas from the phylogenetic trees. The results show that satisfactory information can be extracted from financial markets by this method, which can be applied not only to physiologic time series but also to financial time series.
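The record does not specify the exact distance used, but the idea of comparing coarse-grained return distributions with an information distance can be sketched with the Jensen-Shannon distance on synthetic returns (all three series, the bin grid, and the volatilities are hypothetical):

```python
import numpy as np

def symbol_distribution(returns, bins):
    """Coarse-grain a return series into categories; return the category distribution."""
    counts, _ = np.histogram(returns, bins=bins)
    return counts / counts.sum()

def jensen_shannon(p, q):
    """Jensen-Shannon distance (base 2, bounded by 1) between two distributions."""
    m = 0.5 * (p + q)
    def kl(a, b):
        nz = a > 0
        return np.sum(a[nz] * np.log2(a[nz] / b[nz]))
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

rng = np.random.default_rng(1)
bins = np.linspace(-0.1, 0.1, 11)
mkt_a = rng.normal(0.0005, 0.010, 2000)   # stand-in for one index's daily returns
mkt_b = rng.normal(0.0005, 0.011, 2000)   # a statistically similar market
mkt_c = rng.normal(0.0, 0.030, 2000)      # a much more volatile market

d_close = jensen_shannon(symbol_distribution(mkt_a, bins), symbol_distribution(mkt_b, bins))
d_far = jensen_shannon(symbol_distribution(mkt_a, bins), symbol_distribution(mkt_c, bins))
```

A matrix of such pairwise distances is what a phylogenetic-tree clustering of markets, as described in the abstract, would be built from.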

  10. A theory-based approach to nursing shared governance.

    PubMed

    Joseph, M Lindell; Bogue, Richard J

    2016-01-01

The discipline of nursing uses a general definition of shared governance. The discipline's lack of a specified theory with precepts and propositions contributes to persistent barriers in progress toward building evidence-based knowledge through systematic study. The purposes of this article were to describe the development and elements of a program theory approach for nursing shared governance implementation and to recommend further testing. Five studies using multiple methods are described using a structured framework. The studies led to the use of Lipsey's method of theory development for program implementation to develop a theory for shared governance for nursing. Nine competencies were verified to define nursing practice council effectiveness. Other findings reveal that nurse empowerment results from alignment between the competencies of self-directed work teams and the competencies of organizational leaders. Implementation of GEMS theory-based nursing shared governance can advance goals at the individual, unit, department, and organization level. Advancing professional nursing practice requires that nursing concepts are systematically studied and then formalized for implementation. This article describes the development of a theoretical foundation for the systematic study and implementation of nursing shared governance. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.

  11. The value of value of information: best informing research design and prioritization using current methods.

    PubMed

    Eckermann, Simon; Karnon, Jon; Willan, Andrew R

    2010-01-01

    Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address. (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions 1-4 relative to their simplicity of use. Expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question 1, given that what EVPI needs to exceed varies with the cost of research design, which can vary from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate, as it can be large and further research not worthwhile or small and further research worthwhile. In contrast, each of questions 1-4 are shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize expected value of sample information (EVSI) minus expected costs across designs. In comparing complexity in use of VOI methods, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation and optimal design across jurisdictions. More complex VOI methods such as bootstrapping of the expected value of
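The EVPI "with current information" that the article calls simple to calculate can be sketched with Monte Carlo draws of incremental net benefit (the distribution and its parameters are hypothetical, chosen only to show the mechanics):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Hypothetical per-patient incremental net benefit (INB) of treatment B vs A.
inb = rng.normal(200, 1500, n)

# Value of the decision made with current information:
# adopt B only if its expected INB is positive.
value_current = max(float(np.mean(inb)), 0.0)

# Value with perfect information: choose the better option
# in every realisation of the uncertainty.
value_perfect = float(np.mean(np.maximum(inb, 0.0)))

# EVPI is the gap between the two.
evpi_per_patient = value_perfect - value_current
```

As the abstract argues, a large `evpi_per_patient` is neither necessary nor sufficient to justify research: the decision turns on comparing EVSI with the cost of each candidate design.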

  12. Integrating Informative Priors from Experimental Research with Bayesian Methods

    PubMed Central

    Hamra, Ghassan; Richardson, David; MacLehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology. PMID:23222512

  13. Reverse Engineering Cellular Networks with Information Theoretic Methods

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Banga, Julio R.

    2013-01-01

    Building mathematical models of cellular networks lies at the core of systems biology. It involves, among other tasks, the reconstruction of the structure of interactions between molecular components, which is known as network inference or reverse engineering. Information theory can help in the goal of extracting as much information as possible from the available data. A large number of methods founded on these concepts have been proposed in the literature, not only in biology journals, but in a wide range of areas. Their critical comparison is difficult due to the different focuses and the adoption of different terminologies. Here we attempt to review some of the existing information theoretic methodologies for network inference, and clarify their differences. While some of these methods have achieved notable success, many challenges remain, among which we can mention dealing with incomplete measurements, noisy data, counterintuitive behaviour emerging from nonlinear relations or feedback loops, and computational burden of dealing with large data sets. PMID:24709703
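One of the simplest information-theoretic inference schemes in this literature, the relevance network, keeps an edge between two components whenever their pairwise MI exceeds a threshold. A sketch on synthetic "expression" data (the regulator/target structure, noise levels, and threshold are illustrative, not from any cited method):

```python
import numpy as np
from itertools import combinations

def discrete_mi(x, y, bins=8):
    """Pairwise mutual information (nats) estimated from binned data."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(5)
n = 3000
g0 = rng.standard_normal(n)                             # regulator
g1 = np.tanh(2 * g0) + 0.2 * rng.standard_normal(n)     # nonlinear target of g0
g2 = rng.standard_normal(n)                             # unrelated component
data = [g0, g1, g2]

# Relevance-network step: keep edges whose MI exceeds a threshold.
threshold = 0.2
edges = [(i, j) for i, j in combinations(range(3), 2)
         if discrete_mi(data[i], data[j]) > threshold]
```

The nonlinear g0-g1 link survives while the spurious pairs fall below the threshold; the challenges the review lists (noise, feedback loops, incomplete data) are precisely where this naive scheme breaks down.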

  14. 48 CFR 1205.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Methods of disseminating information. 1205.101 Section 1205.101 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION... year on their Web site at: http://osdbuweb.dot.gov/business/procurement/forecast.html. ...

  15. Verbal Information Processing Paradigms: A Review of Theory and Methods.

    ERIC Educational Resources Information Center

    Mitchell, Karen J.

The purpose of this research was to develop a model of verbal information processing for use in subsequent analyses of the construct and predictive validity of the current Department of Defense military selection and classification battery, the Armed Services Vocational Aptitude Battery (ASVAB) 8/9/10. The theory and research methods of selected…

  16. Transfer mutual information: A new method for measuring information transfer to the interactions of time series

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian; Lin, Aijing

    2017-02-01

In this paper, we propose a new method to measure the influence of a third variable on the interactions of two variables. The method, called transfer mutual information (TMI), is defined as the difference between the mutual information and the partial mutual information. It is based on the assumption that if the presence or absence of one variable changes the interactions of another two variables, then the magnitude of this change quantifies the influence of that variable on those two variables. Moreover, a normalized TMI and other derivatives of the TMI are introduced as well. An empirical analysis, including simulations as well as real-world applications, is conducted to examine this measure and to reveal more information among the variables.
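The definition quoted above, TMI as mutual information minus partial (conditional) mutual information, can be sketched for discrete series with plug-in entropy estimates (the binary common-driver example is illustrative, not from the paper):

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Plug-in joint Shannon entropy (nats) of one or more discrete columns."""
    counts = Counter(zip(*cols))
    n = sum(counts.values())
    p = np.array([c / n for c in counts.values()])
    return float(-np.sum(p * np.log(p)))

def mutual_info(x, y):
    return entropy(x) + entropy(y) - entropy(x, y)

def partial_mutual_info(x, y, z):
    """Conditional (partial) mutual information I(X;Y|Z)."""
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)

def transfer_mutual_info(x, y, z):
    """TMI: the part of the X-Y interaction attributable to Z."""
    return mutual_info(x, y) - partial_mutual_info(x, y, z)

rng = np.random.default_rng(2)
z = rng.integers(0, 2, 5000)                       # common driver
flip = lambda v, p: np.where(rng.random(v.size) < p, 1 - v, v)
x, y = flip(z, 0.1), flip(z, 0.1)                  # noisy copies of z
w = rng.integers(0, 2, 5000)                       # unrelated variable

tmi_z = transfer_mutual_info(x, y, z)   # large: z drives the x-y link
tmi_w = transfer_mutual_info(x, y, w)   # near zero: w is irrelevant
```

Conditioning on the true driver `z` removes essentially all of the x-y dependence, so its TMI equals nearly the full mutual information, while conditioning on the irrelevant `w` changes nothing.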

  17. Learning styles of baccalaureate nursing students and attitudes toward theory-based nursing.

    PubMed

    Laschinger, H K; Boss, M K

    1989-01-01

    The purpose of this study was to investigate personal and environmental factors related to undergraduate and post-RN nursing students' attitudes toward theory-based nursing from Kolb's experimental learning theory perspective. The study is part of a larger project designed to test aspects of Kolb's theory in the nursing population. Hypotheses about relationships among learning styles, perception of environmental press, experience in nursing, attitudes toward theory-based nursing, preferred nursing theory, and preferred method of learning theory were proposed for investigation. Seventy-six post-RN and 121 upper-level generic baccalaureate nursing students each completed two measures of personal learning style, a measure of perception of environmental press of nursing learning environments, and a nursing theories questionnaire. Learning style and environmental press perceptions were found to be significantly related to attitudes toward theory-based nursing. Concrete learners and subjects who perceived nursing environments to be predominantly concrete were significantly less positive toward theory-based nursing than abstract learners. Experience in nursing was found to be related to perception of environmental press. Learning style was not found to be significantly related to preferred method of learning nursing theories nor to preferred nursing theory for practice. Implications for nursing education are discussed.

  18. Evaluation methods for retrieving information from interferograms of biomedical objects

    NASA Astrophysics Data System (ADS)

    Podbielska, Halina; Rottenkolber, Matthias

    1996-04-01

Interferograms in the form of fringe patterns can be produced in two-beam interferometers, holographic or speckle interferometers, in setups realizing moiré techniques, or in deflectometers. Optical metrology based on the principle of interference can be applied as a testing tool in biomedical research. By analyzing the fringe pattern images, information about the shape or mechanical behavior of the object under study can be retrieved. Here, some of the techniques for creating fringe pattern images are presented along with methods of analysis. Intensity-based analysis, as well as methods of phase measurement, is mentioned. Applications of interferometric methods, especially in the fields of experimental orthopedics, endoscopy, and ophthalmology, are pointed out.

  19. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

Traditionally, a Management Information System (MIS) has been developed without using formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly respond to changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. The experiment has shown that the development effort is reduced by more than 30%.

  20. Development of a hospital information system using the TAD method.

    PubMed

    Damij, T

    1998-01-01

    To examine the capability of a new object-oriented method called Tabular Application Development (TAD) in developing a hospital information system for a gastroenterology clinic. TAD has five phases. The first phase identifies the problem to be solved. The second phase defines the business processes and activities involved. The third phase develops the object model. The fourth phase designs the application model. The final phase deals with implementation. Eight requirements for the system were identified by hospital management; 17 specific tasks were grouped into three activity categories. The process model, the object model, and the application model of the system are described. The TAD method is capable of developing such an information system without any problem. Furthermore, the method minimizes the time needed to do this in such a way that the process is completely visible to the analyst.

  1. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
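The FOSM step described above, combining input covariance with model sensitivity to obtain output covariance, is plain linear algebra and can be sketched directly (the sensitivity matrix and variances below are hypothetical stand-ins for MODFLOW-2000 output, not values from the paper):

```python
import numpy as np

# Hypothetical sensitivity of two head observations to three conductivity zones.
J = np.array([[0.8, 0.3, 0.1],
              [0.2, 0.5, 0.6]])

# Covariance of the input parameters (e.g., log-conductivity), as would come
# from the multivariate conditional probability / variogram analysis.
cov_in = np.diag([0.50, 0.25, 0.10])

# FOSM: first-order propagation of input covariance through the model.
cov_out = J @ cov_in @ J.T
head_variance = np.diag(cov_out)

# Contribution of each parameter to the total output variance; sampling the
# largest contributor is the QDE criterion the abstract found best.
contrib = (J ** 2 * np.diag(cov_in)).sum(axis=0)
next_sample = int(np.argmax(contrib))
```

With a diagonal input covariance, the per-parameter contributions sum exactly to the total head variance, which is what makes ranking candidate sample locations by contribution well defined.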

  2. Effects of presentation method on the understanding of informed consent.

    PubMed

    Moseley, T H; Wiggins, M N; O'Sullivan, P

    2006-08-01

Knowledge of which presentation methods impart the most information to patients can improve the informed consent discussion. The purpose of this study was to determine if the comprehension and recall of the informed consent discussion varied with presentation method. Randomised, prospective study at the University of Arkansas for Medical Sciences. 90 freshman medical students were randomly assigned to one of three groups and separately went through an informed consent on cataract surgery. Group A heard an informed consent presentation. Group B was shown diagrams while hearing the same presentation. Group C heard the consent and then watched an informational video on cataract surgery. A 10 point multiple choice quiz was administered after the presentation and repeated again 1 week later. Scores from each group were averaged as number correct out of 10 questions. For same day scores, group C scores (7.70 (SD 1.24)) were significantly higher than group A (6.39 (1.63)). One week testing revealed that group C (6.96 (1.62)) recalled more between the two time periods and scored significantly higher than groups A (5.15 (2.11)) and B (5.54 (1.64)). This study found differences in the participants' ability to recall facts based on the manner in which the material was presented. It clearly demonstrated that the use of visual aids improved the ability to remember facts and risks associated with cataract surgery beyond a verbal presentation alone. It also showed a benefit of the repetition of information as provided by audiovisual presentations that can be used in conjunction with the physician-patient discussion.

  3. Information System Selection: Methods for Comparing Service Benefits

    PubMed Central

    Bradley, Evelyn; Campbell, James G.

    1981-01-01

    Automated hospital information systems are purchased both for their potential impact on costs (economic benefits) and for their potential impact on the efficiency and effectiveness of hospital performance (Service Benefits). This paper defines and describes Service Benefits and describes their importance in information system selection. Comparing various systems' Service Benefit contributions implies developing a composite measure of potential Service Benefits; this necessitates expressing Service Benefits in a single unit of measure. This paper concludes with a discussion of alternative methods for translating Service Benefits into a common unit of measure, so they may be summed and compared for each system under consideration.
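
    One common way to express heterogeneous benefits in a single unit of measure is a normalize-weight-sum score, sketched below; the benefit names, weights, and raw ratings are hypothetical, not from the paper.

```python
# Express heterogeneous Service Benefits in one unit of measure by normalizing
# each benefit to 0-1 across candidate systems, weighting, and summing.
def composite_score(raw_scores, weights):
    totals = {}
    for system, scores in raw_scores.items():
        total = 0.0
        for benefit, weight in weights.items():
            vals = [raw_scores[s][benefit] for s in raw_scores]
            lo, hi = min(vals), max(vals)
            norm = 0.0 if hi == lo else (scores[benefit] - lo) / (hi - lo)
            total += weight * norm
        totals[system] = total
    return totals

systems = {
    "System 1": {"reporting speed": 7, "error reduction": 5, "staff time saved": 9},
    "System 2": {"reporting speed": 9, "error reduction": 8, "staff time saved": 4},
}
weights = {"reporting speed": 0.5, "error reduction": 0.3, "staff time saved": 0.2}
scores = composite_score(systems, weights)   # one comparable number per system
```

    The weights encode the hospital's priorities; changing them changes which system's composite score wins, which is exactly the judgment the paper argues must be made explicit.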

  4. Enhancing subsurface information from the fusion of multiple geophysical methods

    NASA Astrophysics Data System (ADS)

    Jafargandomi, A.; Binley, A.

    2011-12-01

    Characterization of hydrologic systems is a key element in understanding and predicting their behaviour. Geophysical methods, especially electrical methods (e.g., electrical resistivity tomography (ERT), induced polarization (IP) and electromagnetic (EM)), are becoming popular for this purpose due to their non-invasive nature, high sensitivity to hydrological parameters and the speed of measurements. However, interrogation of each geophysical method provides only limited information about some of the subsurface parameters. Therefore, in order to achieve a comprehensive picture from the hydrologic system, fusion of multiple geophysical data sets can be beneficial. Although a number of fusion approaches have been proposed in the literature, an aspect that has been generally overlooked is the assessment of information content from each measurement approach. Such an assessment provides useful insight for the design of future surveys. We develop a fusion strategy based on the capability of multiple geophysical methods to provide enough resolution to identify subsurface material parameters and structure. We apply a Bayesian framework to analyse the information in multiple geophysical data sets. In this approach multiple geophysical data sets are fed into a Markov chain Monte Carlo (McMC) inversion algorithm and the information content of the post-inversion result (posterior probability distribution) is quantified. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical data sets. In this strategy, information from multiple methods is brought together via introducing a joint likelihood function and/or constraining the prior information. We apply the fusion tool to one of the target sites of the EU FP7 project ModelProbe which aims to develop technologies and tools for soil contamination assessment and site characterization. The target site is located close to Trecate (Novara - NW Italy). At this
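
    The Shannon measure mentioned above can be sketched as a histogram entropy of posterior samples: a narrower posterior has lower entropy, so the information gained by an inversion is the prior entropy minus the posterior entropy. The samples below are synthetic stand-ins for MCMC output, not data from the Trecate site.

```python
import numpy as np

# Information gained by an inversion = entropy drop from prior to posterior.
def histogram_entropy(samples, bins=30, rng=(0.0, 1.0)):
    counts, _ = np.histogram(samples, bins=bins, range=rng)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))     # Shannon entropy in bits

gen = np.random.default_rng(0)
prior = gen.uniform(0.0, 1.0, 20000)          # broad prior on a parameter
posterior = gen.normal(0.4, 0.05, 20000)      # narrower post-inversion estimate
posterior = posterior[(posterior >= 0.0) & (posterior <= 1.0)]

information_gain = histogram_entropy(prior) - histogram_entropy(posterior)
```

    Comparing this gain across inversions of different data-set combinations is what ranks the fusion candidates.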

  5. Mixed-methods exploration of parents' health information understanding.

    PubMed

    Lehna, Carlee; McNeil, Jack

    2008-05-01

    Health literacy--the ability to read, understand, and use health information to make health care decisions--affects health care outcomes, hospitalization costs, and readmission. The purpose of this exploratory mixed-methods study is to determine how two different parent groups (English speaking and Spanish speaking) understand medical care for their children and the procedural and research consent forms required by that care. Quantitative and qualitative data are gathered and compared concurrently. Differences between groups are found in age, grade completed, Short Test of Functional Health Literacy in Adults scores, and ways of understanding health information. Identifying how parents understand health information is the first step in providing effective family-centered health care education.

  6. Responses of older adults to theory-based nutrition newsletters.

    PubMed

    Taylor-Davis, S; Smiciklas-Wright, H; Warland, R; Achterberg, C; Jensen, G L; Sayer, A; Shannon, B

    2000-06-01

    To evaluate the effect of a theory-based newsletter on knowledge, attitude, and behavior change in older adults. Pretest-posttest, random assignment, and treatment-control design with 2 treatment groups: 1 that received newsletters only and 1 that received newsletters with follow-up telephone interviews. Control group completed pretest-posttest surveys only. Four hundred eighty men and women, aged 60 to 74 years, were recruited to participate in a home-based educational intervention using a patient list generated from a rural tertiary care hospital database, Geisinger Medical Center in Danville, Pa. Five nutrition newsletters designed using the nutrition communication model and adult learning theory principles were mailed biweekly. Telephone interviews followed each of the 5 newsletters 10 to 14 days after distribution. Nutrition knowledge and interest, food behavior related to dietary fat, and stages of change for dietary fat and fiber. Analysis of covariance was used to determine group differences in posttest outcome measures using pretest as covariate. In addition to achieving higher scores than the control group, the treatment groups were significantly different from each other in correct and perceived nutrition knowledge at posttest. Those in the treatment group receiving telephone calls scored higher (mean change = 19.0% for correct and 20.3% for perceived) than those who received the newsletters only (mean change = 12.5% for correct and 14.3% for perceived; P < .05). Treatment groups also rated their interest in nutrition higher than the control group did; there was no between-treatment difference. Treatment groups performed significantly better than the control group for dietary fiber stage of change (P < .05). Those receiving only newsletters scored significantly better than the control for the "avoid fat" food behavior (P < .05). This study provides an example of the incorporation of a theoretical model in development and evaluation of newsletters. 

  7. Methods of obtaining meaningful information from disperse media holograms

    NASA Astrophysics Data System (ADS)

    Dyomin, Victor V.

    1997-05-01

    The problem of nondestructive testing of microstructure parameters, of both aerosols and water suspensions, is relevant to biology, medicine, and environmental control. Among the methods of optical investigation and diagnostics of light-scattering media, the holographic method plays a special role. A hologram of a scattering volume allows us to reproduce the optical wave field and obtain information on the parameters of microparticles: size, shape, and spatial position. Usually this is done by analysis of the particle images reconstructed from the hologram. On the basis of calculated and experimental results, characteristics of holographic methods are analyzed in this paper. These estimates demonstrate the possibility of using the above methods for investigation of media in biomedical science and clinical practice. Many micro-organisms and other living particles are transparent or semitransparent. In this case the reconstructed image of the particle will show, in addition to its cross section, a spot formed by light focusing within the particle. This circumstance allowed us to propose a method for determining the refractive index of transparent and semitransparent microparticles, which, in turn, can provide identification of the particle type. The development of this method is presented. Measurement of the particle size distribution can be performed simultaneously with the reconstruction of the scattering optical field from the hologram. In this case a small-angle optical meter (for example, a focusing lens) can be placed just behind the illuminated hologram. The reconstructed field is composed of the initial field and its conjugate. Each of these components, as well as the interference between them, can bear additional information on the medium. The possibility of extracting this information is also discussed.

  8. Application of information theory methods to food web reconstruction

    USGS Publications Warehouse

    Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.

    2007-01-01

    In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, and if this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other, and the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be of use beyond model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points. This approaches the lengths of ecological time series from real systems.
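
    The mutual-information calculation underlying this kind of analysis can be sketched with a joint-histogram estimator; the coupled consumer/resource series below are synthetic, with 400 points as in the shortest series the study found usable.

```python
import numpy as np

# Histogram estimate of mutual information I(X;Y) in bits between two
# abundance time series; high MI suggests a trophic (or shared-forcing) link.
def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

gen = np.random.default_rng(1)
resource = gen.normal(size=400)                          # 400-point series
consumer = 0.9 * resource + 0.3 * gen.normal(size=400)   # preferred-resource link
unrelated = gen.normal(size=400)                         # no interaction

mi_linked = mutual_information(resource, consumer)
mi_null = mutual_information(resource, unrelated)        # bias-level baseline
```

    In practice the null level is set by surrogate-data resampling rather than a single unrelated series, since the histogram estimator is biased upward for finite data.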

  9. Using key informant methods in organizational survey research: assessing for informant bias.

    PubMed

    Hughes, L C; Preski, S

    1997-02-01

    Specification of variables that reflect organizational processes can add an important dimension to the investigation of outcomes. However, many contextual variables are conceptualized at a macro unit of analysis and may not be amenable to direct measurement. In these situations, proxy measurement is obtained by treating organizational members as key informants who report about properties of the work group or organization. Potential sources of bias when using key informant methods in organizational survey research are discussed. Statistical procedures for assessment of rater-trait interaction as a type of informant bias are illustrated using data from a study in which multiple key informants were sampled to obtain proxy measurement of the organizational climate for caring among baccalaureate schools of nursing.

  10. Theory-informed design of values clarification methods: a cognitive psychological perspective on patient health-related decision making.

    PubMed

    Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb

    2013-01-01

    Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids and values clarification methods (VCM) in particular have been developed. VCM intend to help patients to determine the aspects of the choices that are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCM. Process theories of decision making specify components of decision processes and thus identify particular processes that VCM could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) provided a coherent description of the whole process of decision making; and (c) empirical evidence supports at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCM should: help optimize mental representations; encourage considering all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner.

  11. Smokefree streets: a pilot study of methods to inform policy.

    PubMed

    Parry, Rhys; Prior, Bridget; Sykes, Adrian J; Tay, Jo-Lyn; Walsh, Beth; Wright, Nicholas; Pearce, Karina; Richmond, Georgia; Robertson, Andrew; Roselan, Jalilah; Shum, Puai Yee; Taylor, Greg; Thachanamurthy, Praveene; Zheng, Tony Tianwei; Wilson, Nick; Thomson, George

    2011-05-01

    Smokefree street policies are relatively rare, and little has been published on the methods for establishing an evidence base to inform such policy making. We aimed to (a) pilot methods for such data collection in New Zealand, a country where local governments are actively pursuing outdoor smokefree policies, and (b) provide data on smoking behavior, attitudes toward smokefree policies, and levels of smoke exposure on streets in Wellington. Three methods were piloted: (a) systematic observation of smoking behavior by observers walking a standard route of major streets, the "Golden Mile" (GM) in Wellington (n = 42 observation runs); (b) measurement of fine particulate levels (PM2.5) along this route and with purposeful sampling in selected settings; and (c) an attitudinal survey of pedestrians along sections of this route. Each of the 3 methods proved to be feasible in this urban setting. A total of 932 smokers were observed during 21 hr of observation, an average of 7 smokers every 10 min of walking. Air monitoring indicated fine particulate exposure. Mean PM2.5 levels were 1.5 times higher during periods when smoking was observed than when it was not (9.3 vs. 6.3 μg/m³, p = .002). Dose-response patterns were observed for smoking proximity and for smoker numbers. Surveying pedestrians (n = 220) with a brief questionnaire achieved an 81% response rate and was able to identify variation in support for a smokefree GM by different groups (overall support was 55.9%, 95% CI = 49.3%-62.4%). Reasons for support were also identified; perceived health hazards, at 34.1%, was the main reason. These methods can provide information that may contribute to the smokefree streets policymaking process and may also be relevant to informing other smokefree outdoor policies.

  12. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.
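
    The queuing quantities the model trades off (arrival rates, waiting costs) can be illustrated with the textbook M/M/1 relations; the arrival rate, service rate, and unit waiting cost below are hypothetical, not values from the MSW case.

```python
# Steady-state M/M/1 queue metrics (Poisson arrivals, exponential service).
def mm1_metrics(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: utilization >= 1")
    rho = arrival_rate / service_rate                 # utilization
    return {
        "utilization": rho,
        "L": rho / (1.0 - rho),                       # mean number in system
        "W": 1.0 / (service_rate - arrival_rate),     # mean time in system
        "Wq": rho / (service_rate - arrival_rate),    # mean wait in queue
    }

# Hypothetical: waste trucks arriving 4/hour at a facility serving 5/hour.
m = mm1_metrics(4.0, 5.0)
waiting_cost_per_truck_hour = 30.0
# Cost rate = (unit waiting cost) x (mean wait) x (arrival rate), by Little's law.
expected_waiting_cost_rate = waiting_cost_per_truck_hour * m["Wq"] * 4.0
```

    Interval and fuzzy versions of these parameters are what the QB-IRTP model carries through the optimization instead of the point values used here.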

  13. Informed consent in colonoscopy: A comparative analysis of 2 methods.

    PubMed

    Sanguinetti, J M; Lotero Polesel, J C; Iriarte, S M; Ledesma, C; Canseco Fuentes, S E; Caro, L E

    2015-01-01

    The manner in which informed consent is obtained varies. The aim of this study is to evaluate the level of knowledge about colonoscopy and compare 2 methods of obtaining informed consent. A comparative, cross-sectional, observational study was conducted on patients that underwent colonoscopy in a public hospital (Group A) and in a private hospital (Group B). Group A received information verbally from a physician, as well as in the form of printed material, and Group B only received printed material. A telephone survey was carried out 1 or 2 weeks later. The study included a total of 176 subjects (group A [n=55] and group B [n=121]). As regards education level, 69.88% (n=123) of the patients had completed university education, 23.29% (n=41) secondary level, 5.68% (n=10) primary level, and the remaining subjects (n=2) had not completed any level of education. All (100%) of the subjects knew the characteristics of the procedure, and 99.43% were aware of its benefits. A total of 97.7% received information about complications, 93.7% named some of them, and 25% (n=44) remembered major complications. All the subjects received, read, and signed the informed consent statement before the study. There were no differences between the groups with respect to knowledge of the characteristics and benefits of the procedure, or the receipt and reading of the consent form. Group B responded better in relation to complications (P=.0027) and group A had a better recollection of the major complications (P<.0001). Group A had a higher number of affirmative answers (P<.0001). The combination of verbal and written information provides the patient with a more comprehensive level of knowledge about the procedure. Copyright © 2014 Asociación Mexicana de Gastroenterología. Published by Masson Doyma México S.A. All rights reserved.

  14. A mixed model reduction method for preserving selected physical information

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Zheng, Gangtie

    2017-03-01

    A new model reduction method in the frequency domain is presented. By mixedly using the model reduction techniques from both the time domain and the frequency domain, the dynamic model is condensed to selected physical coordinates, and the contribution of slave degrees of freedom is taken as a modification to the model in the form of effective modal mass of virtually constrained modes. The reduced model can preserve the physical information related to the selected physical coordinates such as physical parameters and physical space positions of corresponding structure components. For the cases of non-classical damping, the method is extended to the model reduction in the state space but still only contains the selected physical coordinates. Numerical results are presented to validate the method and show the effectiveness of the model reduction.
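
    For comparison, classical Guyan (static) condensation, a simpler relative of the mixed reduction described above, also condenses a model onto selected physical coordinates; the spring-chain stiffness matrix and master/slave split below are hypothetical.

```python
import numpy as np

# Guyan static condensation onto master DOFs: K_red = K_mm - K_ms K_ss^-1 K_sm.
# The reduced model keeps the selected physical coordinates and folds the
# slave DOFs' static contribution into them.
def guyan_reduce(K, masters):
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    Kmm = K[np.ix_(masters, masters)]
    Kms = K[np.ix_(masters, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    Kss = K[np.ix_(slaves, slaves)]
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# Three-DOF spring chain (arbitrary units); keep the two end coordinates.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
K_red = guyan_reduce(K, masters=[0, 2])
```

    Guyan condensation is exact for static response but ignores slave-DOF inertia; representing that dynamic contribution (e.g., as effective modal mass of constrained modes) is what the mixed method above adds.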

  15. A Rapid Usability Evaluation (RUE) Method for Health Information Technology.

    PubMed

    Russ, Alissa L; Baker, Darrell A; Fahner, W Jeffrey; Milligan, Bryce S; Cox, Leeann; Hagg, Heather K; Saleem, Jason J

    2010-11-13

    Usability testing can help generate design ideas to enhance the quality and safety of health information technology. Despite these potential benefits, few healthcare organizations conduct systematic usability testing prior to software implementation. We used a Rapid Usability Evaluation (RUE) method to apply usability testing to software development at a major VA Medical Center. We describe the development of the RUE method, provide two examples of how it was successfully applied, and discuss key insights gained from this work. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE methods are relatively simple, do not require advanced training or usability software, and should be easy to adopt. Other healthcare organizations may be able to implement RUE to improve software effectiveness, efficiency, and safety.

  16. Emotion identification method using RGB information of human face

    NASA Astrophysics Data System (ADS)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single households is drastically increased due to the growth of the aging society and the diversity of lifestyle. Therefore, the evolution of building spaces is demanded. Biofied Building we propose can help to avoid this situation. It helps interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior. One of the important information is thermal comfort. We assume we can estimate it from human face. There are many researchs about face color analysis, but a few of them are conducted in real situations. In other words, the existing methods were not used with disturbance such as room lumps. In this study, Kinect was used with face-tracking. Room lumps and task lumps were used to verify that our method could be applicable to real situation. In this research, two rooms at 22 and 28 degrees C were prepared. We showed that the transition of thermal comfort by changing temperature can be observed from human face. Thus, distinction between the data of 22 and 28 degrees C condition from face color was proved to be possible.

  17. A method to stabilize linear systems using eigenvalue gradient information

    NASA Technical Reports Server (NTRS)

    Wieseman, C. D.

    1985-01-01

    Formal optimization methods and eigenvalue gradient information are used to develop a stabilizing control law for a closed loop linear system that is initially unstable. The method was originally formulated by using direct, constrained optimization methods with the constraints being the real parts of the eigenvalues. However, because of problems in trying to achieve stabilizing control laws, the problem was reformulated to be solved differently. The method described uses the Davidon-Fletcher-Powell minimization technique to solve an indirect, constrained minimization problem in which the performance index is the Kreisselmeier-Steinhauser function of the real parts of all the eigenvalues. The method is applied successfully to solve two different problems: the determination of a fourth-order control law that stabilizes a single-input single-output active flutter suppression system and the determination of a second-order control law for a multi-input multi-output lateral-directional flight control system. Various sets of design variables and initial starting points were chosen to show the robustness of the method.
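
    The Kreisselmeier-Steinhauser (KS) aggregation of eigenvalue real parts can be sketched as follows; the second-order plant and the grid search standing in for Davidon-Fletcher-Powell minimization are illustrative assumptions, not the paper's flutter or flight-control models.

```python
import numpy as np

# KS function: a smooth upper-bound surrogate for max(Re(eigenvalues)), so
# driving it below zero stabilizes the closed-loop system.
def ks_eigenvalue_realparts(A, rho=50.0):
    g = np.real(np.linalg.eigvals(A))
    gmax = g.max()
    return gmax + np.log(np.sum(np.exp(rho * (g - gmax)))) / rho

# Unstable open-loop plant with scalar output feedback u = -k*y.
A = np.array([[0.0, 1.0],
              [2.0, -1.0]])     # open-loop eigenvalues 1 and -2: unstable
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Minimize KS over the feedback gain (grid search here for simplicity).
gains = np.linspace(0.0, 20.0, 401)
ks_vals = [ks_eigenvalue_realparts(A - k * (B @ C)) for k in gains]
best_k = gains[int(np.argmin(ks_vals))]
closed_loop_stable = ks_eigenvalue_realparts(A - best_k * (B @ C)) < 0.0
```

    Because KS is a single differentiable scalar, eigenvalue gradients can drive a quasi-Newton method such as Davidon-Fletcher-Powell instead of handling each eigenvalue as a separate constraint.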

  18. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
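
    A minimal sketch of the architecture this record describes: reasoning modules registered for different ontology classification types, each handed only the abstractions of the semantic graph whose individuals match its type. All class and type names below are hypothetical.

```python
# Dispatch abstractions of a semantic graph to per-classification-type
# reasoning modules, as in the system described above.
class ReasoningSystem:
    def __init__(self):
        self._modules = {}          # classification type -> handler

    def register(self, classification_type, handler):
        self._modules[classification_type] = handler

    def process(self, semantic_graph):
        """Route each abstraction to the module for its individual's type."""
        results = []
        for abstraction in semantic_graph:
            ctype = abstraction["individual"]["type"]
            if ctype in self._modules:
                results.append(self._modules[ctype](abstraction))
        return results

graph = [
    {"individual": {"type": "Person", "name": "alice"}},
    {"individual": {"type": "Event", "name": "login"}},
]

system = ReasoningSystem()
system.register("Person", lambda a: ("person-module", a["individual"]["name"]))
system.register("Event", lambda a: ("event-module", a["individual"]["name"]))
outcomes = system.process(graph)
```

    The two registered handlers stand in for the first and second reasoning modules of the claim, each bound to a different classification type of the ontology.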

  19. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  20. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  1. Information bias in health research: definition, pitfalls, and adjustment methods

    PubMed Central

    Althubaiti, Alaa

    2016-01-01

    As with other fields, medical sciences are subject to different sources of bias. While understanding sources of bias is a key element for drawing valid conclusions, bias in health research continues to be a very sensitive issue that can affect the focus and outcome of investigations. Information bias, otherwise known as misclassification, is one of the most common sources of bias that affects the validity of health research. It originates from the approach that is utilized to obtain or confirm study measurements. This paper seeks to raise awareness of information bias in observational and experimental research study designs as well as to enrich discussions concerning bias problems. Specifying the types of bias can be essential to limit its effects, and the use of adjustment methods might serve to improve clinical evaluation and health care practice. PMID:27217764

  2. Information bias in health research: definition, pitfalls, and adjustment methods.

    PubMed

    Althubaiti, Alaa

    2016-01-01

    As with other fields, medical sciences are subject to different sources of bias. While understanding sources of bias is a key element for drawing valid conclusions, bias in health research continues to be a very sensitive issue that can affect the focus and outcome of investigations. Information bias, otherwise known as misclassification, is one of the most common sources of bias that affects the validity of health research. It originates from the approach that is utilized to obtain or confirm study measurements. This paper seeks to raise awareness of information bias in observational and experimental research study designs as well as to enrich discussions concerning bias problems. Specifying the types of bias can be essential to limit its effects, and the use of adjustment methods might serve to improve clinical evaluation and health care practice.

  3. A Task-Oriented Disaster Information Correlation Method

    NASA Astrophysics Data System (ADS)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, the efficiency of current data management and service systems has become increasingly difficult due to the task variety and heterogeneous data. For emergency task-oriented applications, the data searches primarily rely on human experience based on simple metadata indices, the high time consumption and low accuracy of which cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service with the objectives of 1) putting forward disaster task ontology and data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.
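
    The task-to-dataset correlation step can be sketched with a simple keyword-overlap (Jaccard) score between a task ontology entry and dataset metadata; the terms and datasets below are hypothetical, not the paper's ontologies.

```python
# Rank candidate data sets by their semantic overlap with an emergency task.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

flood_relief_task = {"flood", "water-level", "remote-sensing", "population"}

datasets = {
    "satellite imagery": {"remote-sensing", "flood", "optical"},
    "census tables":     {"population", "administrative"},
    "seismic catalogue": {"earthquake", "magnitude"},
}

ranking = sorted(datasets,
                 key=lambda d: jaccard(flood_relief_task, datasets[d]),
                 reverse=True)
```

    A real implementation would score along ontology relations rather than flat keyword sets, but the output is the same: an ordered list of task-relevant data sets for retrieval and dissemination.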

  4. Determination of nuclear level densities from experimental information

    SciTech Connect

    Cole, B.J.; Davidson, N.J.; Miller, H.G.

    1994-10-01

    A novel information theory-based method for determining the density of states from prior information is presented. The energy dependence of the density of states is determined from the observed number of states per energy interval, and model calculations suggest that the method is sufficiently reliable to calculate the thermal properties of nuclei over a reasonable temperature range.

  5. How Can Theory-Based Evaluation Make Greater Headway?

    ERIC Educational Resources Information Center

    Weiss, Carol H.

    1997-01-01

    Explores the problems of theory-based evaluation, describes the nature of potential benefits, and suggests that the benefits are significant enough to warrant continued effort to overcome the obstacles and advance its use. Many of the problems are related to inadequate theories about pathways to desired program outcomes. (SLD)

  6. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  7. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  8. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  9. Theory-Based Diagnosis and Remediation of Writing Disabilities.

    ERIC Educational Resources Information Center

    Berninger, Virginia W.; And Others

    1991-01-01

    Briefly reviews recent trends in research on writing; introduces theory-based model being developed for differential diagnosis of writing disabilities at neuropsychological, linguistic, and cognitive levels; presents cases and patterns in cases that illustrate differential diagnosis of writing disabilities at linguistic level; and suggests…

  10. Index Theory-Based Algorithm for the Gradiometer Inverse Problem

    DTIC Science & Technology

    2015-03-28

    …based gravity gradiometer inverse problem algorithm. This algorithm relates changes in the index value computed on a closed curve containing a line…account for the bounds. Key Words: Gravity Gradiometer, Inverse Problem, Index Theory. Mathematics Subject Classification: 31A99.

  11. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  12. Continuing Bonds in Bereavement: An Attachment Theory Based Perspective

    ERIC Educational Resources Information Center

    Field, Nigel P.; Gao, Beryl; Paderna, Lisa

    2005-01-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual…

  13. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  14. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2011-12-01

    On the basis of the User Datagram Protocol (UDP), this paper makes some improvements and designs a welding robot network communication protocol (WRNCP), working at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, a broadcast push-type (Broadcast Push Model, BPM) transmission method is designed to improve the efficiency and stability of video transmission, and a network information transmission system is designed for real-time network control of the welding robot.
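
    The record does not give WRNCP's wire format, so the sketch below invents a minimal UDP video-frame header (sequence number, timestamp, payload length) of the kind such a protocol would need; every field choice here is hypothetical, not the paper's specification.

```python
import struct

# Hypothetical WRNCP-style frame header for UDP broadcast of video
# chunks: sequence number (u32), timestamp in microseconds (u64),
# payload length (u16), in network byte order.
HEADER = struct.Struct("!IQH")

def pack_frame(seq: int, timestamp_us: int, payload: bytes) -> bytes:
    return HEADER.pack(seq, timestamp_us, len(payload)) + payload

def unpack_frame(datagram: bytes):
    seq, ts, length = HEADER.unpack_from(datagram)
    return seq, ts, datagram[HEADER.size: HEADER.size + length]

frame = pack_frame(7, 1_700_000_000_000_000, b"video-chunk")
seq, ts, payload = unpack_frame(frame)
```

    A sequence number lets the receiver detect loss and reordering, which matters for a push-style broadcast where no retransmission is requested.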

  15. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2012-01-01

    On the basis of the User Datagram Protocol (UDP), this paper makes some improvements and designs a welding robot network communication protocol (WRNCP), working at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, a broadcast push-type (Broadcast Push Model, BPM) transmission method is designed to improve the efficiency and stability of video transmission, and a network information transmission system is designed for real-time network control of the welding robot.

  16. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  17. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  18. Linear reduction method for predictive and informative tag SNP selection.

    PubMed

    He, Jingwu; Westbrooks, Kelly; Zelikovsky, Alexander

    2005-01-01

    Constructing a complete human haplotype map is helpful when associating complex diseases with their related SNPs. Unfortunately, the number of SNPs is very large and it is costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that must be sequenced to a small number of informative representatives called tag SNPs. In this paper, we propose a new linear algebra-based method for selecting and using tag SNPs. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs predicted from selected linearly independent tag SNPs. Our experiments show that for sufficiently long haplotypes, knowing only 0.4% of all SNPs, the proposed linear reduction method predicts an unknown haplotype with an error rate below 2%, based on 10% of the population.
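
    A minimal sketch of the linear-reduction idea as we read it (toy data and real-valued arithmetic, not the paper's exact formulation): tag SNPs are a set of linearly independent columns of the haplotype matrix, and every remaining SNP column is reconstructed as a linear combination of the tags.

```python
import numpy as np

# Haplotypes are rows, SNPs are columns.
def select_tag_snps(H):
    """Greedily return indices of linearly independent columns of H."""
    tags = []
    for j in range(H.shape[1]):
        if np.linalg.matrix_rank(H[:, tags + [j]]) > len(tags):
            tags.append(j)
    return tags

def predict_snps(H_train, tags, tag_values):
    """Least-squares reconstruction of all SNP columns from tag values."""
    T = H_train[:, tags]
    W, *_ = np.linalg.lstsq(T, H_train, rcond=None)
    return tag_values @ W

# toy haplotype matrix: column 2 is the sum of columns 0 and 1,
# so columns 0 and 1 suffice as tags
H = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 2],
              [0, 0, 0]], dtype=float)
tags = select_tag_snps(H)
full = predict_snps(H, tags, H[:, tags])
```

    On this toy matrix the dependent column is recovered exactly; on real data the paper's reported error rate reflects how well such linear dependencies generalize from the training population.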

  19. A diffusive information preservation method for small Knudsen number flows

    SciTech Connect

    Fei, Fei; Fan, Jing

    2013-06-15

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve this problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3 to 10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they permit a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  20. A diffusive information preservation method for small Knudsen number flows

    NASA Astrophysics Data System (ADS)

    Fei, Fei; Fan, Jing

    2013-06-01

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve this problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3 to 10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they permit a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.
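
    The noise advantage of information preservation can be seen in a purely schematic toy: each simulated particle carries both a noisy thermal velocity (what a DSMC-style sampler would average) and a smooth IP macroscopic velocity; sampling the IP quantity recovers a low-speed flow that the molecular average buries in noise. All numbers and names below are illustrative only, not the D-IP scheme itself.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10_000
u_flow = 0.01                              # low-speed flow velocity (m/s)
thermal = rng.normal(0.0, 300.0, n)        # thermal velocity spread (m/s)
molecular = u_flow + thermal               # what DSMC-style sampling sees
ip = np.full(n, u_flow)                    # IP quantity carried by particles

dsmc_estimate = molecular.mean()           # noisy: std of the mean is ~3 m/s
ip_estimate = ip.mean()                    # recovers u_flow essentially exactly
```

    With a 300 m/s thermal spread and 10^4 samples, the molecular average has a statistical error hundreds of times larger than the signal, which is exactly the regime where IP-style averaging pays off.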

  1. Liposome/water lipophilicity: methods, information content, and pharmaceutical applications.

    PubMed

    van Balen, Georgette Plemper; Martinet, Catherine a Marca; Caron, Giulia; Bouchard, Géraldine; Reist, Marianne; Carrupt, Pierre-Alain; Fruttero, Roberta; Gasco, Alberto; Testa, Bernard

    2004-05-01

    This review discusses liposome/water lipophilicity in terms of the structure of liposomes, experimental methods, and information content. In a first part, the structural properties of the hydrophobic core and polar surface of liposomes are examined in the light of potential interactions with solute molecules. Particular emphasis is placed on the physicochemical properties of polar headgroups of lipids in liposomes. A second part is dedicated to three useful methods to study liposome/water partitioning, namely potentiometry, equilibrium dialysis, and (1)H-NMR relaxation rates. In each case, the principle and limitations of the method are discussed. The next part presents the structural information encoded in liposome/water lipophilicity, in other words the solutes' structural and physicochemical properties that determine their behavior and hence their partitioning in such systems. This presentation is based on a comparison between isotropic (i.e., solvent/water) and anisotropic (e.g., liposome/water) systems. An important factor to be considered is whether the anisotropic lipid phase is ionized or not. Three examples taken from the authors' laboratories are discussed to illustrate the factors or combinations thereof that govern liposome/water lipophilicity, namely (a) hydrophobic interactions alone, (b) hydrophobic and polar interactions, and (c) conformational effects plus hydrophobic and ionic interactions. The next part presents two studies taken from the field of QSAR to exemplify the use of liposome/water lipophilicity in structure-disposition and structure-activity relationships. In the conclusion, we summarize the interests and limitations of this technology and point to promising developments.

  2. GA and Lyapunov theory-based hybrid adaptive fuzzy controller for non-linear systems

    NASA Astrophysics Data System (ADS)

    Roy, Ananya; Das Sharma, Kaushik

    2015-02-01

    In this article, a new hybrid methodology for designing stable adaptive fuzzy logic controllers (AFLCs) for a class of non-linear systems is proposed. The proposed design strategy exploits the features of a genetic algorithm (GA)-based stochastic evolutionary global search technique and a Lyapunov theory-based local adaptation scheme. The objective is to develop a methodology for designing AFLCs with optimised free parameters and guaranteed closed-loop stability; simultaneously, the proposed method introduces automation into the design process. The stand-alone Lyapunov theory-based design, the GA-based design and the proposed hybrid GA-Lyapunov design methodologies are implemented for two benchmark non-linear plants in simulation case studies with different reference signals, and in one experimental case study. The results demonstrate that the hybrid design methodology outperforms the other control strategies on the whole.
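
    The hybrid idea, global GA search combined with a local refinement step standing in for the Lyapunov-based adaptation, can be caricatured on a toy first-order plant. The plant, cost function and all parameters below are our own illustration, not the authors' AFLC design.

```python
import numpy as np

rng = np.random.default_rng(0)

def tracking_cost(k):
    """Integrated squared tracking error for gain k on x' = -x + k*(r - x), r = 1."""
    x, cost, dt = 0.0, 0.0, 0.01
    for _ in range(500):
        e = 1.0 - x
        x += dt * (-x + k * e)
        cost += dt * e * e
    return cost

def hybrid_ga(pop_size=20, gens=30, step=0.05):
    pop = rng.uniform(0.0, 10.0, pop_size)
    for _ in range(gens):
        # local refinement: one finite-difference gradient step per candidate
        # (a stand-in for the Lyapunov-based local adaptation)
        grad = np.array([(tracking_cost(k + 1e-4) - tracking_cost(k)) / 1e-4
                         for k in pop])
        pop = np.clip(pop - step * grad, 0.0, 10.0)
        # GA part: select the fitter half, mutate to produce children
        fit = np.array([tracking_cost(k) for k in pop])
        parents = pop[np.argsort(fit)[: pop_size // 2]]
        children = parents + rng.normal(0.0, 0.2, parents.size)
        pop = np.clip(np.concatenate([parents, children]), 0.0, 10.0)
    return min(pop, key=tracking_cost)

best = hybrid_ga()
```

    The division of labour mirrors the paper's: the GA explores the gain space globally while the local step polishes each candidate, so the population converges faster than either mechanism alone.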

  3. A Specification Method for Interactive Medical Information Systems

    PubMed Central

    Wasserman, Anthony I.; Stinson, Susan K.

    1980-01-01

    This paper presents the User Software Engineering (USE) approach for developing specifications for an interactive information system (IIS) and shows how the method is applied to the specification of a Perinatal Data Registry system. Two linked views of the system are developed: a user view suitable for computer-naive users, and a design/verification view suitable for computer-knowledgeable users. The user view is intended to facilitate user participation in the analysis task and in the definition of the user/system dialogue. The verification view is intended to facilitate design and testing of the resulting system. The two views share notations for data base definition and for specification of the user/system dialogue; however, the user view may utilize narrative text for describing the operations, while the design/verification view relies on a more formal specification method. The specification method encourages effective communication between users and developers and permits refinement of the specification in order to ensure that the resulting specification is as complete, consistent, and accurate as possible before proceeding with design and implementation.

  4. Theory-based research of high fidelity simulation use in nursing education: a review of the literature.

    PubMed

    Rourke, Liam; Schmidt, Megan; Garga, Neera

    2010-01-01

    In this article, we explore the extent to which theory-based research is informing our understanding of high-fidelity simulation use in nursing education. We reviewed the primary literature archived in the Cumulative Index of Nursing and Applied Health Literature (CINAHL) and Proquest Dissertation and Theses for empirical reports using the key terms high-fidelity simulation and nursing from the years 1989 to 2009. Of the articles that matched our inclusion criteria: 45% made no use of theory; 45% made minimal use; and 10% made adequate use. We argue that theory-based research could bring coherence and external validity to this domain.

  5. Methods of Information Geometry to model complex shapes

    NASA Astrophysics Data System (ADS)

    De Sanctis, A.; Gattone, S. A.

    2016-09-01

    In this paper, a new statistical method to model patterns emerging in complex systems is proposed. A framework for shape analysis of 2-dimensional landmark data is introduced, in which each landmark is represented by a bivariate Gaussian distribution. From Information Geometry we know that the Fisher-Rao metric endows the statistical manifold of parameters of a family of probability distributions with a Riemannian metric. This approach thus allows one to reconstruct the intermediate steps in the evolution between observed shapes by computing the geodesic, with respect to the Fisher-Rao metric, between the corresponding distributions. Furthermore, the geodesic path can be used for shape prediction. As an application, we study the evolution of the rat skull shape. A future application in Ophthalmology is introduced.
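
    For univariate Gaussians (the paper works with bivariate ones) the Fisher-Rao geodesic distance has a closed form, because the Gaussian family under this metric is, up to a factor of sqrt(2), the hyperbolic half-plane. The sketch below is our illustration of that standard fact, not the paper's code.

```python
import math

def fisher_rao_distance(mu1, s1, mu2, s2):
    """Closed-form Fisher-Rao distance between N(mu1, s1^2) and N(mu2, s2^2).

    The Fisher metric ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2 maps to the
    Poincare half-plane via (mu / sqrt(2), sigma), giving this formula.
    """
    num = (mu1 - mu2) ** 2 + 2.0 * (s1 - s2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (4.0 * s1 * s2))

d_same = fisher_rao_distance(0.0, 1.0, 0.0, 1.0)   # identical distributions
d_shift = fisher_rao_distance(0.0, 1.0, 1.0, 1.0)  # mean shifted by 1
```

    Points along the geodesic between two such distributions are themselves Gaussians, which is what lets the authors read off plausible intermediate shapes between observed ones.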

  6. System and Method for RFID-Enabled Information Collection

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Kennedy, Timothy F. (Inventor); Ngo, Phong H. (Inventor); Byerly, Diane (Inventor)

    2016-01-01

    Methods, apparatuses and systems for radio frequency identification (RFID)-enabled information collection are disclosed, including an enclosure, a collector coupled to the enclosure, an interrogator, a processor, and one or more RFID field sensors, each having an individual identification, disposed within the enclosure. In operation, the interrogator transmits an incident signal to the collector, causing the collector to generate an electromagnetic field within the enclosure. The electromagnetic field is affected by one or more influences. RFID sensors respond to the electromagnetic field by transmitting reflected signals containing the individual identifications of the responding RFID sensors to the interrogator. The interrogator receives the reflected signals, measures one or more returned signal strength indications ("RSSI") of the reflected signals and sends the RSSI measurements and identification of the responding RFID sensors to the processor to determine one or more facts about the influences. Other embodiments are also described.

  7. Measurement Theory Based on the Truth Values Violates Local Realism

    NASA Astrophysics Data System (ADS)

    Nagata, Koji

    2017-02-01

    We investigate the violation factor of the Bell-Mermin inequality. Until now, the assumption has been that the results of measurement are ±1; in this case, the maximum violation factor is 2^((n-1)/2), and the quantum predictions of the n-partite Greenberger-Horne-Zeilinger (GHZ) state violate the Bell-Mermin inequality by an amount that grows exponentially with n. Recently, a new measurement theory based on truth values was proposed (Nagata and Nakamura, Int. J. Theor. Phys. 55:3616, 2016), in which the values of measurement outcomes are either +1 or 0. Here we use the new measurement theory and consider the multipartite GHZ state. It turns out that the Bell-Mermin inequality is violated by the amount 2^((n-1)/2). The measurement theory based on truth values thus provides the maximum violation of the Bell-Mermin inequality.
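
    The ±1 baseline mentioned above can be checked numerically for n = 3 with the standard Mermin operator and the GHZ state; the quantum expectation is 4 against a local-realist bound of 2, giving the violation factor 2 = 2^((3-1)/2). This is our own illustration of the standard case, not the record's truth-value construction.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

# Mermin operator for three qubits: M3 = XXX - XYY - YXY - YYX
M3 = kron3(X, X, X) - kron3(X, Y, Y) - kron3(Y, X, Y) - kron3(Y, Y, X)

# GHZ state (|000> + |111>) / sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

quantum = float(np.real(ghz.conj() @ M3 @ ghz))  # equals 4 analytically
classical_bound = 2.0                            # local-realist bound on M3
violation = quantum / classical_bound            # 2 = 2**((3 - 1) / 2)
```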

  8. Theory-Based Bayesian Models of Inductive Inference

    DTIC Science & Technology

    2010-06-30

    Oxford University Press. 28. Griffiths, T. L. and Tenenbaum, J. B. (2007). Two proposals for causal grammar. In A. Gopnik and L. Schulz (eds.), Causal Learning. Oxford University Press. 29. Tenenbaum, J. B., Kemp, C., Shafto, P. (2007). Theory-based Bayesian models for inductive reasoning. In A. Feeney and E. Heit (eds.), Induction. Cambridge University Press. 30. Goodman, N. D., Tenenbaum, J. B., Griffiths, T. L., & Feldman, J. (2008). Compositionality in rational analysis: Grammar-based induction for concept

  9. A new template matching method based on contour information

    NASA Astrophysics Data System (ADS)

    Cai, Huiying; Zhu, Feng; Wu, Qingxiao; Li, Sicong

    2014-11-01

    Template matching is a significant approach in machine vision due to its effectiveness and robustness. However, most template matching methods are so time consuming that they cannot be used in many real-time applications. The closed-contour matching method is a popular kind of template matching. This paper presents a new closed-contour template matching method that is suitable for two-dimensional objects. A coarse-to-fine searching strategy is used to improve matching efficiency, and a partial computation elimination scheme is proposed to further speed up the searching process. The method consists of offline model construction and online matching. In model construction, triples and a distance image are obtained from the template image. A certain number of triples, each composed of three points, are created from the contour information extracted from the template image; the rule for selecting the three points is that they divide the template contour into three equal parts. The distance image is obtained by distance transform: each point of the distance image represents the nearest distance between the current point and the points on the template contour. During matching, triples of the searching image are created by the same rule as the triples of the model. Through the similarity between triangles, which is invariant to rotation, translation and scaling, the triples corresponding to the triples of the model are found, yielding the initial RST (rotation, translation and scaling) parameters mapping the searching contour to the template contour. To speed up the searching process, the points on the searching contour are sampled to reduce the number of triples. To verify the RST parameters, the searching contour is projected into the distance image, and the mean distance can be computed rapidly by simple operations of addition and multiplication. In the fine searching process
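
    The verification step described above, projecting a candidate contour into the template's distance image and averaging, can be sketched as follows. The function names, the brute-force distance transform and the toy square contour are our own illustration, not the paper's implementation.

```python
import numpy as np

def distance_image(shape, contour):
    """Brute-force distance transform: nearest distance to any contour point."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    pts = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
    d = np.sqrt(((pts[:, None, :] - contour[None, :, :]) ** 2).sum(-1))
    return d.min(axis=1).reshape(shape)

def mean_contour_distance(dist_img, contour):
    """Mean of distance-image values under the (rounded) contour points."""
    idx = np.round(contour).astype(int)
    return dist_img[idx[:, 0], idx[:, 1]].mean()

# square template contour (corner points only) and a shifted candidate
t = np.array([[2, 2], [2, 6], [6, 6], [6, 2]], dtype=float)
D = distance_image((10, 10), t)
exact = mean_contour_distance(D, t)          # template matches itself
shifted = mean_contour_distance(D, t + 1.0)  # misaligned candidate scores worse
```

    Because the distance image is precomputed offline, each online verification really does reduce to lookups and an average, which is why it is fast.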

  10. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2010-11-01

    In the construction of DEMs, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have seen many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modeling method based on the theory of surfaces. Presently, HASM is only used for scattered-point interpolation, so the work in this paper attempts to construct a DEM by the HASM method from characteristic terrain information, namely stream lines and scattered points. The course is described in the following steps. Firstly, a TIN (Triangulated Irregular Network) is generated from the scattered points. Secondly, each segment of the stream lines is oriented to represent the stream lines' flow direction, and a tree data structure (with parent, children and brothers) is used to represent all the stream-line segments; a segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlapping or repetition, from the most upper reaches to the most lower reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start and end point of each WCF line is predicted by searching the TIN. Thirdly, given a cell size, a 2-D matrix for the research region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line. Fourthly, all the valued cells passed through by the stream lines, together with those from the scattered points, are gathered as known scattered sampling points, and then HASM is used to construct the final DEM. A case study on a typical plateau landform of China, KongTong gully of Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data are manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines
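
    The third step above, valuing the cells traversed by a stream line by linear interpolation, can be sketched for a single segment. The grid, helper name and sampling scheme are illustrative only, not the HASM code.

```python
import numpy as np

def rasterize_segment(grid, p0, z0, p1, z1, n=100):
    """Write linearly interpolated elevations along the segment p0 -> p1
    into every grid cell the segment passes through (nearest-cell rounding)."""
    for t in np.linspace(0.0, 1.0, n):
        y, x = (1 - t) * np.array(p0) + t * np.array(p1)
        grid[int(round(y)), int(round(x))] = (1 - t) * z0 + t * z1
    return grid

g = np.full((5, 5), np.nan)                      # NaN marks unvalued cells
g = rasterize_segment(g, (0, 0), 10.0, (4, 4), 20.0)
```

    The valued cells would then join the scattered points as known samples for the HASM solve.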

  11. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2009-09-01

    In the construction of DEMs, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have seen many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modeling method based on the theory of surfaces. Presently, HASM is only used for scattered-point interpolation, so the work in this paper attempts to construct a DEM by the HASM method from characteristic terrain information, namely stream lines and scattered points. The course is described in the following steps. Firstly, a TIN (Triangulated Irregular Network) is generated from the scattered points. Secondly, each segment of the stream lines is oriented to represent the stream lines' flow direction, and a tree data structure (with parent, children and brothers) is used to represent all the stream-line segments; a segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlapping or repetition, from the most upper reaches to the most lower reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start and end point of each WCF line is predicted by searching the TIN. Thirdly, given a cell size, a 2-D matrix for the research region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line. Fourthly, all the valued cells passed through by the stream lines, together with those from the scattered points, are gathered as known scattered sampling points, and then HASM is used to construct the final DEM. A case study on a typical plateau landform of China, KongTong gully of Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data are manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines

  12. In pursuit of a valid Information Assessment Method for continuing education: a mixed methods study.

    PubMed

    Bindiganavile Sridhar, Soumya; Pluye, Pierre; Grad, Roland

    2013-10-07

    The Information Assessment Method (IAM) is a popular tool for continuing education and knowledge translation. After a search for information, the IAM allows the health professional to report the objective of the search, its cognitive impact, and any use and patient health benefit associated with the retrieved health information. In continuing education programs, professionals read health information, rate it using the IAM, and earn continuing education credit for this brief individual reflective learning activity. IAM items have been iteratively developed using literature reviews and qualitative studies. Thus, our research question was: what is the content validity of IAM items from the users' perspective? A two-step content validation study was conducted. In Step 1, we followed a mixed methods research design and assessed the relevance and representativeness of IAM items; in this step, data from a longitudinal quantitative study and a qualitative multiple case study involving 40 family physicians were analyzed. In Step 2, IAM items were analyzed and modified by a multi-disciplinary expert panel, based on a set of guiding principles. The content validity of 16 IAM items was supported, and these items were not changed. Nine other items were modified, and three new items were added, including two that were extensions of an existing item. A content-validated version of the IAM (IAM 2011) is available for the continuing education of health professionals.

  13. Combining Methods in Evaluating Information Systems: Case Study of a Clinical Laboratory Information System

    PubMed Central

    Kaplan, Bonnie; Duchon, Dennis

    1989-01-01

    This paper reports how quantitative and qualitative methods were combined in a case study of a clinical laboratory information system. The study assessed effects of the system on work in the laboratories, as reported by laboratory technologists seven months post implementation. Primary changes caused by the computer system were increases in the amount of paperwork performed by technologists and improvements in laboratory results reporting. Individual technologists, as well as laboratories, differed in their assessments of the system according to their perceptions of how it affected their jobs. The combination of qualitative and quantitative data led to the development of a theoretical explanatory model of these differences. The paper discusses methodological implications and the need for an approach to evaluating medical computer systems that takes into account the inter-relationships and processes of which the system is a part.

  14. Agent-based method for distributed clustering of textual information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Reed, Joel W [Knoxville, TN; Elmore, Mark T [Oak Ridge, TN; Treadwell, Jim N [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term and identifying the documents found in the search, and displays the documents in a clustering display (80) of similarity so as to indicate similarity of the documents to each other.
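
    The evaluation-and-routing decision described above can be caricatured with cosine similarity: a new document vector goes to the cluster whose centroid it most resembles, or starts a new cluster when nothing is similar enough. The helper below and its threshold are hypothetical stand-ins for the agents' similarity values, not the patented implementation.

```python
import numpy as np

def route(doc_vec, centroids, threshold=0.8):
    """Return the index of the best-matching centroid, or None to signal
    that a new cluster (a new cluster agent) should be created."""
    if not centroids:
        return None
    sims = [np.dot(doc_vec, c) / (np.linalg.norm(doc_vec) * np.linalg.norm(c))
            for c in centroids]
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None

centroids = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
hit = route(np.array([0.9, 0.1]), centroids)    # close to the first centroid
miss = route(np.array([0.7, 0.7]), centroids)   # equally far from both
```

    In the patented system this decision is distributed: the multiplexing agent collects similarity values returned by the cluster agents instead of computing them centrally.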

  15. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics and precision, even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on the rural cadastral parcel. It puts forward a geocoding standard that consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on the postal code is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code can enhance the extensibility and flexibility of address geocoding.
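
    A hypothetical rendering of the three-part code described above: the record does not give the standard's actual field widths or symbols, so the postal-code prefix, direction-plus-distance bucket and extension format below are entirely our own invention.

```python
def build_address_code(postal: str, direction: str, distance_m: float,
                       extension: str = "") -> str:
    """Compose an illustrative three-part rural address code:
    absolute position (postal code), relative position (compass direction
    plus distance bucketed to 10 m), and an optional extended code."""
    rel = f"{direction}{int(distance_m // 10):03d}"
    code = f"{postal}-{rel}"
    return f"{code}-{extension}" if extension else code

full_code = build_address_code("100094", "NE", 235, "A1")  # '100094-NE023-A1'
short_code = build_address_code("100094", "N", 5)          # '100094-N000'
```

    Keeping the postal code as a stable prefix is what makes such codes easy to memorize, while the relative part carries the locatable direction-and-distance information.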

  16. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, Yanbing

    2009-09-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by individual departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, the paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard consisting of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on postal codes is stable and easy to memorize, that two-dimensional coding based on direction and distance is easy to locate and memorize, and that the extended code enhances the extensibility and flexibility of the address geocoding.

  17. Formative research to develop theory-based messages for a Western Australian child drowning prevention television campaign: study protocol

    PubMed Central

    Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine

    2016-01-01

    Introduction: Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis: The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination: This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621

  18. Informative Parameters of Dynamic Geo-electricity Methods

    NASA Astrophysics Data System (ADS)

    Tursunmetov, R.

    With the growing complexity of geological tasks and the need to reveal anomalous zones connected with ore, oil, gas and water availability, methods of dynamic geo-electricity have come into use. In these methods the geological environment is treated as an interphase, irregular medium. Its main dynamic element is the double electric layer, which develops on the boundary between the solid and liquid phases. In ore- or water-saturated environments, double electric layers become electrochemically or electrokinetically active elements of the geo-electric environment, which in turn form a natural electric field. This field influences the distribution of artificially created fields, and their interaction has a complicated superposition or nonlinear character. The geological environment is therefore considered an active medium, able to accumulate and transform artificially superposed fields. Its main dynamic property is the nonlinear dependence of specific electric resistance and soil polarization on current density and measurement frequency, which serve as the informative parameters of dynamic geo-electricity methods. The study of the electric properties of disperse soils in the impulse-frequency regime, together with the temporal and frequency characteristics of the electric field, is of main interest for defining geo-electric anomalies. The study of the volt-ampere characteristics of the electromagnetic field has great practical significance; these characteristics are determined by electrochemically active ore- and water-saturated zones. The parameters depend on the polarity of the initiating field, in particular on the character, composition and mineralization of the ore-saturated zone and on the presence of a natural electric field under cathode and anode mineralization. The nonlinear behavior of the environment's dynamic properties affects the structure of the initiated field, which allows anomalous zones to be located. Finally, the study of the dynamic properties of soil anisotropy in space will allow filtration flows to be identified.

  19. An inventory-theory-based interval-parameter two-stage stochastic programming model for water resources management

    NASA Astrophysics Data System (ADS)

    Suo, M. Q.; Li, Y. P.; Huang, G. H.

    2011-09-01

    In this study, an inventory-theory-based interval-parameter two-stage stochastic programming (IB-ITSP) model is proposed by integrating inventory theory into an interval-parameter two-stage stochastic optimization framework. This method can not only address system uncertainties with complex presentation but also represent the transfer batch (the quantity transferred at one time) and the transfer period (the corresponding cycle time) in decision-making problems. A case study of water allocation in water resources management planning demonstrates the applicability of the method. Under different flow levels, the method generates different transfer measures when the promised water cannot be delivered. Moreover, interval solutions associated with different transfer costs are also provided; they can be used to generate decision alternatives and thus help water resources managers identify desired policies. Compared with the ITSP method, the IB-ITSP model provides a positive measure for solving water shortage problems and affords useful information for decision makers under uncertainty.
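The flavour of the two-stage decision (promise water now, pay recourse costs once the random flow is known) can be shown with a toy scenario model. All numbers and the simple expected-profit objective are invented; the actual IB-ITSP model additionally uses interval parameters and inventory-theoretic transfer batches.

```python
# Toy two-stage recourse sketch (hypothetical numbers, not the IB-ITSP model):
# first stage promises a water allocation w; in the second stage, each flow
# scenario incurs a penalty for any shortage that must be covered by transfers.
scenarios = [(0.3, 40.0), (0.5, 60.0), (0.2, 80.0)]  # (probability, available flow)
benefit, penalty = 5.0, 12.0   # per-unit benefit of promised water / shortage cost

def expected_profit(w):
    profit = benefit * w
    for p, flow in scenarios:
        shortage = max(0.0, w - flow)       # recourse needed in this scenario
        profit -= p * penalty * shortage
    return profit

# enumerate promised allocations; the optimum balances benefit against
# the probability-weighted shortage penalty
best = max(range(0, 101), key=lambda w: expected_profit(float(w)))
print(best)  # -> 60
```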

  20. Theory-Based Interventions to Improve Medication Adherence in Older Adults Prescribed Polypharmacy: A Systematic Review.

    PubMed

    Patton, Deborah E; Hughes, Carmel M; Cadogan, Cathal A; Ryan, Cristín A

    2017-02-01

    Previous interventions have shown limited success in improving medication adherence in older adults, and this may be due to the lack of a theoretical underpinning. This review sought to determine the effectiveness of theory-based interventions aimed at improving medication adherence in older adults prescribed polypharmacy and to explore the extent to which psychological theory informed their development. Eight electronic databases were searched from inception to March 2015, and extensive hand-searching was conducted. Interventions delivered to older adults (populations with a mean/median age of ≥65 years) prescribed polypharmacy (four or more regular oral/non-oral medicines) were eligible. Studies had to report an underpinning theory and measure at least one adherence and one clinical/humanistic outcome. Data were extracted independently by two reviewers and included details of intervention content, delivery, providers, participants, outcomes and theories used. The theory coding scheme (TCS) was used to assess the extent of theory use. Five studies cited theory as the basis for intervention development (social cognitive theory, health belief model, transtheoretical model, self-regulation model). The extent of theory use and intervention effectiveness in terms of adherence and clinical/humanistic outcomes varied across studies. No study made optimal use of theory as recommended in the TCS. The heterogeneity observed and inclusion of pilot designs mean conclusions regarding effectiveness of theory-based interventions targeting older adults prescribed polypharmacy could not be drawn. Further primary research involving theory as a central component of intervention development is required. The review findings will help inform the design of future theory-based adherence interventions.

  1. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1989-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  2. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1986-12-02

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  3. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1989-01-24

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  4. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  5. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  6. A Structural Optimization Method for Information Resource Management.

    DTIC Science & Technology

    1985-12-01

    1982; Hirschheim, 1983). According to Matlin (1980), management of a resource suggests opportunities to conserve that resource, be effective and...1982; Matlin, 1980). It is this continuing and expanding reuse of the information resource that determines the well-being of organizations in an...Rita. "Records, Words, Data...The Stage of Information Management - Part 2." Information and Records Management 16 (July 1982): 18-20. Matlin, Gerald

  7. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method

    PubMed Central

    2017-01-01

    Background: The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective: Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods: A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results: Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant.

  8. Theory-based Bayesian models of inductive learning and reasoning.

    PubMed

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
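A minimal sketch of the theory-based Bayesian idea: structured hypotheses carry priors (the "theory"), a size-principle likelihood favours smaller consistent hypotheses, and the posterior supports strong generalization from sparse examples. The hypothesis space and numbers are illustrative, not from the paper.

```python
from fractions import Fraction

# Hypotheses are structured sets of numbers; priors encode theory-based weight.
hypotheses = {
    "even":        ({2, 4, 6, 8, 10, 12, 14, 16}, Fraction(1, 2)),
    "powers_of_2": ({2, 4, 8, 16},                Fraction(1, 2)),
}

def posterior(data):
    # Size principle: P(data | h) = (1/|h|)^n if every datum is in h, else 0.
    scores = {}
    for name, (members, prior) in hypotheses.items():
        if set(data) <= members:
            scores[name] = prior * Fraction(1, len(members)) ** len(data)
        else:
            scores[name] = Fraction(0)
    z = sum(scores.values())
    return {name: s / z for name, s in scores.items()}

post = posterior([2, 8, 16])
# Three examples consistent with both hypotheses strongly favour the
# smaller one ("powers_of_2"), illustrating sparse-data generalization.
print(post["powers_of_2"])  # -> 8/9
```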

  9. An Organizational Model to Distinguish between and Integrate Research and Evaluation Activities in a Theory Based Evaluation

    ERIC Educational Resources Information Center

    Sample McMeeking, Laura B.; Basile, Carole; Cobb, R. Brian

    2012-01-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its…

  10. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  11. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads electronic information, analyzes it according to a user-defined set of logical rules, and returns a classification result. The system may accept any form of computer-readable electronic information. It creates a hash table in which each entry contains a concept corresponding to a word or phrase the system has previously encountered. The system creates an object model based on the user-defined logical associations, which is used to review each concept contained in the electronic information in order to determine whether the electronic information can be classified. The system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system cannot find a token in the hash table, that token is added to a missing-terms list. If any rule is satisfied during propagation of the concepts through the object model, the electronic information is classified.
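A miniature of the described pipeline might look as follows; the concept table, rule format, and whitespace tokenization are invented stand-ins for the patent's components.

```python
# Hypothetical miniature of the patent's pipeline: a concept hash table,
# user-defined rules over concepts, and a missing-terms list.
concepts = {"nuclear": "ENERGY", "reactor": "ENERGY", "budget": "FINANCE"}

def classify(text, rules):
    seen, missing = set(), []
    for token in text.lower().split():
        concept = concepts.get(token)     # hash-table lookup
        if concept is None:
            missing.append(token)         # unknown token -> missing-terms list
        else:
            seen.add(concept)
    for label, required in rules:
        if required <= seen:              # rule satisfied -> classified
            return label, missing
    return None, missing

rules = [("technical", {"ENERGY"})]
label, missing = classify("nuclear reactor budget report", rules)
print(label)    # -> technical
print(missing)  # -> ['report']
```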

  12. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  13. Informal Learning of Social Workers: A Method of Narrative Inquiry

    ERIC Educational Resources Information Center

    Gola, Giancarlo

    2009-01-01

    Purpose: The purpose of this paper is to investigate social workers' processes of informal learning, through their narration of their professional experience, in order to understand how social workers learn. Informal learning is any individual practice or activity that is able to produce continuous learning; it is often non-intentional and…

  14. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    PubMed

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  15. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    PubMed

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.
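The "model selection using information criteria" option can be sketched in its simplest form, comparing an equality-constrained model of the means against an unrestricted one via AIC. Order-restricted hypotheses need specialized criteria (e.g., the GORIC) and are omitted here; the data below are made up.

```python
import math, statistics

def gaussian_loglik(xs, mu, sigma):
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu) ** 2 / (2 * sigma**2) for x in xs)

def aic(groups, equal_means):
    # ML fit with a common variance; k = number of mean parameters + 1 (variance)
    data = [x for g in groups for x in g]
    mus = ([statistics.fmean(data)] * len(groups) if equal_means
           else [statistics.fmean(g) for g in groups])
    resid = [x - mu for g, mu in zip(groups, mus) for x in g]
    sigma = math.sqrt(sum(r * r for r in resid) / len(data))
    ll = sum(gaussian_loglik(g, mu, sigma) for g, mu in zip(groups, mus))
    k = (1 if equal_means else len(groups)) + 1
    return 2 * k - 2 * ll

# three groups with clearly different means: equality is penalized despite
# its smaller parameter count, so the unrestricted model wins on AIC
groups = [[1.0, 1.2, 0.9], [2.1, 2.0, 2.3], [3.2, 2.9, 3.1]]
print(aic(groups, equal_means=True) > aic(groups, equal_means=False))  # -> True
```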

  16. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website.

    PubMed

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L

    2016-06-10

    According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes.

  17. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  18. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  19. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  20. Improve Biomedical Information Retrieval using Modified Learning to Rank Methods.

    PubMed

    Xu, Bo; Lin, Hongfei; Lin, Yuan; Ma, Yunlong; Yang, Liang; Wang, Jian; Yang, Zhihao

    2016-06-14

    In recent years, the number of biomedical articles has increased exponentially, making it difficult for biologists to capture all the needed information manually. Information retrieval technologies, as the core of search engines, can deal with the problem automatically, providing users with the needed information. However, it is a great challenge to apply these technologies directly to biomedical retrieval because of the abundance of domain-specific terminology. To enhance biomedical retrieval, we propose a novel framework based on learning to rank, a family of state-of-the-art information retrieval techniques that has proven effective in many retrieval tasks. In the proposed framework, we tackle the problem of abundant terminology by constructing ranking models that focus not only on retrieving the most relevant documents but also on diversifying the search results to increase the completeness of the result list for a given query. In model training, we propose two novel document labeling strategies and combine several traditional retrieval models as learning features. We also investigate the usefulness of different learning-to-rank approaches within our framework. Experimental results on TREC Genomics datasets demonstrate the effectiveness of our framework for biomedical information retrieval.
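A pairwise learning-to-rank scheme of the kind such frameworks build on can be sketched with a perceptron-style update: learn a weight vector so that, for each query, relevant documents score above non-relevant ones. The feature vectors and training pairs below are invented, and the paper's labeling strategies and diversification step are not reproduced.

```python
# Minimal pairwise learning-to-rank sketch (in the spirit of ranking SVMs):
# each training pair is (features of better doc, features of worse doc).
def train_pairwise(pairs, n_features, epochs=50, lr=0.1):
    w = [0.0] * n_features
    for _ in range(epochs):
        for better, worse in pairs:
            margin = sum(wi * (b - c) for wi, b, c in zip(w, better, worse))
            if margin <= 0:  # misordered pair -> perceptron-style update
                for i in range(n_features):
                    w[i] += lr * (better[i] - worse[i])
    return w

# hypothetical features: [BM25 score, terminology-match score]
pairs = [([2.0, 1.0], [1.0, 0.2]), ([1.5, 0.9], [1.6, 0.1])]
w = train_pairwise(pairs, 2)
score = lambda x: sum(wi * xi for wi, xi in zip(w, x))
print(score([2.0, 1.0]) > score([1.0, 0.2]))  # -> True
```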

  1. Linear and nonlinear information flow based on time-delayed mutual information method and its application to corticomuscular interaction.

    PubMed

    Jin, Seung-Hyun; Lin, Peter; Hallett, Mark

    2010-03-01

    To propose a model-free method for showing linear and nonlinear information flow based on time-delayed mutual information (TDMI) by employing uni- and bivariate surrogate tests, and to investigate whether nonlinear information flow contributes to corticomuscular (CM) interaction. Using simulated data, we tested whether our method would successfully detect the direction of information flow and identify a relationship between two simulated time series. As an experimental application, we applied the method to CM interaction during a right wrist extension task. Simulation results show that we can correctly detect the direction of information flow and the relationship between two time series without prior knowledge of the dynamics of their generating systems. Experimentally, we found both linear and nonlinear information flow from the contralateral sensorimotor cortex to muscle. Our method is a viable model-free measure of temporally varying causal interactions that is capable of distinguishing linear from nonlinear information flow. With respect to the experimental application, there are both linear and nonlinear information flows in CM interaction from the contralateral sensorimotor cortex to muscle, which may reflect the motor command from brain to muscle. This is the first study to show separate linear and nonlinear information flow in CM interaction.
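A minimal TDMI computation (histogram-based mutual information at increasing delays) might look like the sketch below; the bin count, coupling strength, and signal lengths are arbitrary, and the surrogate tests that separate linear from nonlinear flow are omitted.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    # histogram estimate of I(X;Y) in nats
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

def tdmi(x, y, max_lag):
    # I(x(t); y(t+tau)) for tau = 1..max_lag; a peak at tau > 0
    # suggests information flow from x to y with that delay
    return [mutual_information(x[:-lag], y[lag:]) for lag in range(1, max_lag + 1)]

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.roll(x, 3) + 0.1 * rng.normal(size=2000)  # y lags x by 3 samples
lags = tdmi(x, y, 6)
print(int(np.argmax(lags)) + 1)  # -> 3
```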

  2. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two dimensions by procedures that can readily be generalized to treat complex shapes in three dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and the drag minimization problem.
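
    The "computationally inexpensive gradient information" comes from the adjoint formulation of control theory: one extra linear solve yields the gradient with respect to every design variable at once. The sketch below is a toy under simplifying assumptions, with a 2x2 linear state equation standing in for the discretized Euler equations, not the paper's implementation.

```python
# Adjoint gradient sketch: J(a) = g(u(a)) subject to the state equation
# A u = f(a). One adjoint solve (A^T lam = dg/du) gives dJ/da for every
# design variable a_k, independent of how many there are.

def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

A = [[4.0, 1.0], [1.0, 3.0]]
At = [[4.0, 1.0], [1.0, 3.0]]  # A is symmetric here, so A^T = A

def f(a):    return [a[0] + a[1], 2.0 * a[1]]   # source term f(a)
def dfda(a): return [[1.0, 1.0], [0.0, 2.0]]    # df_i / da_k
def g(u):    return u[0] ** 2 + u[1]            # objective g(u)
def dgdu(u): return [2.0 * u[0], 1.0]

def adjoint_grad(a):
    u = solve2(A, f(a))          # forward (state) solve
    lam = solve2(At, dgdu(u))    # adjoint solve
    J = dfda(a)
    # dJ/da_k = lam . (df/da_k), since u = A^{-1} f(a)
    return [sum(lam[i] * J[i][k] for i in range(2)) for k in range(2)]

a0 = [1.0, 2.0]
grad = adjoint_grad(a0)

# Cross-check against finite differences on the full objective.
def objective(a):
    return g(solve2(A, f(a)))
h = 1e-6
fd = [(objective([a0[0] + h, a0[1]]) - objective(a0)) / h,
      (objective([a0[0], a0[1] + h]) - objective(a0)) / h]
print(grad, fd)  # the two gradients agree
```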

  3. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web site, or by writing to the TSP record keeper. ...

  4. Methodical Bases for the Regional Information Potential Estimation

    ERIC Educational Resources Information Center

    Ashmarina, Svetlana I.; Khasaev, Gabibulla R.; Mantulenko, Valentina V.; Kasarin, Stanislav V.; Dorozhkin, Evgenij M.

    2016-01-01

    The relevance of the investigated problem is caused by the need to assess the implementation of informatization level of the region and the insufficient development of the theoretical, content-technological, scientific and methodological aspects of the assessment of the region's information potential. The aim of the research work is to develop a…

  5. Information Theory: A Method for Human Communication Research.

    ERIC Educational Resources Information Center

    Black, John W.

    This paper describes seven experiments related to human communication research. The first two experiments discuss studies treating the aural responses of listeners. The third experiment was undertaken to estimate the information of sounds and diagrams which might lead to an estimate of the redundancy ascribed to the phonetic structure of words. A…

  6. A multi-method approach to evaluate health information systems.

    PubMed

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is difficult to weigh the contributions of various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and the author's own practice of evaluating health information systems, the author proposes a multi-method approach that incorporates both quantitative and qualitative measurement and is centered on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  7. Statistical methods of combining information: Applications to sensor data fusion

    SciTech Connect

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.

  8. Paper Trail: One Method of Information Literacy Assessment

    ERIC Educational Resources Information Center

    Nutefall, Jennifer

    2004-01-01

    Assessing students' information literacy skills can be difficult depending on the involvement of the librarian in a course. To overcome this, librarians created an assignment called the Paper Trail, where students wrote a short essay about their research process and reflected on what they would do differently. Through reviewing and grading these…

  9. Verbal Information Processing Paradigms: A Review of Theory and Methods

    DTIC Science & Technology

    1984-09-01

    Keywords: information processing; cognitive psychology; verbal processing. … (1979; Cooper, 1980), block design problems (Royer, 1977), matrix pattern abstraction (Hunt, 1974), and comprehension of text (Frederiksen, 1978) … 12. Imagine or form an abstract representation of a stimulus. 13. Mentally rotate a spatial configuration. 14. Comprehend and analyze a language stimulus. 15. Judge …

  10. Contextualized theory-based predictors of intention to practice monogamy among adolescents in Botswana junior secondary schools: Results of focus group sessions and a cross-sectional study.

    PubMed

    Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G Anita

    2016-01-01

    Culture and tradition influence behaviour. Multiple and concurrent partnerships are held responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized "Theory of Planned Behaviour" was used to identify predictors of the intention to practice monogamy. A mixed-methods design was used: qualitative data came from focus groups and stories, and a survey was analyzed for quantitative data. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic belief and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative with quantitative approaches to inform the development of theory-based, culturally appropriate, and context-specific intervention strategies to reduce the risk of HIV.

  11. Contextualized theory-based predictors of intention to practice monogamy among adolescents in Botswana junior secondary schools: Results of focus group sessions and a cross-sectional study

    PubMed Central

    Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G. Anita

    2016-01-01

    Culture and tradition influence behaviour. Multiple and concurrent partnerships are held responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized "Theory of Planned Behaviour" was used to identify predictors of the intention to practice monogamy. A mixed-methods design was used: qualitative data came from focus groups and stories, and a survey was analyzed for quantitative data. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic belief and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative with quantitative approaches to inform the development of theory-based, culturally appropriate, and context-specific intervention strategies to reduce the risk of HIV. PMID:28090169

  12. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms suffer from a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. Danger theory emphasizes that danger signals generated by changes in the environment guide different levels of immune response, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts each antibody's concentration according to its own danger signals and then triggers immune responses of self-regulation, so population diversity can be maintained. Experimental results show that the algorithm has advantages in solution quality and population diversity. Compared with the influential optimization algorithms CLONALG, opt-aiNet, and dopt-aiNet, the algorithm achieves smaller error values and higher success rates and can find solutions that meet the required accuracies within the specified number of function evaluations. PMID:23483853

  13. Information storage medium and method of recording and retrieving information thereon

    DOEpatents

    Marchant, D. D.; Begej, Stefan

    1986-01-01

    Information storage medium comprising a semiconductor doped with first and second impurities or dopants. Preferably, one of the impurities is introduced by ion implantation. Conductive electrodes are photolithographically formed on the surface of the medium. Information is recorded on the medium by selectively applying a focused laser beam to discrete regions of the medium surface so as to anneal discrete regions of the medium containing lattice defects introduced by the ion-implanted impurity. Information is retrieved from the storage medium by applying a focused laser beam to annealed and non-annealed regions so as to produce a photovoltaic signal at each region.

  14. An efficient steganography method for hiding patient confidential information.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed; Nguyen, Hung

    2014-01-01

    This paper deals with the important issue of security and confidentiality of patient information when exchanging or storing medical images. Steganography has recently been viewed as an alternative or complement to cryptography, as existing cryptographic systems are not perfect due to their vulnerability to certain types of attack. We propose in this paper a new steganography algorithm for hiding patient confidential information. It utilizes Pixel Value Differencing (PVD) to identify contrast regions in the image and a Hamming code that embeds 3 secret message bits into 4 bits of the cover image. In order to preserve the content of the region of interest (ROI), the embedding is only performed using the Region of Non-Interest (RONI).
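
    The abstract's Hamming-code embedding is a form of syndrome coding. As a hedged illustration, the sketch below uses the classic binary Hamming parity-check matrix to hide 3 message bits in a group of 7 cover LSBs while flipping at most one bit; the paper's variant (3 bits into 4 cover bits, plus PVD region selection) uses a different matrix and is not reproduced here.

```python
# Syndrome-coding (matrix embedding) sketch with the [7,4] Hamming
# parity-check matrix H: the hidden message is the syndrome of a group
# of cover bits, and any 3-bit message is reachable by flipping at most
# one of the 7 bits.

# Columns of H are the binary numbers 1..7 (row 0 is the MSB).
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(bits):
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def embed(cover, msg):
    """Flip at most one of 7 cover bits so syndrome(stego) == msg."""
    s = [a ^ b for a, b in zip(syndrome(cover), msg)]
    out = cover[:]
    idx = s[0] * 4 + s[1] * 2 + s[2]  # index of the H column equal to s
    if idx:
        out[idx - 1] ^= 1
    return out

def extract(stego):
    return syndrome(stego)

cover = [1, 0, 1, 1, 0, 0, 1]   # LSBs of 7 cover pixels (illustrative)
msg = [1, 0, 1]
stego = embed(cover, msg)
print(extract(stego) == msg)                           # True
print(sum(a != b for a, b in zip(cover, stego)) <= 1)  # at most one flip
```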

  15. Control methods for improved Fisher information with quantum sensing

    NASA Astrophysics Data System (ADS)

    Gefen, Tuvia; Jelezko, Fedor; Retzker, Alex

    2017-09-01

    Recently, new approaches for sensing the frequency of time-dependent Hamiltonians have been presented, and it was shown that the optimal Fisher information scales as T^4. We present here our interpretation of this new scaling, where the relative phase is accumulated quadratically with time, and show that this can be produced by a variety of simple pulse sequences. Interestingly, this scaling has a limited duration, and we show that certain pulse sequences prolong the effect. The performance of these schemes is analyzed and we examine their relevance to state-of-the-art experiments. We also analyze the T^3 scaling of the Fisher information which appears when multiple synchronized measurements are performed, and which is the optimal scaling in the case of a finite coherence time.
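
    One way to see the T^4 scaling (an illustrative derivation consistent with the abstract's description, not necessarily the paper's exact scheme): if a pulse sequence makes the relative phase accumulate quadratically in time, phi(T) = omega T^2 / 2, then the Fisher information for the frequency omega from a phase readout is

```latex
F(\omega) \;=\; \left(\frac{\partial \varphi}{\partial \omega}\right)^{2}
          \;=\; \left(\frac{T^{2}}{2}\right)^{2}
          \;=\; \frac{T^{4}}{4},
```

    whereas the usual linear accumulation phi = omega T gives only F = T^2.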

  16. Methods of information theory and algorithmic complexity for network biology.

    PubMed

    Zenil, Hector; Kiani, Narsis A; Tegnér, Jesper

    2016-03-01

    We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdös-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity. Copyright © 2016 Elsevier Ltd. All rights reserved.
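
    Two of the surveyed quantities are easy to reproduce on synthetic graphs. The sketch below (pure Python, with illustrative parameters) measures the Shannon entropy of the degree distribution and shows the giant component of an Erdös-Rényi graph appearing as the mean degree crosses 1.

```python
# Shannon entropy of a graph's degree distribution, and the emergence
# of the giant component in Erdos-Renyi G(n, p) as the mean degree
# c = p(n-1) crosses the threshold c = 1.
import math
import random

def er_graph(n, p, rng):
    """Edge list of an Erdos-Renyi G(n, p) random graph."""
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

def largest_component(n, edges):
    """Size of the largest connected component (union-find)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for i, j in edges:
        parent[find(i)] = find(j)
    sizes = {}
    for v in range(n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values())

def degree_entropy(n, edges):
    """Shannon entropy (bits) of the empirical degree distribution."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    counts = {}
    for d in deg:
        counts[d] = counts.get(d, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

rng = random.Random(42)
n = 1000
results = {}
for c in (0.5, 2.0):  # mean degree below and above the threshold
    edges = er_graph(n, c / (n - 1), rng)
    results[c] = (largest_component(n, edges) / n, degree_entropy(n, edges))
    print(c, results[c])
# Below c = 1 the largest component stays tiny; above it, a giant
# component spanning most vertices emerges.
```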

  17. Factors influencing variation in physician adenoma detection rates: a theory-based approach

    PubMed Central

    Atkins, Louise; Hunkeler, Enid M.; Jensen, Christopher D.; Michie, Susan; Lee, Jeffrey K.; Doubeni, Chyke A.; Zauber, Ann G.; Levin, Theodore R.; Quinn, Virginia P.; Corley, Douglas A.

    2015-01-01

    Background & Aims Interventions to improve physician adenoma detection rates for colonoscopy have generally not been successful and there are little data on the factors contributing to variation that may be appropriate targets for intervention. We sought to identify factors that may influence variation in detection rates using theory-based tools for understanding behavior. Methods We separately studied gastroenterologists and endoscopy nurses at three Kaiser Permanente Northern California medical centers to identify potentially modifiable factors relevant to physician adenoma detection rate variability using structured group interviews (focus groups) and theory-based tools for understanding behavior and eliciting behavior change: the Capability, Opportunity, and Motivation behavior model; the Theoretical Domains Framework; and the Behavior Change Wheel. Results Nine factors potentially associated with detection rate variability were identified, including six related to capability (uncertainty about which types of polyps to remove; style of endoscopy team leadership; compromised ability to focus during an examination due to distractions; examination technique during withdrawal; difficulty detecting certain types of adenomas; and examiner fatigue and pain), two related to opportunity (perceived pressure due to the number of examinations expected per shift and social pressure to finish examinations before scheduled breaks or the end of a shift), and one related to motivation (valuing a meticulous examination as the top priority). Examples of potential intervention strategies are provided. Conclusions Using theory-based tools, this study identified several novel and potentially modifiable factors relating to capability, opportunity, and motivation that may contribute to adenoma detection rate variability and be appropriate targets for future intervention trials. PMID:26366787

  18. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  19. Method for modeling social care processes for national information exchange.

    PubMed

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare, including Adoption counselling, Income support, Child welfare, Services for immigrants, and Substance abuse care. This paper describes the method used for process modeling in the National Project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target-state processes from the perspective of a national electronic archive, increased interoperability between systems, and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used, and refined during the three years of process modeling in the national project.

  20. Theory-based metrological traceability in education: A reading measurement network.

    PubMed

    Fisher, William P; Stenner, A Jackson

    2016-10-01

    Huge resources are invested in metrology and standards in the natural sciences, engineering, and across a wide range of commercial technologies. Significant positive returns of human, social, environmental, and economic value on these investments have been sustained for decades. Proven methods for calibrating test and survey instruments in linear units are readily available, as are data- and theory-based methods for equating those instruments to a shared unit. Using these methods, metrological traceability is obtained in a variety of commercially available elementary and secondary English and Spanish language reading education programs in the U.S., Canada, Mexico, and Australia. Given established historical patterns, widespread routine reproduction of predicted text-based and instructional effects expressed in a common language and shared frame of reference may lead to significant developments in theory and practice. Opportunities for systematic implementations of teacher-driven lean thinking and continuous quality improvement methods may be of particular interest and value.

  1. Dissolved oxygen prediction using a possibility theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, Usman T.; Valeo, Caterina

    2016-06-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model performance is compared with a fuzzy neural network with crisp inputs, as well as with a traditional neural network. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.
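
    The paper's two-step fuzzy-number construction is specific to its data and is not reproduced here. As a generic stand-in, the sketch below builds a triangular fuzzy number from the minimum, median, and maximum of observations, and reads off the alpha-cut intervals that a possibility-based training scheme could use; the dissolved-oxygen readings are hypothetical.

```python
# Generic triangular fuzzy number from observations, with alpha-cuts.
# This is an illustrative stand-in, not the paper's two-step method.

def triangular_from_obs(obs):
    """(low, mode, high) from the min, median, and max of observations."""
    s = sorted(obs)
    return s[0], s[len(s) // 2], s[-1]

def alpha_cut(tfn, alpha):
    """Interval of membership >= alpha for a triangular fuzzy number."""
    lo, m, hi = tfn
    return (lo + alpha * (m - lo), hi - alpha * (hi - m))

do_obs = [6.1, 6.8, 7.0, 7.3, 7.9, 8.4, 9.0]  # hypothetical DO readings (mg/L)
tfn = triangular_from_obs(do_obs)
print(tfn)                  # (6.1, 7.3, 9.0)
print(alpha_cut(tfn, 0.5))  # interval at possibility level 0.5
```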

  2. Imaging systems and methods for obtaining and using biometric information

    DOEpatents

    McMakin, Douglas L [Richland, WA; Kennedy, Mike O [Richland, WA

    2010-11-30

    Disclosed herein are exemplary embodiments of imaging systems and methods of using such systems. In one exemplary embodiment, one or more direct images of the body of a clothed subject are received, and a motion signature is determined from the one or more images. In this embodiment, the one or more images show movement of the body of the subject over time, and the motion signature is associated with the movement of the subject's body. In certain implementations, the subject can be identified based at least in part on the motion signature. Imaging systems for performing any of the disclosed methods are also disclosed herein. Furthermore, the disclosed imaging, rendering, and analysis methods can be implemented, at least in part, as one or more computer-readable media comprising computer-executable instructions for causing a computer to perform the respective methods.

  3. A Method to Measure the Amount of Battlefield Situation Information

    DTIC Science & Technology

    2014-06-01

    … 3.2 Measurement of trends information: Kierkegaard once said, "Life can only be understood backwards, but it must be lived forwards" [8]. … "Formal Description of Command …" … 37(2), pp. 32-64, 1995. [8] Kierkegaard, Søren, "The Journals of Søren Kierkegaard", a selection translated by Alexander Dru, Oxford: Oxford University Press (1938), p. 465. [9] ZHOU Dao-an, ZHANG Dong-ge, CHANG Shu …

  4. A High Accuracy Method for Semi-supervised Information Extraction

    SciTech Connect

    Tratz, Stephen C.; Sanfilippo, Antonio P.

    2007-04-22

    Customization to specific domains of discourse and/or user requirements is one of the greatest challenges for today's Information Extraction (IE) systems. While demonstrably effective, both rule-based and supervised machine learning approaches to IE customization pose too high a burden on the user. Semi-supervised learning approaches may in principle offer a more resource-effective solution but are still insufficiently accurate for realistic application. We demonstrate that this limitation can be overcome by integrating fully-supervised learning techniques within a semi-supervised IE approach, without increasing resource requirements.

  5. Query by Browsing: An Alternative Hypertext Information Retrieval Method

    PubMed Central

    Frisse, Mark E.; Cousins, Steve B.

    1989-01-01

    In this paper we discuss our efforts to develop programs which enhance the ability to navigate through large medical hypertext systems. Our approach organizes hypertext index terms into a belief network and uses reader feedback to update the degree of belief in the index terms' utility to a query. We begin by describing various possible configurations for indexes to hypertext. We then describe how belief network calculations can be applied to these indexes. After a brief discussion of early results using manuscripts from a medical handbook, we close with an analysis of our approach's applicability to a wider range of hypertext information retrieval problems.
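
    The paper propagates belief through a network of index terms updated by reader feedback. As a much-simplified stand-in for that machinery (a hypothetical reduction, not the paper's belief-network calculation), the sketch below keeps an independent Beta belief per index term and re-ranks terms by expected utility after feedback.

```python
# Simplified relevance-feedback sketch: one Beta distribution per index
# term, updated from reader feedback, with terms re-ranked by expected
# utility. Terms and feedback events are invented for illustration.

class TermBelief:
    def __init__(self):
        self.a = 1.0  # pseudo-count of useful outcomes
        self.b = 1.0  # pseudo-count of not-useful outcomes

    def feedback(self, useful):
        if useful:
            self.a += 1
        else:
            self.b += 1

    def expected_utility(self):
        return self.a / (self.a + self.b)  # mean of Beta(a, b)

terms = {t: TermBelief() for t in ("anemia", "iron", "fatigue")}
# Reader follows links under "iron" twice and rejects "fatigue" once.
terms["iron"].feedback(True)
terms["iron"].feedback(True)
terms["fatigue"].feedback(False)

ranking = sorted(terms, key=lambda t: -terms[t].expected_utility())
print(ranking)  # ['iron', 'anemia', 'fatigue']
```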

  6. Extracting ocean surface information from altimeter returns - The deconvolution method

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ernesto; Chapman, Bruce

    1989-01-01

    An evaluation of the deconvolution method for estimating ocean surface parameters from ocean altimeter waveforms is presented. It is shown that this method presents a fast, accurate way of determining the ocean surface parameters from noisy altimeter data. Three parameters may be estimated by using this method, including the altimeter-height error, the ocean-surface standard deviation, and the ocean-surface skewness. By means of a Monte Carlo experiment, an 'optimum' deconvolution algorithm and the accuracies with which the above parameters may be estimated using this algorithm are determined. Then the influence of instrument effects, such as errors in calibration and pointing-angle estimation, on the estimated parameters is examined. Finally, the deconvolution algorithm is used to estimate height and ocean-surface parameters from Seasat data.

  7. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  8. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  9. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  10. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  11. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  12. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  14. Method for extracting long-equivalent wavelength interferometric information

    NASA Technical Reports Server (NTRS)

    Hochberg, Eric B. (Inventor)

    1991-01-01

    A process for extracting long-equivalent wavelength interferometric information from a two-wavelength polychromatic or achromatic interferometer. The process comprises the steps of simultaneously recording a non-linear sum of two different frequency visible light interferograms on a high resolution film and then placing the developed film in an optical train for Fourier transformation, low pass spatial filtering and inverse transformation of the film image to produce low spatial frequency fringes corresponding to a long-equivalent wavelength interferogram. The recorded non-linear sum irradiance derived from the two-wavelength interferometer is obtained by controlling the exposure so that the average interferogram irradiance is set at either the noise level threshold or the saturation level threshold of the film.
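
    The "long-equivalent wavelength" produced by mixing two wavelengths follows the standard two-wavelength interferometry relation lambda_eq = lambda1 * lambda2 / |lambda1 - lambda2|. The wavelengths below are illustrative, not taken from the patent.

```python
# Equivalent (synthetic) wavelength of a two-wavelength interferometer:
# two visible lines a few tens of nanometres apart synthesize a
# wavelength tens of times longer, hence the low-frequency fringes.

def equivalent_wavelength(lam1_nm, lam2_nm):
    return lam1_nm * lam2_nm / abs(lam1_nm - lam2_nm)

print(equivalent_wavelength(633.0, 612.0))  # ~18447 nm, i.e. ~18.4 um
```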

  15. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Riley, Tom; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during fiscal year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in probabilistic risk assessment (PRA) analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results comparing RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.
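
    The simulation-based PRA idea can be caricatured in a few lines: sample an offsite-power recovery time against a battery coping time and count core-damage outcomes. All numbers below are invented; the report's analyses use RAVEN driving RELAP thermal-hydraulic simulations rather than this closed-form toy.

```python
# Toy Monte Carlo for a station blackout (SBO): core damage occurs when
# offsite power is not restored within the battery coping time.
# Parameters are illustrative, not from the report.
import random

def sbo_core_damage_prob(coping_time_h, n=100_000, seed=7):
    rng = random.Random(seed)
    damage = 0
    for _ in range(n):
        recovery_h = rng.expovariate(1 / 4.0)  # mean 4 h to restore power
        if recovery_h > coping_time_h:
            damage += 1
    return damage / n

for coping in (4.0, 8.0):  # e.g. baseline vs extended battery life
    print(coping, sbo_core_damage_prob(coping))
# Longer coping time lowers the conditional core-damage probability;
# analytically exp(-coping/4): about 0.37 at 4 h and 0.14 at 8 h.
```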

  16. Implementing shared decision-making in nutrition clinical practice: A theory-based approach and feasibility study

    PubMed Central

    Desroches, Sophie; Gagnon, Marie-Pierre; Tapp, Sylvie; Légaré, France

    2008-01-01

    Background There is a growing number of dietary treatment options to choose from for the management of many chronic diseases. Shared decision making represents a promising approach to improving the quality of the decision-making process for dietary choices that are informed by the best evidence and are value-based. However, there are no studies reporting on theory-based approaches that foster the implementation of shared decision making in health professions allied to medicine. The objectives of this study are to explore the integration of shared decision making within real nutritional consultations, and to design questionnaires assessing dieticians' intention to adopt two specific behaviors related to shared decision making, using the Theory of Planned Behavior. Methods Forty dieticians will audiotape one clinical encounter to explore the presence of shared decision making within the consultation. They will also participate in one of five to six focus groups that aim to identify the salient beliefs underlying the determinants of their intention to present evidence-based dietary treatment options to their patients, and to clarify the values related to dietary choices that are important to their patients. These salient beliefs will be used to elaborate the items of two questionnaires. The internal consistency of the theoretical constructs and the temporal stability of their measurement will be checked using the test-retest method, by asking 35 dieticians to complete the questionnaire twice within a two-week interval. Discussion The proposed research project will be the first study to: provide preliminary data about the adoption of shared decision making by dieticians and their patients; elicit dieticians' salient beliefs regarding the intention to adopt shared decision-making behaviors; report on the development of a specific questionnaire; explore dieticians' views on the implementation of shared decision making; and compare their views regarding the implementation of

  17. ROOM: A recursive object oriented method for information systems development

    SciTech Connect

    Thelliez, T.; Donahue, S.

    1994-02-09

    Although complementary for the development of complex systems, top-down structured design and the object-oriented approach are still opposed and not integrated. As the complexity of systems continues to grow, and the so-called software crisis remains unsolved, it is urgent to provide a framework that mixes the two paradigms. This paper presents an elegant attempt in this direction through our Recursive Object-Oriented Method (ROOM), in which a top-down approach divides the complexity of the system and an object-oriented method studies a given level of abstraction. Illustrating this recursive schema with a simple example, we demonstrate that we achieve the goal of creating loosely coupled and reusable components.

  18. Testing a theory-based mobility monitoring protocol using in-home sensors: a feasibility study.

    PubMed

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J

    2013-10-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person ages. The purpose of this mixed-methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assess the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial, and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3-month, and 6-month visits. Semi-structured interviews to characterize acceptability of the technology were conducted at the 3-month and 6-month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation.

  19. Evaluating participatory decision processes: which methods inform reflective practice?

    PubMed

    Kaufman, Sanda; Ozawa, Connie P; Shmueli, Deborah F

    2014-02-01

    Evaluating participatory decision processes serves two key purposes: validating the usefulness of specific interventions for stakeholders, interveners and funders of conflict management processes, and improving practice. However, evaluation design remains challenging, partly because when attempting to serve both purposes we may end up serving neither well. In fact, the better we respond to one, the less we may satisfy the other. Evaluations tend to focus on endogenous factors (e.g., stakeholder selection, BATNAs, mutually beneficial tradeoffs, quality of the intervention), because we believe that the success of participatory decision processes hinges on them, and because they seem to lend themselves to ceteris paribus statistical comparisons across cases. We argue that context matters too, and that contextual differences among specific cases are meaningful enough to undermine conclusions derived solely from comparisons of process-endogenous factors implicitly rooted in the ceteris paribus assumption. We illustrate this argument with an environmental mediation case, comparing data collected about it through surveys geared toward comparability across cases with information elicited through in-depth interviews geared toward case specifics. The surveys, designed by the U.S. Institute of Environmental Conflict Resolution, feed a database of environmental conflicts that can help make the (statistical) case for intervention in environmental conflict management. Our interviews elicit case details - including context - that enable interveners to link context specifics and intervention actions to outcomes. We argue that neither approach can "serve both masters."

  20. Similarity theory based on the Dougherty-Ozmidov length scale

    NASA Astrophysics Data System (ADS)

    Grachev, Andrey A.; Andreas, Edgar L.; Fairall, Christopher W.; Guest, Peter S.; Persson, P. Ola G.

    2015-07-01

    A local similarity theory is proposed based on the Brunt-Vaisala frequency and the dissipation rate of turbulent kinetic energy instead of the turbulent fluxes used in traditional Monin-Obukhov similarity theory. Based on dimensional analysis (the Pi theorem), it is shown that any properly scaled statistics of the small-scale turbulence are universal functions of a stability parameter defined as the ratio of a reference height z to the Dougherty-Ozmidov length scale, which in the limit of z-less stratification is linearly proportional to the Obukhov length scale. Measurements of atmospheric turbulence made at five levels on a 20-m tower over the Arctic pack ice during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment are used to examine the behaviour of different similarity functions in the stable boundary layer. It is found that in the framework of this approach the non-dimensional turbulent viscosity is equal to the gradient Richardson number, whereas the non-dimensional turbulent thermal diffusivity is equal to the flux Richardson number. These results are a consequence of the approximate local balance between production of turbulence by the mean flow shear and viscous dissipation. The turbulence framework based on the Brunt-Vaisala frequency and the dissipation rate of turbulent kinetic energy may have practical advantages for estimating turbulence when the fluxes are not directly available.
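As a rough numerical sketch of the scaling in this record: the Dougherty-Ozmidov length is L_O = (ε/N³)^½ and the stability parameter is z/L_O. The function names and the sample values below are illustrative assumptions, not SHEBA measurements.

```python
import math

def ozmidov_length(epsilon, n_bv):
    """Dougherty-Ozmidov length scale L_O = sqrt(epsilon / N^3), where
    epsilon is the TKE dissipation rate (m^2 s^-3) and n_bv is the
    Brunt-Vaisala frequency (s^-1)."""
    return math.sqrt(epsilon / n_bv ** 3)

def stability_parameter(z, epsilon, n_bv):
    """The abstract's stability parameter: reference height z over L_O."""
    return z / ozmidov_length(epsilon, n_bv)

# Illustrative magnitudes only:
print(ozmidov_length(1e-4, 0.01))            # ~10 m
print(stability_parameter(5.0, 1e-4, 0.01))  # ~0.5
```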

  1. Quantum field theory based on birefringent modified Maxwell theory

    NASA Astrophysics Data System (ADS)

    Schreck, M.

    2014-04-01

    In the current paper the properties of a birefringent Lorentz-violating extension of quantum electrodynamics are considered. The theory results from coupling modified Maxwell theory, which is a CPT-even Lorentz-violating extension of the photon sector, to a Dirac theory of standard spin-1/2 particles. It is then restricted to a special birefringent case with one nonzero Lorentz-violating coefficient. The modified dispersion laws of electromagnetic waves are obtained, and their phase and group velocities are examined. After deriving the photon propagator and the polarization vectors for a special momentum configuration, we prove both unitarity at tree level and microcausality for the quantum field theory based on this Lorentz-violating modification. These analytical proofs are carried out for a spatial momentum with two vanishing components, and the proof of unitarity is supported by numerical investigations for the case in which all components are nonvanishing. The upshot is that the theory is well behaved within the framework of our assumptions, although there is a possible issue for negative Lorentz-violating coefficients. The paper shall provide a basis for the future analysis of alternative birefringent quantum field theories.

  2. Examination of an Electronic Patient Record Display Method to Protect Patient Information Privacy.

    PubMed

    Niimi, Yukari; Ota, Katsumasa

    2017-02-01

    Electronic patient records facilitate the provision of safe, high-quality medical care. However, because personnel can view almost all stored information, this study designed a display method that uses a mosaic blur (pixelation) to temporarily conceal information that patients do not want shared, balancing the patient's desire for personal information protection against the need for information sharing among medical personnel. First, medical personnel were interviewed about the degree of information required both for individual duties and for team-based care. Subsequently, they tested a mock display method that partially concealed information using a mosaic blur, and they were interviewed about the effectiveness of a display method that ensures patient privacy. Participants better understood patients' demand for confidentiality, suggesting increased awareness of patients' privacy protection. However, participants also indicated that temporary concealment of certain information was problematic. Other issues included the inconvenience of removing the mosaic blur to obtain required information and the risk of insufficient information for medical care. Despite several issues with a display method that temporarily conceals information according to patient privacy needs, medical personnel could accept such a method if information essential to medical safety remains accessible.
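The visual effect of a mosaic blur can be sketched as tile averaging. The abstract does not specify an algorithm, so the block-mean scheme below is an assumption for illustration only; a real system would presumably operate on rendered screen regions rather than raw arrays.

```python
def mosaic_blur(image, block=2):
    """Replace each block x block tile of a grayscale image (a list of
    rows of numbers) with the tile's mean value. Illustrative sketch of
    pixelation only; not the study's implementation."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(0, h, block):
        for x in range(0, w, block):
            ys = range(y, min(y + block, h))
            xs = range(x, min(x + block, w))
            vals = [image[j][i] for j in ys for i in xs]
            mean = sum(vals) / len(vals)
            for j in ys:
                for i in xs:
                    out[j][i] = mean
    return out

patch = [[0, 1], [4, 5]]
print(mosaic_blur(patch))  # every entry becomes the tile mean 2.5
```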

  3. Rough set theory based prognostic classification models for hospice referral.

    PubMed

    Gil-Herrera, Eleazar; Aden-Buie, Garrick; Yalcin, Ali; Tsalatsanis, Athanasios; Barnes, Laura E; Djulbegovic, Benjamin

    2015-11-25

    This paper explores and evaluates the application of classical and dominance-based rough set theory (RST) for the development of data-driven prognostic classification models for hospice referral. In this work, rough set based models are compared with other data-driven methods with respect to two factors related to clinical credibility: accuracy and accessibility. Accessibility refers to the ability of the model to provide traceable, interpretable results and to use data that are relevant and simple to collect. We utilize retrospective data from 9,103 terminally ill patients to demonstrate the design and implementation of RST-based models to identify potential hospice candidates. The classical rough set approach (CRSA) provides methods for knowledge acquisition, founded on the relational indiscernibility of objects in a decision table, to describe the conditions required for membership in a concept class. The dominance-based rough set approach (DRSA), on the other hand, analyzes information based on the monotonic relationships between condition attribute values and their assignment to the decision class. CRSA decision rules for six-month patient survival classification were induced using the MODLEM algorithm. Dominance-based decision rules were extracted using the VC-DomLEM rule induction algorithm. The RST-based classifiers are compared with other predictive and rule-based decision modeling techniques, namely logistic regression, support vector machines, random forests and C4.5. The RST-based classifiers demonstrate an average AUC of 69.74% with MODLEM and 71.73% with VC-DomLEM, while the compared methods achieve average AUCs of 74.21% for logistic regression, 73.52% for support vector machines, 74.59% for random forests, and 70.88% for C4.5. This paper contributes to the growing body of research in RST-based prognostic models. RST and its extensions possess features that enhance the accessibility of clinical decision support models. While the non-rule-based methods
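The classical-RST notion of indiscernibility the abstract mentions can be sketched in a few lines: objects sharing identical condition-attribute values form one class, and a class yields a certain rule only when all of its objects agree on the decision. The attribute names and toy table below are hypothetical, not the study's data.

```python
from collections import defaultdict

def indiscernibility_classes(table, condition_attrs):
    """Group objects that share identical values on the condition
    attributes (the classical indiscernibility relation)."""
    classes = defaultdict(list)
    for obj in table:
        classes[tuple(obj[a] for a in condition_attrs)].append(obj)
    return classes

def certain_rules(table, condition_attrs, decision_attr):
    """A class produces a certain rule only if all of its objects share
    one decision value (the lower-approximation idea behind CRSA)."""
    rules = []
    for key, objs in indiscernibility_classes(table, condition_attrs).items():
        decisions = {o[decision_attr] for o in objs}
        if len(decisions) == 1:
            rules.append((dict(zip(condition_attrs, key)), decisions.pop()))
    return rules

# Toy decision table with invented attributes:
patients = [
    {"mobility": "low",  "appetite": "poor", "survives6m": "no"},
    {"mobility": "low",  "appetite": "poor", "survives6m": "no"},
    {"mobility": "high", "appetite": "good", "survives6m": "yes"},
    {"mobility": "high", "appetite": "poor", "survives6m": "no"},
]
print(len(certain_rules(patients, ["mobility", "appetite"], "survives6m")))  # 3
```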

  4. A method for extracting drainage networks with heuristic information from digital elevation models.

    PubMed

    Hou, Kun; Yang, Wei; Sun, Jigui; Sun, Tieli

    2011-01-01

    Depression filling and direction assignment over flat areas are critical issues in hydrologic analysis. This paper proposes a method to handle depressions and flat areas in one procedure. Unlike traditional raster neighbourhood processing, which uses little heuristic information, the method is designed to compensate for the inadequate search information of other methods. The proposed method routes flow through depressions and flat areas by searching for the outlet using heuristic information, which reveals the general slope trend of the DEM (digital elevation model) and helps the method find the outlet accurately. The method is implemented in Pascal and experiments are carried out on actual DEM data. Comparison with four existing methods shows that the proposed method produces a closer match to the ground-truth network. Moreover, the proposed method avoids generating unrealistic parallel drainage lines, spurious drainage lines and spurious terrain features.

  5. Navigating Longitudinal Clinical Notes with an Automated Method for Detecting New Information

    PubMed Central

    Zhang, Rui; Pakhomov, Serguei; Lee, Janet T.; Melton, Genevieve B.

    2015-01-01

    Automated methods to detect new information in clinical notes may be valuable for navigating and using information in these documents for patient care. Statistical language models were evaluated as a means to quantify new information over longitudinal clinical notes for a given patient. The new information proportion (NIP) in target notes decreased logarithmically with increasing numbers of previous notes used to create the language model. For a given patient, the amount of new information followed cyclic patterns. Higher NIP scores correlated with notes containing more new information, often associated with clinically significant events, and lower NIP scores indicated notes with less new information. Our analysis also revealed “copying and pasting” to be widely used in generating clinical notes, with information copied forward from the most recent historical notes. These methods can potentially aid clinicians in finding notes with more clinically relevant new information and in reviewing notes more purposefully, which may increase the efficiency of clinicians in delivering patient care. PMID:23920658
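The NIP idea can be illustrated with a deliberately crude stand-in: score a target note by the fraction of its word tokens never seen in the patient's earlier notes. The published method uses statistical language models to score tokens; this unigram-overlap proxy is an assumption that only conveys the concept.

```python
def new_information_proportion(previous_notes, target_note):
    """Crude NIP proxy: fraction of word tokens in the target note that
    never appear in the patient's previous notes. The real method uses
    statistical language models; this only illustrates the idea."""
    seen = set()
    for note in previous_notes:
        seen.update(note.lower().split())
    tokens = target_note.lower().split()
    if not tokens:
        return 0.0
    return sum(t not in seen for t in tokens) / len(tokens)

history = ["patient stable on metformin",
           "metformin continued patient stable"]
print(new_information_proportion(history, "patient reports new chest pain"))  # 0.8
```

A note that is pure copy-paste of earlier text scores 0.0, matching the abstract's observation that copied-forward notes carry little new information.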

  6. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, researchers have focused their efforts on dynamic comprehensive evaluation. How to make full use of subjective and objective information has become a noteworthy question. In this paper, a dynamic comprehensive evaluation method using subjective and objective information is proposed. We use a combination weighting method to determine the index weights: the analytic hierarchy process is applied to handle the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of valuing the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
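The aggregation step described here can be sketched as follows. The mixing coefficient `alpha`, the function names, and the simple linear combination rule are illustrative assumptions; the paper's exact weighting formulas are not reproduced.

```python
def dynamic_score(index_matrix, subjective_w, objective_w, time_w, alpha=0.5):
    """Dynamic comprehensive evaluation as a linear weighted average.
    Index weights mix subjective (e.g. AHP) and objective (e.g. CRITIC)
    weights via alpha; time weights favour recent periods. The names and
    the linear mixing rule are assumptions, not the paper's formulas."""
    index_w = [alpha * s + (1 - alpha) * o
               for s, o in zip(subjective_w, objective_w)]
    period_scores = [sum(w * v for w, v in zip(index_w, period))
                     for period in index_matrix]  # one score per time period
    return sum(tw * s for tw, s in zip(time_w, period_scores))

# Two indicators observed over two periods; the later period weighs more:
score = dynamic_score([[1.0, 1.0], [2.0, 2.0]],
                      subjective_w=[0.6, 0.4], objective_w=[0.4, 0.6],
                      time_w=[0.3, 0.7])
print(score)  # approximately 1.7
```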

  7. Successful Aging with Sickle Cell Disease: Using Qualitative Methods to Inform Theory.

    PubMed

    Jenerette, Coretta M; Lauderdale, Gloria

    2008-04-01

    Little is known about the lives of adults with sickle cell disease (SCD). This article reports findings from a qualitative pilot study that used life review as a method to explore influences on health outcomes among middle-aged and older adults with SCD. Six women with SCD, recruited from two urban sickle cell clinics in the U.S., engaged in semi-structured, in-depth life review interviews. MaxQDA2 software was used for qualitative data coding and analysis. Three major themes were identified: vulnerability factors, self-care management resources, and health outcomes. These themes are consistent with the Theory of Self-Care Management for Sickle Cell Disease. Identifying vulnerability factors, self-care management resources, and health outcomes in adults with SCD may aid in developing theory-based interventions to meet the health care needs of younger individuals with SCD. The life review process is a useful means to gain insight into successful aging with SCD and other chronic illnesses.

  8. Effective Methods for Studying Information Seeking and Use. Introduction and Overview.

    ERIC Educational Resources Information Center

    Wildemuth, Barbara M.

    2002-01-01

    In conjunction with the American Society for Information Science and Technology's (ASIST) annual meeting in fall 2001, the Special Interest Group on Information Needs, Seeking, and Use (SIG USE) sponsored a research symposium on "Effective Methods for Studying Information Seeking and Use." This article briefly reviews six articles presented at the…

  9. Entropy theory based multi-criteria resampling of rain gauge networks for hydrological modelling - a case study of humid area in southern China

    NASA Astrophysics Data System (ADS)

    Xu, Hongliang; Xu, Chongyu; Roar Sælthun, Nils; Zhou, Bin; Xu, Youpeng

    2014-05-01

    Rain gauge networks are used to provide estimates of area-average rainfall or point rainfall at catchment scale, and they provide the most important input for hydrological models. Due to economic, technical and other constraints, rain gauge networks are usually not dense enough, or not properly placed, to measure precipitation at the resolution and extent necessary for determining the spatial variability of rainfall. It is therefore desirable to study the effect of rain gauge distribution and to design well-distributed rain gauge networks with minimal gauge densities that provide the best possible estimates of both rainfall amount and spatial-temporal variability. Based on a dense network of 185 rain gauges in the Xiangjiang River Basin, southern China, this study applied an entropy theory based multi-criteria method, which simultaneously considers the information derived from the rainfall series, minimizes the bias of areal mean rainfall, and minimizes the information overlapped by different gauges, to resample the rain gauge networks at different gauge densities. The optimal networks are tested using two hydrological models, the lumped Xinanjiang Model and the distributed SWAT Model, in order to investigate how lumped and distributed models react to the number of rain gauges and their spatial distribution. The hydrological simulation results reveal that the performance of the lumped Xinanjiang Model using different optimized networks is stable, while the distributed SWAT Model shows a trend of improved model performance as more rain gauges are included in the simulation. The results indicate that the entropy theory based multi-criteria strategy provides a robust design of rain gauge networks, and that more stations are needed in order to realize the advantages of distributed models in hydrological simulations.
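The entropy criterion described in this record can be sketched with a greedy selection: repeatedly add the gauge whose discretised series most increases the joint entropy of the selected set, so redundant gauges are never picked. This covers only the information/redundancy criterion; the areal-mean bias criterion of the multi-criteria method is omitted, and the toy series are invented.

```python
from collections import Counter
from math import log2

def joint_entropy(series_list):
    """Shannon entropy (bits) of the joint distribution of one or more
    discretised rainfall series of equal length."""
    counts = Counter(zip(*series_list))
    n = sum(counts.values())
    return -sum((c / n) * log2(c / n) for c in counts.values())

def greedy_select(gauges, k):
    """Greedily pick the gauge whose addition maximises joint entropy,
    i.e. contributes the most non-redundant information."""
    chosen = []
    while len(chosen) < k:
        best = max((g for g in gauges if g not in chosen),
                   key=lambda g: joint_entropy(
                       [gauges[c] for c in chosen] + [gauges[g]]))
        chosen.append(best)
    return chosen

# Toy discretised series; gauge "c" duplicates "a" and is thus redundant:
gauges = {"a": [0, 1, 0, 1], "b": [0, 0, 1, 1], "c": [0, 1, 0, 1]}
print(greedy_select(gauges, 2))  # ['a', 'b']
```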

  10. 49 CFR 1135.2 - Revenue Shortfall Allocation Method: Annual State tax information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... RECOVERY PROCEDURES § 1135.2 Revenue Shortfall Allocation Method: Annual State tax information. (a) To... 49 Transportation 8 2011-10-01 2011-10-01 false Revenue Shortfall Allocation Method: Annual State tax information. 1135.2 Section 1135.2 Transportation Other Regulations Relating to Transportation...

  11. 49 CFR 1135.2 - Revenue Shortfall Allocation Method: Annual State tax information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RECOVERY PROCEDURES § 1135.2 Revenue Shortfall Allocation Method: Annual State tax information. (a) To... 49 Transportation 8 2010-10-01 2010-10-01 false Revenue Shortfall Allocation Method: Annual State tax information. 1135.2 Section 1135.2 Transportation Other Regulations Relating to Transportation...

  12. Mixed Methods Approach to Assessing an Informal Buddy Support System for Canadian Forces Reservists

    DTIC Science & Technology

    2011-04-01

    Mixed Methods Approach to Assessing an Informal Buddy Support System for Canadian Forces Reservists. Donna I. Pickering; Tara Holton. Defence R&D Canada, Technical Memorandum DRDC Toronto TM 2011-028, April 2011.

  13. A Review of Web Information Seeking Research: Considerations of Method and Foci of Interest

    ERIC Educational Resources Information Center

    Martzoukou, Konstantina

    2005-01-01

    Introduction: This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background: Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of…

  14. Static analysis of rectangular nanoplates using trigonometric shear deformation theory based on nonlocal elasticity theory

    PubMed Central

    Nami, Mohammad Rahim

    2013-01-01

    In this article, a new higher-order shear deformation theory based on trigonometric shear deformation theory is developed. In order to capture size effects, nonlocal elasticity theory is used. An analytical method is adopted to solve the governing equations for static analysis of simply supported nanoplates. In the present theory, the transverse shear stresses satisfy the traction-free boundary conditions of the rectangular plates, and these stresses can be calculated from the constitutive equations. The effects of different parameters, such as the nonlocal parameter and the aspect ratio, are investigated for both nondimensional deflections and deflection ratios. The present formulations are general and can be used for isotropic, orthotropic and anisotropic nanoplates. PMID:24455455
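In such nonlocal plate theories, the size effect typically enters through Eringen's differential constitutive relation; a standard form (consistent with, though not quoted from, this record) is:

```latex
\left(1 - \mu \nabla^{2}\right)\sigma_{ij} = C_{ijkl}\,\varepsilon_{kl},
\qquad \mu = (e_{0}a)^{2}
```

where \(\mu\) is the nonlocal parameter built from the material constant \(e_{0}\) and the internal characteristic length \(a\); setting \(\mu = 0\) recovers classical local elasticity.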

  15. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

    PubMed

    Stejskal, Taryn M

    2012-01-01

    Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family caregivers' well-being and survivors' outcomes. The idea of a continuum of care - following survivors from inpatient care to community reintegration - has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, the current research literature indicates that family intervention may be a key component of successful rehabilitation after brain injury. Yet clinicians interacting with family members and survivors often feel uncertain about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies is presented. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Third, recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT), are examined. Descriptions of common challenges families and couples face are presented, along with case examples illustrating how these theoretical frameworks might be applied to these concerns postinjury. Finally, the article concludes with an overview of the ideas presented, to assist practitioners and systems of care in community-based settings to intervene more effectively with the family system as a whole.

  16. Novel lattice Boltzmann method based on integrated edge and region information for medical image segmentation.

    PubMed

    Wen, Junling; Yan, Zhuangzhi; Jiang, Jiehui

    2014-01-01

    The lattice Boltzmann (LB) method is a mesoscopic method based on kinetic theory and statistical mechanics. The main advantage of the LB method is parallel computation, which increases the speed of calculation. In the past decade, LB methods have gradually been introduced for image processing, e.g., image segmentation. However, a major shortcoming of existing LB methods is that they can only be applied to the processing of medical images with intensity homogeneity. In practice, however, many medical images possess intensity inhomogeneity. In this study, we developed a novel LB method to integrate edge and region information for medical image segmentation. In contrast to other segmentation methods, we added edge information as a relaxing factor and used region information as a source term. The proposed method facilitates the segmentation of medical images with intensity inhomogeneity and it still allows parallel computation. Preliminary tests of the proposed method are presented in this paper.
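A generic lattice Boltzmann evolution step with a source term, of the kind this record builds on, is:

```latex
f_{i}(\mathbf{x} + \mathbf{e}_{i}\,\Delta t,\; t + \Delta t)
  = f_{i}(\mathbf{x}, t)
  + \frac{1}{\tau}\left[f_{i}^{\mathrm{eq}}(\mathbf{x}, t) - f_{i}(\mathbf{x}, t)\right]
  + \Delta t\, S_{i}(\mathbf{x}, t)
```

Per the abstract's description, the relaxation time \(\tau\) is modulated by edge information (the "relaxing factor") and the source term \(S_{i}\) carries the region information; the exact coupling is the paper's contribution and is only indicated schematically here.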

  17. Moderators of Theory-Based Interventions to Promote Physical Activity in 77 Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Bernard, Paquito; Carayol, Marion; Gourlan, Mathieu; Boiché, Julie; Romain, Ahmed Jérôme; Bortolon, Catherine; Lareyre, Olivier; Ninot, Gregory

    2017-01-01

    A meta-analysis of randomized controlled trials (RCTs) has recently showed that theory-based interventions designed to promote physical activity (PA) significantly increased PA behavior. The objective of the present study was to investigate the moderators of the efficacy of these theory-based interventions. Seventy-seven RCTs evaluating…

  18. Models for Theory-Based M.A. and Ph.D. Programs.

    ERIC Educational Resources Information Center

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  19. Dynamic stepping information process method in mobile bio-sensing computing environments.

    PubMed

    Lee, Tae-Gyu; Lee, Seong-Hoon

    2014-01-01

    Recently, interest in human longevity free from disease has been converging into one system frame, along with the development of mobile computing environments, the diversification of remote medical systems, and an aging society. Such a converged system enables the implementation of a bioinformatics system that provides various supplementary information services by sensing and gathering the health conditions and bio-information of mobile users to set up medical information. An existing bio-information system performs a static, unchanging process once the bio-information process defined at initial system configuration has been executed. However, such a static process is ineffective for a mobile bio-information system performing mobile computing. In particular, reconfiguring the bio-information process or changing its methods carries the inconvenient duty of having to define and execute everything anew. This study proposes a dynamic process design and execution method to overcome such ineffectiveness.

  20. A method for extracting task-oriented information from biological text sources.

    PubMed

    Kuttiyapillai, Dhanasekaran; Rajeswari, R

    2015-01-01

    A method for information extraction that processes unstructured data from a document collection is introduced. A dynamic programming technique, adopted to find the longest and most accurate relevant gene sequences, is used for finding matching sequences and identifying the effects of various factors. The proposed method can handle complex information sequences that take on different meanings in different situations, eliminating irrelevant information. The text contents were pre-processed using a general-purpose method and passed to an entity tagging component. Bottom-up scanning of key-value pairs improves content finding and generates sequences relevant to the testing task. This paper highlights a context-based extraction method for extracting food safety information identified from articles, guideline documents and laboratory results. The graphical disease model verifies weak components through utilisation of the development data set. This improves the accuracy of information retrieval in biological text analysis and reporting applications.
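The kind of dynamic-programming sequence matching this record alludes to can be illustrated with the classic longest-common-subsequence recurrence; the authors' exact algorithm is not specified, so this is a generic sketch of the technique, not their implementation.

```python
def longest_common_subsequence(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming LCS length:
    dp[i+1][j+1] extends the match if a[i] == b[j], otherwise takes the
    best of dropping one character from either sequence."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[m][n]

print(longest_common_subsequence("GATTACA", "GCATGCU"))  # length of an LCS
```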

  1. Novel Methods for Measuring Depth of Anesthesia by Quantifying Dominant Information Flow in Multichannel EEGs

    PubMed Central

    Choi, Byung-Moon; Noh, Gyu-Jeong

    2017-01-01

    In this paper, we propose novel methods for measuring depth of anesthesia (DOA) by quantifying dominant information flow in multichannel EEGs. Conventional methods mainly use a few EEG channels independently, and most multichannel EEG based studies are limited to specific regions of the brain, so the function of the cerebral cortex over wide brain regions is hardly reflected in DOA measurement. Here, DOA is measured by quantifying the dominant information flow obtained from a principal bipartition. Three bipartitioning methods are used to detect the dominant information flow across all EEG channels, and the dominant information flow is quantified by calculating information entropy. High correlation between the proposed measures and the plasma concentration of propofol is confirmed by experimental results on clinical data from 39 subjects. To illustrate the performance of the proposed methods more clearly, we present the results for multichannel EEG on a two-dimensional (2D) brain map. PMID:28408923

  2. Novel Methods for Measuring Depth of Anesthesia by Quantifying Dominant Information Flow in Multichannel EEGs.

    PubMed

    Cha, Kab-Mun; Choi, Byung-Moon; Noh, Gyu-Jeong; Shin, Hyun-Chool

    2017-01-01

    In this paper, we propose novel methods for measuring depth of anesthesia (DOA) by quantifying dominant information flow in multichannel EEGs. Conventional methods mainly use a few EEG channels independently, and most multichannel EEG based studies are limited to specific regions of the brain, so the function of the cerebral cortex over wide brain regions is hardly reflected in DOA measurement. Here, DOA is measured by quantifying the dominant information flow obtained from a principal bipartition. Three bipartitioning methods are used to detect the dominant information flow across all EEG channels, and the dominant information flow is quantified by calculating information entropy. High correlation between the proposed measures and the plasma concentration of propofol is confirmed by experimental results on clinical data from 39 subjects. To illustrate the performance of the proposed methods more clearly, we present the results for multichannel EEG on a two-dimensional (2D) brain map.
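The quantification step named in this record, calculating information entropy, reduces to the standard Shannon formula over a discretised signal. The flow-estimation and bipartitioning steps are the paper's contribution and are omitted; this sketch shows only the entropy calculation.

```python
from collections import Counter
from math import log2

def information_entropy(symbols):
    """Shannon entropy H = -sum(p * log2(p)) in bits of a discretised
    signal; the kind of quantity used to score the detected dominant
    information flow (flow estimation itself is not shown here)."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A maximally uncertain binary signal carries 1 bit per sample:
print(information_entropy([0, 1, 0, 1, 0, 1]))  # 1.0
```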

  3. Inter-instrumental method transfer of chiral capillary electrophoretic methods using robustness test information.

    PubMed

    De Cock, Bart; Borsuk, Agnieszka; Dejaegher, Bieke; Stiens, Johan; Mangelings, Debby; Vander Heyden, Yvan

    2014-08-01

    Capillary electrophoresis (CE) is an electrodriven separation technique that is often used for the separation of chiral molecules. Advantages of CE are its flexibility, low cost and efficiency; on the other hand, the precision and transfer of CE methods are well-known problems of the technique. Method transfer is complicated by diverse instrumental differences, such as total capillary lengths and capillary cooling systems, and by the higher response variability of CE compared to other techniques, such as liquid chromatography (HPLC). Therefore, a larger systematic change in peak resolutions, migration times and peak areas, with a loss of separation and efficiency, may be seen when a CE method is transferred to another laboratory or another type of instrument. A swift and successful method transfer is required because development and routine use of analytical methods are usually not performed in the same laboratory and/or on the same type of equipment. The aim of our study was to develop transfer rules to facilitate CE method transfers between different laboratories and instruments. In our case study, three β-blockers were chirally separated and inter-instrumental transfers were performed. The first step of our study was to optimise the precision of the chiral CE method. Next, a robustness test was performed to identify the instrumental and experimental parameters that most influenced the considered responses. The precision and robustness study results were used to adapt instrumental and/or method settings to improve transfer between different instruments. Finally, the comparison of adapted and non-adapted transfers allowed us to derive some rules to facilitate CE method transfers.

  4. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    PubMed

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. To investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  5. Information/Knowledge Acquisition Methods for Decision Support Systems and Expert Systems.

    ERIC Educational Resources Information Center

    Yang, Heng-Li

    1995-01-01

    Compares information requirement-elicitation (IRE) methods for decision support systems (DSS) with knowledge acquisition (KA) methods for expert systems (ES) development. The definition and architectures of ES and DSS are compared and the systems' development cycles and IRE/KA methods are discussed. Differences are noted between ES and DSS…

  6. Effectiveness of Visual Methods in Information Procedures for Stem Cell Recipients and Donors.

    PubMed

    Sarıtürk, Çağla; Gereklioğlu, Çiğdem; Korur, Aslı; Asma, Suheyl; Yeral, Mahmut; Solmaz, Soner; Büyükkurt, Nurhilal; Tepebası, Songül; Kozanoğlu, İlknur; Boğa, Can; Özdoğu, Hakan

    2016-07-15

    Obtaining informed consent from hematopoietic stem cell recipients and donors is a critical step in the transplantation process. Anxiety may affect their understanding of the provided information. However, use of audiovisual methods may facilitate understanding. In this prospective randomized study, we investigated the effectiveness of using an audiovisual method of providing information to patients and donors in combination with the standard model. A 10-minute informational animation was prepared for this purpose. In total, 82 participants were randomly assigned to two groups: Group 1 received the additional audiovisual information and Group 2 received standard information. A 20-item questionnaire was administered to participants at the end of the informational session. A reliability test and factor analysis showed the questionnaire was reliable and valid. For all participants, the mean overall satisfaction score was 184.8±19.8 (maximum possible score of 200). However, for satisfaction with information about written informed consent, Group 1 scored significantly higher than Group 2 (p=0.039). Satisfaction level was not affected by age, education level, or differences between the physicians conducting the informative session. This study shows that using audiovisual tools may contribute to a better understanding of the informed consent procedure and potential risks of stem cell transplantation.

  7. Change Detection Method with Spatial and Spectral Information from Deep Learning

    NASA Astrophysics Data System (ADS)

    Lyu, Haobo; Lu, Hui

    2017-04-01

    Change detection is a key application of remote sensing technology. For multi-spectral images, both the available spatial information and the spectral information are helpful for data analysis, especially for change detection tasks. However, it is difficult to learn changed features from spatial and spectral information simultaneously in one model. In this paper, we propose a new method that combines a 2-dimensional Convolutional Neural Network and a 1-dimensional Recurrent Neural Network to learn changed features. Compared with using spectral information only, the spatial information helps to overcome temporal spectral variance issues. Our method extracts the spatial difference and the spectral difference simultaneously; this change information is balanced in the final memory cell of our model, and the learned change information is exploited to characterize change features for change detection. Finally, experiments are performed on two multi-temporal datasets, and the results show superior performance in detecting changes with spatial and spectral information. Index Terms— Change detection, multi-temporal images, recurrent neural network, convolutional neural network, deep learning, spatial information, spectral information
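As a rough illustration of the two cues such a model fuses, the sketch below computes a per-pixel spectral difference and a crude spatial (neighbourhood) difference for two co-registered images; the array shapes, the 3x3 mean filter, and the simple additive fusion are stand-ins for the paper's learned CNN+RNN, not its actual architecture:

```python
import numpy as np

def change_features(img_t1, img_t2):
    """Toy spatial/spectral change features for two co-registered
    multi-spectral images of shape (H, W, B)."""
    # Spectral difference: per-pixel change of the band vector.
    spectral_diff = np.linalg.norm(img_t2 - img_t1, axis=2)

    # Spatial difference: change in local mean brightness over a 3x3
    # neighbourhood (a crude stand-in for learned 2-D convolutions).
    def local_mean(img):
        gray = img.mean(axis=2)
        padded = np.pad(gray, 1, mode="edge")
        return sum(padded[i:i + gray.shape[0], j:j + gray.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    spatial_diff = np.abs(local_mean(img_t2) - local_mean(img_t1))
    # Fuse the two cues; a trained model would learn this balance.
    return spectral_diff + spatial_diff

rng = np.random.default_rng(0)
t1 = rng.random((8, 8, 4))
t2 = t1.copy()
t2[2:5, 2:5] += 0.8            # simulate a changed region
score = change_features(t1, t2)
changed = score > score.mean()  # naive thresholding for the sketch
```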

  8. Personality and Psychopathology: a Theory-Based Revision of Eysenck’s PEN Model

    PubMed Central

    van Kampen, Dirk

    2009-01-01

    The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck’s original PEN model by repairing the various shortcomings that can be noted in Eysenck’s personality theory, particularly in relation to P or Psychoticism. Addressing three approaches that have been followed to answer the question ‘which personality factors are basic?’, arguments are listed to show that particularly the theory-informed approach, originally defended by Eysenck, may lead to scientific progress. However, also noting the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck’s theory-informed methodology, but criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), that together with the dimensions E or Extraversion and N or Neuroticism, that were retained from Eysenck’s PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions. PMID:20498694

  9. Personality and Psychopathology: a Theory-Based Revision of Eysenck's PEN Model.

    PubMed

    van Kampen, Dirk

    2009-12-08

    The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck's original PEN model by repairing the various shortcomings that can be noted in Eysenck's personality theory, particularly in relation to P or Psychoticism. Addressing three approaches that have been followed to answer the question 'which personality factors are basic?', arguments are listed to show that particularly the theory-informed approach, originally defended by Eysenck, may lead to scientific progress. However, also noting the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck's theory-informed methodology, but criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), that together with the dimensions E or Extraversion and N or Neuroticism, that were retained from Eysenck's PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions.

  10. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method reduces the dimensionality of informational indicators of situations, brings them to relative units, calculates generalized information indicators on their basis, ranks them by characteristic levels, and calculates the efficiency criterion of a system functioning in real time. An information evaluation system has been designed on this basis that allows analyzing, processing and assessing information about an object, which can be a complex technical, economic or social system. The method and the system based on it can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.
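A minimal sketch of such an evaluation pipeline might look as follows; the indicator values and weights are invented for illustration, and the min-max scaling and weighted sum are generic stand-ins for the article's (unspecified) normalisation and generalized indicators:

```python
import numpy as np

# Rows = situations to evaluate; columns = raw informational indicators
# in different physical units (all values are illustrative).
raw = np.array([
    [120.0, 0.8, 35.0],
    [ 90.0, 0.9, 20.0],
    [150.0, 0.4, 50.0],
])
weights = np.array([0.5, 0.3, 0.2])   # assumed importance weights

# Bring indicators to relative units: min-max scale each column to [0, 1].
lo, hi = raw.min(axis=0), raw.max(axis=0)
relative = (raw - lo) / (hi - lo)

# Generalized indicator per situation, then rank (higher = better).
generalized = relative @ weights
ranking = np.argsort(-generalized)
```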

  11. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  12. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, C.E.

    1990-07-31

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field. 8 figs.

  13. Basic Information for EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    Contains basic information on the role and origins of the Selected Analytical Methods including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events

  14. 78 FR 34427 - 2012 Tax Information for Use In The Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... Surface Transportation Board 2012 Tax Information for Use In The Revenue Shortfall Allocation Method... formula mistakenly compared pre-tax and after-tax revenues. In that decision, the Board stated that it.... STB, 584 F.3d 1076 (D.C. Cir. 2009). In Annual Submission of Tax Information for Use in the Revenue...

  15. Mathematical, Logical, and Formal Methods in Information Retrieval: An Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost

    2003-01-01

    Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)

  16. Theories and Methods for Research on Informal Learning and Work: Towards Cross-Fertilization

    ERIC Educational Resources Information Center

    Sawchuk, Peter H.

    2008-01-01

    The topic of informal learning and work has quickly become a staple in contemporary work and adult learning research internationally. The narrow conceptualization of work is briefly challenged before the article turns to a review of the historical origins as well as contemporary theories and methods involved in researching informal learning and…

  19. Study protocol: a randomised controlled trial of a theory-based online intervention to improve sun safety among Australian adults

    PubMed Central

    2014-01-01

    Background The effects of exposure to ultraviolet radiation are a significant concern in Australia, which has one of the highest incidences of skin cancer in the world. Despite most skin cancers being preventable by encouraging consistent adoption of sun-protective behaviours, incidence rates are not decreasing. There is a dearth of research examining the factors involved in engaging in sun-protective behaviours. Further, online multi-behavioural theory-based interventions have yet to be explored fully as a medium for improving sun-protective behaviour in adults. This paper presents the study protocol of a randomised controlled trial of an online intervention based on the Theory of Planned Behaviour (TPB) that aims to improve sun safety among Australian adults. Methods/Design Approximately 420 adults aged 18 and over and predominantly from Queensland, Australia, will be recruited and randomised to the intervention (n = 200), information only (n = 200) or the control group (n = 20). The intervention focuses on encouraging supportive attitudes and beliefs toward sun-protective behaviour, fostering perceptions of normative support for sun protection, and increasing perceptions of control/self-efficacy over sun protection. The intervention will be delivered online over a single session. Data will be collected immediately prior to the intervention (Time 1), immediately following the intervention (Time 1b), and one week (Time 2) and one month (Time 3) post-intervention. Primary outcomes are intentions to sun protect and sun-protective behaviour. Secondary outcomes are the participants’ attitudes toward sun protection, perceptions of normative support for sun protection (i.e. subjective norms, group norms, personal norms and image norms) and perceptions of control/self-efficacy toward sun protection. Discussion The study will contribute to an understanding of the effectiveness of a TPB-based online intervention to improve Australian adults’ sun-protective behaviour.

  20. An adaptive altitude information fusion method for autonomous landing processes of small unmanned aerial rotorcraft.

    PubMed

    Lei, Xusheng; Li, Jingjing

    2012-09-27

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of the altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high frequency noises in the sensor output. Furthermore, to improve altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests.
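As a sketch of the adaptive idea, the toy filter below runs a scalar Kalman filter on noisy altitude samples and re-estimates the measurement-noise variance R from recent innovations; the innovation-averaging rule is a simple stand-in for the paper's maximum a posteriori estimator, and all numbers are illustrative:

```python
import numpy as np

def adaptive_kf_altitude(z, q=1e-3, r0=1.0, window=10):
    """Scalar random-walk Kalman filter with an innovation-based
    estimate of the measurement-noise variance R."""
    x, p, r = z[0], 1.0, r0
    innovations, estimates = [], []
    for zk in z[1:]:
        p += q                       # predict (random-walk altitude model)
        nu = zk - x                  # innovation
        innovations.append(nu)
        if len(innovations) >= window:
            # For this model E[nu^2] = P + R, so estimate R from the
            # mean squared innovation over a sliding window.
            c = np.mean(np.square(innovations[-window:]))
            r = max(c - p, 1e-6)
        k = p / (p + r)              # Kalman gain
        x += k * nu                  # measurement update
        p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates), r

rng = np.random.default_rng(1)
true_alt = 50.0
z = true_alt + rng.normal(0.0, 2.0, size=400)  # noisy altimeter samples
est, r_hat = adaptive_kf_altitude(z)
```

With the simulated noise variance of 4.0, the adapted `r_hat` settles near the true value without being supplied to the filter.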

  1. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of the altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high frequency noises in the sensor output. Furthermore, to improve altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993

  2. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  3. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, C.E.; McKinney, I.D.

    1988-05-31

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk. 10 figs.

  4. Redox potentials and pKa for benzoquinone from density functional theory based molecular dynamics.

    PubMed

    Cheng, Jun; Sulpizi, Marialore; Sprik, Michiel

    2009-10-21

    The density functional theory based molecular dynamics (DFTMD) method for the computation of redox free energies presented in previous publications and the more recent modification for computation of acidity constants are reviewed. The method uses a half reaction scheme based on reversible insertion/removal of electrons and protons. The proton insertion is assisted by restraining potentials acting as chaperones. The procedure for relating the calculated deprotonation free energies to Brønsted acidities (pK(a)) and the oxidation free energies to electrode potentials with respect to the normal hydrogen electrode is discussed in some detail. The method is validated in an application to the reduction of aqueous 1,4-benzoquinone. The conversion of hydroquinone to quinone can take place via a number of alternative pathways consisting of combinations of acid dissociations, oxidations, or dehydrogenations. The free energy changes of all elementary steps (ten in total) are computed. The accuracy of the calculations is assessed by comparing the energies of different pathways for the same reaction (Hess's law) and by comparison to experiment. This two-sided test enables us to separate the errors related with the restrictions on length and time scales accessible to DFTMD from the errors introduced by the DFT approximation. It is found that the DFT approximation is the main source of error for oxidation free energies.
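The Hess's-law consistency test mentioned above can be made concrete with a toy cycle: the summed free-energy changes along two alternative pathways from hydroquinone to quinone must agree. The step labels and values below are hypothetical placeholders, not the paper's computed DFTMD numbers:

```python
# Hess's law check for a thermodynamic cycle: the overall free-energy
# change of H2Q -> Q + 2H+ + 2e- must be the same along any pathway of
# elementary steps. Step values are hypothetical placeholders (eV).
path_a = [  # deprotonate twice, then oxidize twice
    ("H2Q  -> HQ- + H+",  0.60),
    ("HQ-  -> Q2- + H+",  0.75),
    ("Q2-  -> Q-  + e-", -0.10),
    ("Q-   -> Q   + e-",  0.05),
]
path_b = [  # alternate ordering: oxidation and deprotonation interleaved
    ("H2Q  -> H2Q+ + e-",  0.90),
    ("H2Q+ -> HQ   + H+", -0.20),
    ("HQ   -> HQ+  + e-",  0.45),
    ("HQ+  -> Q    + H+",  0.15),
]

dg_a = sum(dg for _, dg in path_a)
dg_b = sum(dg for _, dg in path_b)
# In an exact calculation dg_a == dg_b; in practice the residual
# measures the sampling/DFT error the abstract discusses.
hess_residual = dg_a - dg_b
```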

  5. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    PubMed

    Williamson, Ross S; Sahani, Maneesh; Pillow, Jonathan W

    2015-04-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
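The claimed equivalence can be checked numerically for a discrete stimulus: when the per-stimulus rates are set to their empirical means, the Poisson log-likelihood per spike (relative to a constant-rate model) equals the empirical single-spike information. The simulation below is illustrative (stimulus values, rates and bin counts are assumptions, not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 20000                          # time bins
x = rng.integers(0, 4, size=T)     # discrete stimulus value per bin
rates = np.array([0.02, 0.05, 0.10, 0.20])[x]  # true rate per bin
n = rng.poisson(rates)             # simulated spike counts

n_sp = n.sum()
rbar = n_sp / T                    # mean spike count per bin

# "Model" rate: empirical mean count for each stimulus value.
r_hat = np.array([n[x == s].mean() for s in range(4)])[x]

# Empirical single-spike information (bits/spike).
ss_info = (n * np.log2(r_hat / rbar)).sum() / n_sp

# Poisson log-likelihood per spike, relative to a constant-rate model.
ll_model = (n * np.log(r_hat) - r_hat).sum()
ll_null = (n * np.log(rbar) - rbar).sum()
ll_per_spike_bits = (ll_model - ll_null) / n_sp / np.log(2)
```

Because the fitted rates match the empirical means, the rate-sum terms cancel and the two quantities agree exactly, which is the equivalence the paper formalizes.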

  6. Assessing Bayesian model averaging uncertainty of groundwater modeling based on information entropy method

    NASA Astrophysics Data System (ADS)

    Zeng, Xiankui; Wu, Jichun; Wang, Dong; Zhu, Xiaobin; Long, Yuqiao

    2016-07-01

    Because of groundwater conceptualization uncertainty, multi-model methods are usually used and the corresponding uncertainties are estimated by integrating Markov chain Monte Carlo (MCMC) and Bayesian model averaging (BMA) methods. Generally, the variance method is used to measure the uncertainties of BMA prediction. The total variance of ensemble prediction is decomposed into within-model and between-model variances, which represent the uncertainties derived from parameters and conceptual models, respectively. However, the uncertainty of a probability distribution cannot be comprehensively quantified by variance alone. A new measuring method based on information entropy theory is proposed in this study. Because the actual BMA process can hardly meet the ideal mutually exclusive, collectively exhaustive condition, BMA predictive uncertainty can be decomposed into parameter, conceptual model, and overlapped uncertainties. Overlapped uncertainty is induced by the combination of predictions from correlated model structures. In this paper, five simple analytical functions are first used to illustrate the feasibility of the variance and information entropy methods. A discrete distribution example shows that information entropy can be more appropriate for describing between-model uncertainty than variance. Two continuous distribution examples show that the two methods are consistent in measuring a normal distribution, and that information entropy is more appropriate for describing a bimodal distribution than variance. The two examples of BMA uncertainty decomposition demonstrate that the two methods are relatively consistent in assessing the uncertainty of unimodal BMA predictions, while information entropy is more informative in describing the uncertainty decomposition of bimodal BMA predictions. Then, based on a synthetic groundwater model, the variance and information entropy methods are used to assess the BMA uncertainty of groundwater modeling. The uncertainty assessments of…
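The abstract's central point, that variance and entropy can rank uncertainty differently for bimodal distributions, is easy to reproduce for discrete distributions; the two example distributions below are invented for illustration:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p > 0
    return -(p[nz] * np.log2(p[nz])).sum()

support = np.arange(-3, 4)            # values {-3, ..., 3}
# Unimodal: mass concentrated near 0; bimodal: mass split at +/-3.
unimodal = np.array([0.0, 0.05, 0.2, 0.5, 0.2, 0.05, 0.0])
bimodal  = np.array([0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5])

var_uni = np.sum(unimodal * support**2) - np.sum(unimodal * support)**2
var_bi  = np.sum(bimodal  * support**2) - np.sum(bimodal  * support)**2
h_uni, h_bi = entropy_bits(unimodal), entropy_bits(bimodal)
```

Variance calls the two-spike bimodal distribution far more uncertain (9 vs 0.8), while entropy says it carries less uncertainty (1.0 vs about 1.86 bits), illustrating why entropy can be the more appropriate descriptor for bimodal BMA predictions.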

  7. A beacon configuration optimization method based on Fisher information for Mars atmospheric entry

    NASA Astrophysics Data System (ADS)

    Zhao, Zeduan; Yu, Zhengshi; Cui, Pingyuan

    2017-04-01

    The navigation capability of the proposed Mars network based entry navigation system is directly related to the beacon number and the relative configuration between the beacons and the entry vehicle. In this paper, a new beacon configuration optimization method is developed based on Fisher information theory, and this method is suitable for any number of visible beacons. The proposed method can be used for navigation schemes based on range measurements provided by radio transceivers or other sensors for Mars entry. The observability of a specific state is defined as its Fisher information based on the observation model. The overall navigation capability is improved by maximizing the minimum average Fisher information, even though the navigation system is not fully observed. In addition, when there is only one beacon capable of entry navigation and the observation information is relatively limited, the optimization method can be modified to maximize the Fisher information of the specific state which may be preferred by the guidance and control system to improve its estimation accuracy. Finally, navigation scenarios consisting of 1-3 beacons are tested to validate the effectiveness of the developed optimization method. The extended Kalman filter (EKF) is employed to derive the state estimation error covariance. The results also show that the zero-Fisher-information situation should be avoided, especially when the dynamic system is highly nonlinear and the states change dramatically.
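For range-only measurements with Gaussian noise, the Fisher information matrix for position is a sum of outer products of unit line-of-sight vectors, and a configuration's weakest direction is the matrix's minimum eigenvalue. The sketch below (2-D geometry and beacon positions are illustrative) shows why collinear beacons produce the zero-Fisher-information situation the abstract warns about:

```python
import numpy as np

def range_fim(pos, beacons, sigma=1.0):
    """Fisher information matrix for estimating a 2-D position from
    range measurements to known beacons with Gaussian noise sigma."""
    J = np.zeros((2, 2))
    for b in beacons:
        d = pos - b
        u = d / np.linalg.norm(d)        # unit line-of-sight vector
        J += np.outer(u, u) / sigma**2
    return J

vehicle = np.array([0.0, 0.0])
# Two candidate two-beacon configurations (positions are illustrative).
collinear = [np.array([10.0, 0.0]), np.array([20.0, 0.0])]
orthogonal = [np.array([10.0, 0.0]), np.array([0.0, 10.0])]

# Score each configuration by its weakest direction (min eigenvalue),
# mirroring the max-min Fisher information criterion in the abstract.
obs_collinear = np.linalg.eigvalsh(range_fim(vehicle, collinear)).min()
obs_orthogonal = np.linalg.eigvalsh(range_fim(vehicle, orthogonal)).min()
```

Collinear beacons give zero information about the cross-track direction, so the max-min criterion prefers the spread-out geometry.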

  8. Monetary valuation of informal care: the well-being valuation method.

    PubMed

    van den Berg, Bernard; Ferrer-I-Carbonell, Ada

    2007-11-01

    This paper estimates the monetary value of providing informal care by means of a well-being valuation method. This is done by assessing the compensating variation necessary to maintain the same level of well-being after an informal caregiver provides an extra hour of informal care. The informal caregiver's well-being is proxied by the answer to two subjective well-being questions that were posed in a questionnaire answered by 865 Dutch informal caregivers between the end of 2001 and the beginning of 2002. In the econometric analysis, a distinction is made between the care recipients who are and the ones who are not a family member of the informal caregiver. The results indicate that an extra hour of informal care is worth about 9 or 10 Euros. This equals 8 or 9 Euros if the care recipient is a family member and about 7 or 9 Euros if not. When applying the contingent valuation method to the same sample, the value obtained was 10.52 Euros per hour. This paper concludes that the well-being valuation method is a useful complement to the more traditional valuation methods in the health economics literature in general and more particularly for the economic valuation of informal care: it includes all costs and effects associated with providing care from the perspective of the informal caregiver, it is relatively cheap to implement, and it offers an additional possibility to determine the convergent validity of the different monetary valuation methods. (c) 2007 John Wiley & Sons, Ltd.
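The compensating-variation logic can be sketched with a hypothetical linear well-being model; the coefficients and income below are invented placeholders, not the paper's estimates:

```python
import math

# Hypothetical well-being model:
#   WB = const + b_inc * log(income) + b_care * care_hours
# The monetary value of one extra hour of informal care is the income
# increase that exactly offsets its well-being cost (the compensating
# variation). All numbers are illustrative placeholders.
b_inc = 0.50        # effect of log-income on well-being
b_care = -0.003     # effect of one extra care hour on well-being
income = 1500.0     # caregiver income in euros

# Holding WB constant: b_inc * d(log income) + b_care * d(hours) = 0.
# For d(hours) = 1, d(log income) = -b_care / b_inc, so:
value_per_hour = income * (math.exp(-b_care / b_inc) - 1.0)
```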

  9. Literary pedagogy in nursing: a theory-based perspective.

    PubMed

    Sakalys, Jurate A

    2002-09-01

    Using fictional and autobiographical literature in nursing education is a primary way of understanding patients' lived experiences and fostering development of essential relational and reflective thinking skills. Application of literary theory to this pedagogic practice can expand conceptualization of teaching goals, inform specific teaching strategies, and potentially contribute to socially consequential educational outcomes. This article describes a theoretical schema that focuses on pedagogical goals in terms of the three related skills (i.e., reading, interpretation, criticism) of textual competence.

  10. Research on Methods of Processing Transit IC Card Information and Constructing Transit OD Matrix

    NASA Astrophysics Data System (ADS)

    Han, Xiuhua; Li, Jin; Peng, Han

    A transit OD matrix is of vital importance when planning an urban transit system. The traditional method of constructing a transit OD matrix needs a large-scale spot-check survey; it is expensive and has a long information-processing cycle. Recently, transit IC card charging systems have been widely applied in big cities. If processed reasonably, the transit passenger information stored in an IC card database can become an information resource and greatly reduce survey costs. The concept of the transit trip chain is put forward in this paper. Based on the characteristics of a closed transit trip chain, the paper discusses how to process IC card information and construct a transit OD matrix. It also points out that an urban transit information platform and data warehouse should be constructed, and how to integrate IC card information.
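Under a closed-trip-chain assumption, each trip's destination can be inferred as the card's next boarding stop, with the last trip of the day closing the chain back to the first stop. A minimal sketch (the card records below are invented, and real systems would also use timestamps and route matching):

```python
from collections import defaultdict

# Toy IC-card records: (card_id, boarding_stop) in time order. Alighting
# stops are not recorded by boarding-only charging systems, so each trip
# is assumed to end at the rider's next boarding stop.
taps = [
    ("card1", "A"), ("card2", "B"),
    ("card1", "C"), ("card2", "D"),
    ("card1", "A"),                     # card1 heads home
]

by_card = defaultdict(list)
for card, stop in taps:
    by_card[card].append(stop)

od = defaultdict(int)                   # (origin, dest) -> trip count
for stops in by_card.values():
    for i, origin in enumerate(stops):
        dest = stops[(i + 1) % len(stops)]   # close the trip chain
        if dest != origin:
            od[(origin, dest)] += 1
```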

  11. Research Methods and Techniques in Spanish Library and Information Science Journals (2012-2014)

    ERIC Educational Resources Information Center

    Ferran-Ferrer, Núria; Guallar, Javier; Abadal, Ernest; Server, Adan

    2017-01-01

    Introduction. This study examines the research methods and techniques used in Spanish journals of library and information science, the topics addressed by papers in these journals and their authorship affiliation. Method. The researchers selected 580 papers published in the top seven Spanish LIS journals indexed in Web of Science and Scopus and…

  12. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  13. Evaluation of Television as a Method of Disseminating Solar Energy Information.

    ERIC Educational Resources Information Center

    Edington, Everett D.; And Others

    This project included three separate studies undertaken to determine the effectiveness of television instruction as a method of effectively delivering information about solar energy systems to present and future workers in related industries, and as a method of delivery for adult continuing education instruction. All three studies used a series of…

  14. Visual Methods and Quality in Information Behaviour Research: The Cases of Photovoice and Mental Mapping

    ERIC Educational Resources Information Center

    Cox, Andrew; Benson, Melanie

    2017-01-01

    Introduction: The purpose of the paper is to explore the ways in which visual methods can increase the quality of qualitative information behaviour research. Methods: The paper examines Tracy's framework of eight criteria for research quality: worthy topic, rich rigour, sincerity, credibility, resonance, significant contribution, ethical issues…

  16. A Method for the Analysis of Information Use in Source-Based Writing

    ERIC Educational Resources Information Center

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  17. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    ERIC Educational Resources Information Center

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  19. Mixed Methods Research of Adult Family Care Home Residents and Informal Caregivers

    ERIC Educational Resources Information Center

    Jeanty, Guy C.; Hibel, James

    2011-01-01

    This article describes a mixed methods approach used to explore the experiences of adult family care home (AFCH) residents and informal caregivers (IC). A rationale is presented for using a mixed methods approach employing the sequential exploratory design with this poorly researched population. The unique challenges attendant to the sampling…

  20. A Qualitative Study about Performance Based Assessment Methods Used in Information Technologies Lessons

    ERIC Educational Resources Information Center

    Daghan, Gökhan; Akkoyunlu, Buket

    2014-01-01

    In this study, Information Technologies teachers' views on, and use of, performance based assessment methods (PBAMs) are examined. The aim is to find out which of the PBAMs are used frequently or not used, the reasons these methods are preferred, and opinions about their applicability. The study is designed with the phenomenological design, which…

  2. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game theoretic approach for achieving trust at the bootload level from the perception of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the problems of cold start and whitewashing are addressed by the proposed method. In addition, appropriate mapping of cloud users' applications to cloud service providers for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics like execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365
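The record above describes Nash-equilibrium-based trust evaluation only at a high level. As a hedged illustration of the underlying concept, the sketch below finds a pure-strategy Nash equilibrium via best-response dynamics in a small two-player game; the payoff matrices are hypothetical and not taken from the paper.

```python
import numpy as np

def best_response_dynamics(A, B, max_iters=100):
    """Iterate pure-strategy best responses in a bimatrix game.

    A[i, j] is the row player's payoff, B[i, j] the column player's.
    Returns a pure Nash equilibrium (i, j) if the dynamics converge.
    """
    i, j = 0, 0
    for _ in range(max_iters):
        i_new = int(np.argmax(A[:, j]))      # row player's best response to j
        j_new = int(np.argmax(B[i_new, :]))  # column player's best response
        if (i_new, j_new) == (i, j):
            return i, j  # mutual best responses: a pure Nash equilibrium
        i, j = i_new, j_new
    return None  # no convergence (e.g., only mixed equilibria exist)

# Hypothetical payoffs: user trusts vs defects (rows), provider honours the
# SLA vs shirks (columns) -- a prisoner's-dilemma-style structure.
A = np.array([[3, 0], [5, 1]])  # user payoffs
B = np.array([[3, 5], [0, 1]])  # provider payoffs
print(best_response_dynamics(A, B))  # -> (1, 1): mutual defection is the equilibrium
```

In the paper's setting, an equilibrium like this would motivate SLA penalties that reshape the payoffs so that honest behaviour becomes the stable outcome.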

  3. An Information Theory-Based Approach to Assessing the Sustainability and Stability of an Island System

    EPA Science Inventory

    It is well-documented that a sustainable system is based on environmental stewardship, economic viability and social equity. What is often overlooked is the need for continuity such that desirable system behavior is maintained with mechanisms in place that facilitate the ability ...

  4. The Effect of Software Reusability on Information Theory Based Software Metrics

    DTIC Science & Technology

    1990-01-01

    [KERNI78] Kernighan, Brian W. and Ritchie, Dennis M., The C Programming Language, Prentice-Hall, Englewood Cliffs, NJ, 1978. [KERNI84] Kernighan, Brian W., "The UNIX System and Software Reusability," IEEE Transactions on Software Engineering, Vol. SE-10, No. 5, September 1984, pp. 513-518.

  6. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during healthcare information system integration, project participants usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that uses business process model and notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.

  7. Decision Aid Use in Primary Care: An Overview and Theory-Based Framework.

    PubMed

    Shultz, Cameron G; Jimbo, Masahito

    2015-10-01

    Increasing patients' participation in health care is a commonly cited goal. While patient decision aids can promote participation, they remain underutilized. Theory-based models that assess barriers and facilitators to sustained decision aid use are needed. The ready, willing, and able model specifies three preconditions for behavioral change. We present a descriptive analysis of the uptake of patient decision aids in the primary care setting and show how the ready, willing, and able model can be used to identify potential barriers and facilitators. An Ovid Medline literature search from January 2004 to November 2014 was used; additional sources were identified from reference lists and through peer consultations. Barriers and facilitators to decision aid use were identified and grouped into salient themes. The ready, willing, and able model provided a simple yet practical framework for identifying the mechanisms that facilitate (or work against) the adoption of patient decision aids within primary care. While time was a prominent barrier, additional barriers such as perceived legitimacy, clinic capacity, processes of care, and the overarching health care environment were also noted. The ready, willing, and able model posits that several preconditions must first be satisfied before sustained use of patient decision aids can take hold. By pinpointing bottlenecks, the model can inform policies and tailored interventions to target identified problems. Using the model to troubleshoot for bottlenecks prior to the implementation of a decision aid could help to improve uptake and sustained use within the primary care setting.

  8. Game theory-based mode cooperative selection mechanism for device-to-device visible light communication

    NASA Astrophysics Data System (ADS)

    Liu, Yuxin; Huang, Zhitong; Li, Wei; Ji, Yuefeng

    2016-03-01

    Various patterns of device-to-device (D2D) communication, from Bluetooth to Wi-Fi Direct, are emerging due to the increasing requirements of information sharing between mobile terminals. This paper presents an innovative pattern named device-to-device visible light communication (D2D-VLC) to alleviate the growing traffic problem. However, the occlusion problem is a difficulty in D2D-VLC. This paper proposes a game theory-based solution in which the best-response dynamics and best-response strategies are used to realize a mode-cooperative selection mechanism. This mechanism uses system capacity as the utility function to optimize system performance and selects the optimal communication mode for each active user from three candidate modes. Moreover, the simulation and experimental results show that the mechanism can attain a significant improvement in terms of effectiveness and energy saving compared with the cases where the users communicate via only the fixed transceivers (light-emitting diode and photo diode) or via only D2D.

  9. Development and validation of a theory-based multimedia application for educating Persian patients on hemodialysis.

    PubMed

    Feizalahzadeh, Hossein; Tafreshi, Mansoureh Zagheri; Moghaddasi, Hamid; Farahani, Mansoureh A; Khosrovshahi, Hamid Tayebi; Zareh, Zahra; Mortazavi, Fakhrsadat

    2014-05-01

    Although patients on hemodialysis require effective education for self-care, several issues associated with the process raise barriers that make learning difficult. Computer-based education can reduce these problems and improve the quality of education. This study aims to develop and validate a theory-based multimedia application to educate Persian patients on hemodialysis. The study consisted of five phases: (1) content development, (2) prototype development 1, (3) evaluation by users, (4) evaluation by a multidisciplinary group of experts, and (5) prototype development 2. Data were collected through interviews and a literature review with open-ended questions and two survey forms that used a five-level scale. Patients' hemodialysis self-care needs and the related content were categorized into seven sections: kidney function and failure, hemodialysis, vascular access, nutrition, medication, physical activity, and living with hemodialysis. The application comprises seven modules consisting of user-controlled small multimedia units. While navigating the application, users receive step-by-step information on self-care. Favorable scores were obtained from evaluations by users and experts. The researchers concluded that this application can facilitate the hemodialysis education and learning process for patients by focusing on their self-care needs using multimedia design principles.

  10. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

    PubMed

    Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

    2012-09-01

    Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change.

  11. Change of Brain Functional Connectivity in Patients With Spinal Cord Injury: Graph Theory Based Approach.

    PubMed

    Min, Yu-Sun; Chang, Yongmin; Park, Jang Woo; Lee, Jong-Min; Cha, Jungho; Yang, Jin-Ju; Kim, Chul-Hyun; Hwang, Jong-Moon; Yoo, Ji-Na; Jung, Tae-Du

    2015-06-01

    To investigate the global functional reorganization of the brain following spinal cord injury with a graph theory based approach, by creating whole-brain functional connectivity networks from resting-state functional magnetic resonance imaging (rs-fMRI), characterizing the reorganization of these networks using graph theoretical metrics, and comparing these metrics between patients with spinal cord injury (SCI) and age-matched controls. Twenty patients with incomplete cervical SCI (14 males, 6 females; age, 55±14.1 years) and 20 healthy subjects (10 males, 10 females; age, 52.9±13.6 years) participated in this study. To analyze the characteristics of the whole-brain network constructed with functional connectivity from rs-fMRI, graph theoretical measures were calculated, including clustering coefficient, characteristic path length, global efficiency, and small-worldness. Clustering coefficient, global efficiency, and small-worldness did not show any difference between controls and SCI patients across all density ranges. The characteristic path length normalized to a random network was higher in SCI patients than in controls and reached statistical significance at 12%-13% density (p<0.05, uncorrected). The graph theoretical approach to brain functional connectivity might help reveal how information processing changes after SCI. These findings imply that patients with SCI can build on preserved, competent brain control. Further analyses, such as topological rearrangement and hub region identification, will be needed for a better understanding of neuroplasticity in patients with SCI.
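The graph metrics named in this record (clustering coefficient, characteristic path length, global efficiency, small-worldness) are standard measures that can be computed with networkx. The sketch below uses a toy small-world graph as a stand-in for the thresholded connectivity networks described in the abstract; the graph parameters are invented for illustration.

```python
import networkx as nx

# Toy stand-in for a thresholded functional connectivity network: in the study,
# nodes are brain regions and edges come from rs-fMRI correlations at a given
# edge density; here a small-world graph is generated instead.
G = nx.connected_watts_strogatz_graph(n=30, k=4, p=0.1, seed=1)

clustering = nx.average_clustering(G)          # clustering coefficient
path_len = nx.average_shortest_path_length(G)  # characteristic path length
efficiency = nx.global_efficiency(G)           # global efficiency

# Small-worldness is usually the clustering / path-length ratio normalised by
# an equivalent random graph (networkx also provides nx.sigma for this).
print(f"C={clustering:.3f}, L={path_len:.3f}, Eglob={efficiency:.3f}")
```

Comparing such metrics between patient and control networks, density by density, is exactly the kind of analysis the abstract describes.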

  12. Interconnected but underprotected? Parents' methods and motivations for information seeking on digital safety issues.

    PubMed

    Davis, Vauna

    2012-12-01

    Parents need information and skills to meet the demands of mediating connected technology in their homes. Parents' methods and motivations for learning to protect children from digital risks were reported through a survey. This study explores relationships between information seeking, parents' concerns, risks children have experienced, and access to connected devices, in addition to the use and satisfaction of various digital safety resources. Three types of information-seeking behavior were identified: (a) protective information seeking, to protect children from being confronted with harmful content; (b) problem-solving information seeking, to help children who have been negatively affected by connected technology; and (c) attentive learning, by attending to media resources passively encountered on this topic. Friends and family are the dominant source of digital safety information, followed by presentations and the Internet. Parents' top concerns for their children using connected technology were accidental exposure to pornography, and sexual content in Internet-based entertainment. Higher numbers of risks experienced by children were positively associated with parents' problem-solving information seeking and level of attentive learning. Parents who were more concerned exhibited more problem-solving information seeking; but despite the high level of concern for children's safety online, 65 percent of parents seek information on this subject less than twice per year. Children have access to a mean of five connected devices at home; a higher number of devices was correlated with increased risks experienced by children, but was not associated with increased concern or information seeking from parents.

  13. Identifying physical activity information needs and preferred methods of delivery of people with multiple sclerosis.

    PubMed

    Sweet, Shane N; Perrier, Marie-Josée; Podzyhun, Christine; Latimer-Cheung, Amy E

    2013-01-01

    The purpose of this study was to examine the preferred sources and methods for acquiring physical activity information among individuals with multiple sclerosis (MS), using the Comprehensive Model of Information Seeking. A secondary objective was to explore the barriers and facilitators to physical activity information seeking. Twenty-one participants diagnosed with MS participated in focus groups or telephone interviews. A direct content analysis of the transcripts revealed that individuals generally preferred receiving physical activity information during periods of relapse and remission. Participants also had positive beliefs toward physical activity and a clear preference for a time when physical activity messages would be relevant. Receiving physical activity information from credible sources such as the MS Society of Canada, healthcare professionals, and peers with MS was also deemed important. The Internet was a preferred source due to its accessibility, but it was often considered to lack credibility. The lack of physical activity information specific to MS is the greatest barrier for individuals with MS to learn about physical activity. Healthcare professionals, national MS societies, and peers should work together to deliver specific and relevant physical activity messages to the MS population. People with MS want more physical activity information from credible sources. Multiple vehicles of physical activity information delivery (i.e. healthcare providers, peers, MS Society) should be utilized. Physical activity information should be tailored to the individual with MS.

  14. Model, properties and imputation method of missing SNP genotype data utilizing mutual information

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Wan, Weiming; Wang, Rui-Sheng; Feng, Enmin

    2009-07-01

    Mutual information can be used as a measure for the association of a genetic marker or a combination of markers with the phenotype. In this paper, we study the imputation of missing genotype data. We first utilize joint mutual information to compute the dependence between SNP sites, then construct a mathematical model in order to find the two SNP sites having maximal dependence with missing SNP sites, and further study the properties of this model. Finally, an extension method to haplotype-based imputation is proposed to impute the missing values in genotype data. To verify our method, extensive experiments have been performed, and numerical results show that our method is superior to haplotype-based imputation methods. At the same time, numerical results also prove joint mutual information can better measure the dependence between SNP sites. According to experimental results, we also conclude that the dependence between the adjacent SNP sites is not necessarily strongest.
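As a hedged illustration of the core quantity in this record, the snippet below estimates empirical mutual information between discrete SNP-like sequences and confirms that a dependent site carries more information about a target site than an independent one. The 0/1/2 genotype coding and the data are invented for the example, not taken from the paper.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

# Hypothetical genotype data: one value per individual at each SNP site,
# coded 0/1/2 (copies of the minor allele).
rng = np.random.default_rng(0)
site0 = rng.integers(0, 3, 200)
site1 = site0.copy()             # perfectly dependent on site0
site2 = rng.integers(0, 3, 200)  # independent of site0

# An imputation model of this kind selects the sites with maximal
# dependence on the site whose genotypes are missing.
print(mutual_information(site0, site1) > mutual_information(site0, site2))  # True
```

The paper's method extends this idea to *joint* mutual information over pairs of candidate sites, then imputes the missing genotype from the two most informative sites.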

  15. Development and Validation of a Theory Based Screening Process for Suicide Risk

    DTIC Science & Technology

    2014-09-01

    Award Number: W81XWH-11-1-0588. Title: Development and Validation of a Theory Based Screening Process for Suicide Risk. Distribution Unlimited. Abstract: The ultimate objective of this study is to assist in increasing the capacity of military-based

  16. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions.

  17. A Privacy-Preserved Analytical Method for eHealth Database with Minimized Information Loss

    PubMed Central

    Chen, Ya-Ling; Cheng, Bo-Chao; Chen, Hsueh-Lin; Lin, Chia-I; Liao, Guo-Tan; Hou, Bo-Yu; Hsu, Shih-Chun

    2012-01-01

    Digitizing medical information is an emerging trend that employs information and communication technology (ICT) to manage health records, diagnostic reports, and other medical data more effectively, in order to improve the overall quality of medical services. However, medical information is highly confidential and involves private information; even legitimate access to data raises privacy concerns. Medical records provide health information on an as-needed basis for diagnosis and treatment, and the information is also important for medical research and other health management applications. Traditional privacy risk management systems have focused on reducing re-identification risk, and they do not consider information loss. In addition, such systems cannot identify and isolate data that carries high risk of privacy violations. This paper proposes the Hiatus Tailor (HT) system, which ensures low re-identification risk for medical records, while providing more authenticated information to database users and identifying high-risk data in the database for better system management. The experimental results demonstrate that the HT system achieves much lower information loss than traditional risk management methods, with the same risk of re-identification. PMID:22969273

  18. Component 1: Current and Future Methods for Representing and Interacting with Qualitative Geographic Information

    DTIC Science & Technology

    2011-10-26

    Contract Number: W9132V-11-P-0010. Authors: Alexander Savelyev, Anthony C. Robinson, and Alan M.… References include: Display Spatializations, Cartography and Geographic Information Science, 31, 237-252; Fang, S., M. Lwin & P. Ebright (2006), Visualization of unstructured… Title: New Methods for Representing and Interacting with Qualitative Geographic Information. Contract Period: May 23, 2011

  19. [Review of data quality dimensions and applied methods in the evaluation of health information systems].

    PubMed

    Lima, Claudia Risso de Araujo; Schramm, Joyce Mendes de Andrade; Coeli, Claudia Medina; da Silva, Márcia Elizabeth Marinho

    2009-10-01

    In Brazil, quality monitoring of data from the various health information systems does not follow a regular evaluation plan. This paper reviews quality evaluation initiatives related to the Brazilian information systems, identifying the selected quality dimensions and the method employed. The SciELO and LILACS databases were searched, as were the bibliographical references from articles identified in the search. 375 articles were initially identified, leaving a final total of 78 after exclusions. The four most frequent dimensions in articles totaled approximately 90% of the analyses. The studies prioritized certain quality dimensions: reliability, validity, coverage, and completeness. Half of the studies were limited to data from Rio de Janeiro and São Paulo. The limited number of studies on some systems and their unequal distribution between regions of the country hinder a comprehensive quality assessment of Brazil's health information systems. The importance of accurate information highlights the need to implement a data management policy for health information systems in Brazil.

  20. A method of building information extraction based on mathematical morphology and multiscale

    NASA Astrophysics Data System (ADS)

    Li, Jing-wen; Wang, Ke; Zhang, Zi-ping; Xue, Long-li; Yin, Shou-qiang; Zhou, Song

    2015-12-01

    To monitor changes in buildings on the Earth's surface, this paper analyzes the distribution characteristics of buildings in remote sensing imagery and, combining the advantages of multi-scale image segmentation and mathematical morphology, proposes a multi-scale, mathematical-morphology segmentation method for high-resolution remote sensing images. It then uses a multiple fuzzy classification method and a shadow-based auxiliary method to extract building information. Compared with k-means classification and the traditional maximum likelihood classification method, experimental results show that the proposed object-based segmentation and extraction method can accurately extract building structure information and produce clearer classification data, providing a basis and theoretical support for intelligent Earth-observation monitoring.
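The abstract's use of mathematical morphology can be illustrated with a minimal, hypothetical example: a binary "building" mask with speckle noise is cleaned by morphological opening. The image and parameters below are invented for the sketch; the paper's actual pipeline also involves multi-scale segmentation and fuzzy classification, which are not reproduced here.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary classification map: True = "building" pixels, plus salt noise.
rng = np.random.default_rng(0)
img = np.zeros((60, 60), dtype=bool)
img[10:30, 15:40] = True  # a rectangular building footprint
noise = rng.random((60, 60)) < 0.02
img_noisy = img | noise

# Morphological opening (erosion then dilation) removes speckle smaller than the
# structuring element while preserving the large building region.
opened = ndimage.binary_opening(img_noisy, structure=np.ones((3, 3)))
labels, n_objects = ndimage.label(opened)
print(n_objects)  # the isolated noise pixels are gone; the footprint remains
```

With larger structuring elements (or opening at several scales), the same idea separates building-sized regions from both noise and larger background structures.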

  1. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

    PubMed Central

    Voth, Elizabeth C; Oelke, Nelly D

    2016-01-01

    Background: Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective: The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods: A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results: Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions: The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps.

  2. Measuring information flow in cellular networks by the systems biology method through microarray data

    PubMed Central

    Chen, Bor-Sen; Li, Cheng-Wei

    2015-01-01

    In general, it is very difficult to measure the information flow in a cellular network directly. In this study, based on an information flow model and microarray data, we measured the information flow in cellular networks indirectly by using a systems biology method. First, we used a recursive least square parameter estimation algorithm to identify the system parameters of coupling signal transduction pathways and the cellular gene regulatory network (GRN). Then, based on the identified parameters and systems theory, we estimated the signal transductivities of the coupling signal transduction pathways from the extracellular signals to each downstream protein and the information transductivities of the GRN between transcription factors in response to environmental events. According to the proposed method, the information flow, which is characterized by signal transductivity in coupling signaling pathways and information transductivity in the GRN, can be estimated by microarray temporal data or microarray sample data. It can also be estimated by other high-throughput data such as next-generation sequencing or proteomic data. Finally, the information flows of the signal transduction pathways and the GRN in leukemia cancer cells and non-leukemia normal cells were also measured to analyze the systematic dysfunction in this cancer from microarray sample data. The results show that the signal transductivities of signal transduction pathways change substantially from normal cells to leukemia cancer cells. PMID:26082788
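The recursive least squares (RLS) step mentioned in this abstract is a standard parameter estimation algorithm. A minimal sketch on a hypothetical linear system follows; the forgetting factor, data, and true parameter values are illustrative only, not the paper's pathway model.

```python
import numpy as np

def rls(phi_rows, y, lam=0.99, delta=100.0):
    """Standard recursive least squares for y_t ≈ phi_t · theta.

    lam is the forgetting factor; delta initialises the covariance P.
    """
    n = phi_rows.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)
    for phi, yt in zip(phi_rows, y):
        k = P @ phi / (lam + phi @ P @ phi)     # gain vector
        theta = theta + k * (yt - phi @ theta)  # prediction-error update
        P = (P - np.outer(k, phi) @ P) / lam    # covariance update
    return theta

# Hypothetical linear system with true parameters [2.0, -1.0] plus small noise.
rng = np.random.default_rng(1)
Phi = rng.normal(size=(500, 2))
y = Phi @ np.array([2.0, -1.0]) + 0.01 * rng.normal(size=500)
print(rls(Phi, y))  # close to [2.0, -1.0]
```

In the paper, the regressors are upstream signal and transcription-factor activities and the recovered parameters are the coupling strengths from which signal and information transductivities are then derived.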

  4. Developing information literacy skills in pre-registration nurses: an experimental study of teaching methods.

    PubMed

    Brettle, Alison; Raynor, Michael

    2013-02-01

To compare the effectiveness of an online information literacy tutorial with a face-to-face session for teaching information literacy skills to nurses. Randomised controlled trial. Seventy-seven first-year undergraduate pre-registration diploma nursing students. Intervention: an online in-house information literacy tutorial. Comparison: a one-hour face-to-face session, covering the same material as the intervention, delivered by the nursing subject librarian. Search histories were scored using a validated checklist covering keyword selection, Boolean operators, truncation and synonyms. Skills retention was measured at 1 month using the same checklist. Inferential statistics were used to compare search skills within and between groups pre- and post-session. The searching skills of first-year pre-registration nursing students improve following information literacy sessions (p<0.001), and remain unchanged 1 month later, regardless of teaching method. The two methods produce a comparable improvement (p=0.263). There is no improvement or degradation of skills 1 month post-session for either method (p=0.216). Nurses' information literacy skills improve after both face-to-face and online instruction. There is no skills degradation at 1 month post-intervention for either method. Copyright © 2011. Published by Elsevier Ltd.

  5. Fast-updating and nonrepeating Lissajous image reconstruction method for capturing increased dynamic information.

    PubMed

    Hoy, Christopher L; Durr, Nicholas J; Ben-Yakar, Adela

    2011-06-01

    We present a fast-updating Lissajous image reconstruction methodology that uses an increased image frame rate beyond the pattern repeat rate generally used in conventional Lissajous image reconstruction methods. The fast display rate provides increased dynamic information and reduced motion blur, as compared to conventional Lissajous reconstruction, at the cost of single-frame pixel density. Importantly, this method does not discard any information from the conventional Lissajous image reconstruction, and frames from the complete Lissajous pattern can be displayed simultaneously. We present the theoretical background for this image reconstruction methodology along with images and video taken using the algorithm in a custom-built miniaturized multiphoton microscopy system.
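The core idea — displaying chronological slices of the Lissajous trajectory faster than the full pattern repeats — can be sketched as follows. The scan frequencies, sample rate, and sub-frame count are invented for illustration and are not taken from the paper.

```python
import numpy as np
from math import gcd

def lissajous_subframes(fx, fy, fs, n_sub):
    """Sample one full Lissajous repeat at rate fs and split it into
    n_sub chronological sub-frames for fast display updates.

    Assumes integer scan frequencies fx, fy (Hz); the full pattern
    then repeats every 1/gcd(fx, fy) seconds.
    """
    T = 1.0 / gcd(fx, fy)          # repeat period of the full pattern
    n = round(T * fs)              # samples per full repeat
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * fx * t)
    y = np.sin(2 * np.pi * fy * t)
    # Each sub-frame is a contiguous slice of the trajectory, so it can
    # be displayed as soon as its samples arrive (n_sub times per repeat),
    # trading single-frame pixel density for update rate.
    return np.array_split(np.column_stack([x, y]), n_sub)

frames = lissajous_subframes(fx=15, fy=14, fs=100000, n_sub=10)
```

No samples are discarded: accumulating all n_sub sub-frames reproduces the conventional full-pattern reconstruction.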

  6. Parallel implementation of multireference coupled-cluster theories based on the reference-level parallelism

    SciTech Connect

    Brabec, Jiri; Pittner, Jiri; van Dam, Hubertus JJ; Apra, Edoardo; Kowalski, Karol

    2012-02-01

    A novel algorithm for implementing general type of multireference coupled-cluster (MRCC) theory based on the Jeziorski-Monkhorst exponential Ansatz [B. Jeziorski, H.J. Monkhorst, Phys. Rev. A 24, 1668 (1981)] is introduced. The proposed algorithm utilizes processor groups to calculate the equations for the MRCC amplitudes. In the basic formulation each processor group constructs the equations related to a specific subset of references. By flexible choice of processor groups and subset of reference-specific sufficiency conditions designated to a given group one can assure optimum utilization of available computing resources. The performance of this algorithm is illustrated on the examples of the Brillouin-Wigner and Mukherjee MRCC methods with singles and doubles (BW-MRCCSD and Mk-MRCCSD). A significant improvement in scalability and in reduction of time to solution is reported with respect to recently reported parallel implementation of the BW-MRCCSD formalism [J.Brabec, H.J.J. van Dam, K. Kowalski, J. Pittner, Chem. Phys. Lett. 514, 347 (2011)].

  7. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  8. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

    PubMed

    Cowin, Stephen C; Cardoso, Luis

    2012-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues.
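In the standard notation of mixture theory (symbols chosen here for illustration, not taken from the paper), the modified mass balance the abstract describes can be written as

```latex
\frac{\partial \rho_\alpha}{\partial t}
  + \nabla \cdot \left( \rho_\alpha \mathbf{v}_\alpha \right) = s_\alpha ,
\qquad
\sum_\alpha s_\alpha = s ,
```

where $\rho_\alpha$, $\mathbf{v}_\alpha$ and $s_\alpha$ are the apparent density, velocity and mass supply of constituent $\alpha$. Traditional mixture theory closes the mixture by requiring $s = 0$; allowing $s \neq 0$ makes the mixture itself an open system, which is the alternative route to modeling interstitial growth described above.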

  9. Reducing sedentary behavior in minority girls via a theory-based, tailored classroom media intervention

    PubMed Central

    SPRUIJT-METZ, DONNA; NGUYEN-MICHEL, SELENA T.; GORAN, MICHAEL I.; CHOU, CHIH-PING; HUANG, TERRY T-K.

    2010-01-01

Objective To develop, implement and test an innovative, theory-based classroom media intervention known as Get Moving! to increase physical activity and decrease sedentary behaviors in predominantly Latina middle school girls. Research methods and procedures School-based intervention on five to seven consecutive school days in seven schools (four intervention and three control) with high Latino populations (above 60%). Intervention schools were matched to control schools by ethnic makeup and socioeconomic status (SES). Measures conducted 3 months before and 3 months after intervention included height, weight, percentage body fat (bioimpedance analysis), physical activity and psychosocial aspects of activity by questionnaire. Subjects were middle school girls, mean age 12.5 years, 73% Latina (N=459 girls). Results Get Moving! significantly reduced time spent on sedentary behavior (β ± SE = −0.27 ± 0.14, p < 0.05) and significantly increased intrinsic motivation (β ± SE = 0.11 ± 0.05, p < 0.05). There was a trend for mediation effects of intrinsic motivation, but this did not reach significance. Discussion Get Moving! is a promising school-based approach that specifically targets physical activity and sedentary behavior in Latina girls, a population at high risk for obesity and related diseases. PMID:19023773

  10. The Effect of Theory Based Nutritional Education on Fat Intake, Weight and Blood Lipids

    PubMed Central

    Kamran, Aziz; Sharifirad, Gholamreza; Heydari, Heshmatolah; Sharifian, Elham

    2016-01-01

Introduction Though nutrition plays a key role in the control of hypertension, it is often neglected in Iranian patients' diets; in fact, dietary behavior among Iranian patients can be regarded as unsatisfactory. This study aimed to assess the effectiveness of a theory-based educational intervention on fat intake, weight, and blood lipids among rural hypertensive patients. Methods This quasi-experimental study was conducted on 138 hypertensive patients who had been referred to Ardabil rural health centers during 2014. The nutritional education, based on DASH and the Health Promotion Model (HPM), was delivered in six sessions. The pre-test and post-tests were separated by intervals of two and six months. Data were analyzed using SPSS-18 and chi-square, independent-samples t-test, paired-samples t-test and repeated-measures ANOVA. Results After the intervention, weight, dietary fat, LDL-C and total cholesterol, and systolic and diastolic blood pressure decreased significantly in the intervention group compared with the control group (p < 0.001). In contrast, HDL-C increased significantly in the intervention group. Conclusion The educational intervention, provided based on Pender's health promotion model, affected fat intake, blood lipids, and blood pressure, leading to their decrease. PMID:28163845

  11. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    PubMed Central

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  12. A feature extraction method based on information theory for fault diagnosis of reciprocating machinery.

    PubMed

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to.
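For comparison, the conventional Hilbert-transform envelope detection that the paper benchmarks against can be sketched with NumPy alone (the FFT route to the analytic signal). The resonance and defect frequencies in the toy signal are invented for illustration.

```python
import numpy as np

def envelope_spectrum(x, fs):
    """Conventional envelope analysis for bearing diagnosis: Hilbert
    envelope via the FFT-based analytic signal, then the spectrum of
    the envelope, where defect rates (outer-race, inner-race, roller)
    appear as peaks."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)                # one-sided spectral weighting
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)  # analytic signal
    env = np.abs(analytic)         # amplitude envelope
    env = env - env.mean()         # drop the DC component
    spec = np.abs(np.fft.rfft(env)) / n
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return freqs, spec

# Toy fault signal: a 3 kHz resonance amplitude-modulated at a 120 Hz
# "defect" rate; the envelope spectrum should peak near 120 Hz.
fs = 20000
t = np.arange(0, 1.0, 1.0 / fs)
x = (1 + 0.8 * np.cos(2 * np.pi * 120 * t)) * np.sin(2 * np.pi * 3000 * t)
freqs, spec = envelope_spectrum(x, fs)
peak_hz = freqs[np.argmax(spec)]
```

The paper's contribution replaces the raw envelope with information waves built from symptom parameters, but the downstream spectral-peak reasoning is analogous.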

  13. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

    PubMed Central

    Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated the population attributable fraction for each of 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief, “I could quit smoking if my husband or significant other recommended it” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
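A minimal illustration of the population-attributable-fraction idea, using the inverse-probability weighting that underlies marginal structural models. The data, confounder structure, and effect sizes below are simulated for illustration and have no connection to the survey in the abstract.

```python
import numpy as np

def attributable_fraction(y, a, ps):
    """PAF = (P(Y=1) - P(Y=1 if nobody were exposed)) / P(Y=1).

    y  : binary outcome (e.g. 1 = intends to quit smoking)
    a  : binary exposure (e.g. 1 = holds the target belief)
    ps : estimated propensity P(A=1 | confounders) per subject

    The counterfactual risk under a=0 is estimated by weighting the
    unexposed by 1/(1 - ps) -- the IPW estimator used with marginal
    structural models.
    """
    p_obs = y.mean()
    w0 = (a == 0) / (1.0 - ps)             # weights for the unexposed
    p_cf = np.sum(w0 * y) / np.sum(w0)     # risk if exposure were absent
    return (p_obs - p_cf) / p_obs

# Simulated data: confounder L raises both the belief and the outcome.
rng = np.random.default_rng(1)
n = 20000
L = rng.integers(0, 2, n)
ps = 0.3 + 0.4 * L                         # true propensity, assumed known
a = rng.random(n) < ps
y = rng.random(n) < 0.1 + 0.3 * a + 0.1 * L
paf = attributable_fraction(y.astype(float), a, ps)
```

With these parameters the true PAF is 0.5: half of the observed outcome prevalence is attributable to the exposure, which is the kind of quantity the study uses to rank candidate beliefs.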

  14. Nonlinear PET parametric image reconstruction with MRI information using kernel method

    NASA Astrophysics Data System (ADS)

    Gong, Kuang; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi

    2017-03-01

    Positron Emission Tomography (PET) is a functional imaging modality widely used in oncology, cardiology, and neurology. It is highly sensitive, but suffers from relatively poor spatial resolution, as compared with anatomical imaging modalities, such as magnetic resonance imaging (MRI). With the recent development of combined PET/MR systems, we can improve the PET image quality by incorporating MR information. Previously we have used kernel learning to embed MR information in static PET reconstruction and direct Patlak reconstruction. Here we extend this method to direct reconstruction of nonlinear parameters in a compartment model by using the alternating direction of multiplier method (ADMM) algorithm. Simulation studies show that the proposed method can produce superior parametric images compared with existing methods.
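The kernel representation at the heart of this family of methods models the image as x = Kα, with K built from MR-derived feature similarity. The sketch below follows the general kernel-method recipe (Gaussian kernel on MR feature vectors, k-nearest-neighbour sparsification, row normalization); parameter names and values are illustrative, not the paper's.

```python
import numpy as np

def mri_kernel_matrix(features, sigma=1.0, k=8):
    """Build a sparse kernel matrix K from per-voxel MR feature vectors
    (e.g. intensity patches), so a PET image can be modeled as x = K @ alpha.
    """
    n = len(features)
    # pairwise squared distances between feature vectors, shape (n, n)
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian kernel
    for j in range(n):
        idx = np.argsort(d2[j])[k:]        # keep only the k nearest
        K[j, idx] = 0.0
    K /= K.sum(axis=1, keepdims=True)      # row-normalize
    return K

rng = np.random.default_rng(2)
feats = rng.normal(size=(30, 3))           # 30 voxels, 3 MR features each
K = mri_kernel_matrix(feats, sigma=1.0, k=8)
```

In the direct parametric setting of the abstract, the same K is embedded in the forward model and the nonlinear compartment-model parameters are solved for with ADMM.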

  15. An overview of methods and applications to value informal care in economic evaluations of healthcare.

    PubMed

    Koopmanschap, Marc A; van Exel, Job N A; van den Berg, Bernard; Brouwer, Werner B F

    2008-01-01

    This paper compares several applied valuation methods for including informal care in economic evaluations of healthcare programmes: the proxy good method; the opportunity cost method; the contingent valuation method (CVM); conjoint measurement (CM); and valuation of health effects in terms of health-related quality of life (HR-QOL) and well-being. The comparison focuses on three questions: what outcome measures are available for including informal care in economic evaluations of healthcare programmes; whether these measures are compatible with the common types of economic evaluation; and, when applying these measures, whether all relevant aspects of informal care are incorporated. All types of economic evaluation can incorporate a monetary value of informal care (using the opportunity cost method, the proxy good method, CVM and CM) on the cost side of an analysis, but only when the relevant aspects of time costs have been valued. On the effect side of a cost-effectiveness or cost-utility analysis, the health effects (for the patient and/or caregiver) measured in natural units or QALYs can be combined with cost estimates based on the opportunity cost method or the proxy good method. One should be careful when incorporating CVM and CM in cost-minimization, cost-effectiveness and cost-utility analyses, as the health effects of patients receiving informal care and the carers themselves may also have been valued separately. One should determine whether the caregiver valuation exercise allows combination with other valuation techniques. In cost-benefit analyses, CVM and CM appear to be the best tools for the valuation of informal care. When researchers decide to use the well-being method, we recommend applying it in a cost-benefit analysis framework. This method values overall QOL (happiness); hence it is broader than just HR-QOL, which complicates inclusion in traditional health economic evaluations that normally define outcomes more narrowly. Using broader, non

  16. The Effect of Health Information Technology on Health Care Provider Communication: A Mixed-Method Protocol

    PubMed Central

    Adler-Milstein, Julia; Harrod, Molly; Sales, Anne; Hofer, Timothy P; Saint, Sanjay; Krein, Sarah L

    2015-01-01

    Background Communication failures between physicians and nurses are one of the most common causes of adverse events for hospitalized patients, as well as a major root cause of all sentinel events. Communication technology (ie, the electronic medical record, computerized provider order entry, email, and pagers), which is a component of health information technology (HIT), may help reduce some communication failures but increase others because of an inadequate understanding of how communication technology is used. Increasing use of health information and communication technologies is likely to affect communication between nurses and physicians. Objective The purpose of this study is to describe, in detail, how health information and communication technologies facilitate or hinder communication between nurses and physicians with the ultimate goal of identifying how we can optimize the use of these technologies to support effective communication. Effective communication is the process of developing shared understanding between communicators by establishing, testing, and maintaining relationships. Our theoretical model, based in communication and sociology theories, describes how health information and communication technologies affect communication through communication practices (ie, use of rich media; the location and availability of computers) and work relationships (ie, hierarchies and team stability). Therefore we seek to (1) identify the range of health information and communication technologies used in a national sample of medical-surgical acute care units, (2) describe communication practices and work relationships that may be influenced by health information and communication technologies in these same settings, and (3) explore how differences in health information and communication technologies, communication practices, and work relationships between physicians and nurses influence communication. Methods This 4-year study uses a sequential mixed-methods

  17. Application of information-retrieval methods to the classification of physical data

    NASA Technical Reports Server (NTRS)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  19. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    PubMed

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This renders difficult the identification of sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways. Doing so will help us to identify key participants and paths under specific environmental and cellular context. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable, network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network, and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead in a unifying framework for the simultaneous network minimization and information flow analysis across various "omics" levels.
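The sub-network minimization task (a) has a simple reachability core: a node belongs to the minimal sub-network only if it lies on some directed path from a source to a sink. The sketch below implements that reachability filter; the paper's flooding algorithm additionally propagates quantitative flow values, which this sketch omits.

```python
from collections import deque

def reachable(adj, starts):
    """BFS: set of nodes reachable from `starts` in directed graph `adj`."""
    seen = set(starts)
    q = deque(starts)
    while q:
        u = q.popleft()
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                q.append(v)
    return seen

def minimal_subnetwork(adj, sources, sinks):
    """Keep nodes on at least one directed source-to-sink path:
    forward-reachable from sources AND backward-reachable from sinks."""
    rev = {}
    for u, vs in adj.items():
        for v in vs:
            rev.setdefault(v, []).append(u)
    keep = reachable(adj, sources) & reachable(rev, sinks)
    return {u: [v for v in adj.get(u, ()) if v in keep] for u in keep}

# Toy regulatory graph: S -> A -> T, plus a dead-end branch A -> X.
g = {"S": ["A"], "A": ["T", "X"], "X": []}
sub = minimal_subnetwork(g, {"S"}, {"T"})
```

Here the dead-end node X is pruned because nothing downstream of it reaches the sink, mirroring how flooding discards regions that carry no source-to-sink information.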

  20. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    PubMed

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve accuracy of internal genetic evaluations. Two issues of these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation. The developed equations also avoided double-counting of contributions due to relationships and due to records. Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated


  1. A Branch and Bound Method to the Continuous Time Model Elevator System with Full Information

    NASA Astrophysics Data System (ADS)

    Shen, Zhen; Zhao, Qianchuan

A new branch and bound method is given for scheduling a group elevator system with full information. Full information means that not only the parameters of the elevator system but also the arrival times, origins and destinations of all the passengers to be served are known beforehand. The performance obtained by solving the full-information problem is the best performance any elevator scheduling algorithm can achieve, and can therefore be used to measure how good an elevator scheduling algorithm is. The method handles continuous-time events and is based on the concept of a “trip”: the movement of a car without changing direction and with at least one passenger being served.
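A best-first branch and bound skeleton of the kind such schedulers build on can be sketched as follows. The toy instance (per-passenger, per-car service costs) is a deliberate simplification invented for illustration; the paper's state space of trips and its cost/bound functions are far richer.

```python
import heapq

def branch_and_bound(root, branch, bound, cost, is_complete):
    """Generic best-first branch and bound (minimization).

    root        : initial partial solution
    branch(s)   : children of partial solution s
    bound(s)    : admissible lower bound on any completion of s
    cost(s)     : objective of a complete solution
    is_complete : predicate for complete solutions
    """
    best, best_cost = None, float("inf")
    heap = [(bound(root), 0, root)]
    tie = 1                                # tie-breaker for the heap
    while heap:
        b, _, s = heapq.heappop(heap)
        if b >= best_cost:
            continue                       # prune: cannot beat incumbent
        if is_complete(s):
            if cost(s) < best_cost:
                best, best_cost = s, cost(s)
            continue
        for c in branch(s):
            if bound(c) < best_cost:
                heapq.heappush(heap, (bound(c), tie, c))
                tie += 1
    return best, best_cost

# Toy instance: assign 3 passengers to one of 2 cars, minimizing the sum
# of per-(passenger, car) service costs. Costs are nonnegative, so the
# partial sum is an admissible lower bound.
costs = [[3, 1], [2, 2], [4, 1]]
branch = lambda s: [s + (c,) for c in (0, 1)]
bound = lambda s: sum(costs[i][c] for i, c in enumerate(s))
cost = bound                               # complete solutions: same sum
complete = lambda s: len(s) == 3
best, best_cost = branch_and_bound((), branch, bound, cost, complete)
```

The optimum the full-information solver returns (here, total cost 4) is exactly the benchmark value against which heuristic schedulers can be judged.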

  2. Theory based design and optimization of materials for spintronics applications

    NASA Astrophysics Data System (ADS)

    Xu, Tianyi

The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model that enables a practical way to calculate electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of particular interest, and we propose a rational way to find half-metals based on the gap theorem. We then turn to the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and we calculate the transport properties of certain junctions with the help of Green's functions under the Landauer-Büttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on extended Hückel tight-binding (EHTB) theory. The symmetry-filtering effect is very helpful for finding materials for TMR junctions, and based upon it we identify a good candidate material, MnAl, for TMR applications.

  3. Development of the CODER System: A Testbed for Artificial Intelligence Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Fox, Edward A.

    1987-01-01

Discusses the CODER system, which was developed to investigate the application of artificial intelligence methods to increase the effectiveness of information retrieval systems, particularly those involving heterogeneous documents. Highlights include the use of PROLOG programming, blackboard-based designs, knowledge engineering, lexicological…

  4. 76 FR 40448 - 2010 Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-08

    ... Surface Transportation Board 2010 Tax Information for Use in the Revenue Shortfall Allocation Method... Sept. 5, 2007),\\1\\ as further revised in Simplified Standards for Rail Rate Cases--Taxes in Revenue... mistakenly compared pre-tax and after-tax revenues. In that decision, the Board stated that it would...

  5. Method and Apparatus Providing Deception and/or Altered Operation in an Information System Operating System

    DOEpatents

    Cohen, Fred; Rogers, Deanna T.; Neagoe, Vicentiu

    2008-10-14

    A method and/or system and/or apparatus providing deception and/or execution alteration in an information system. In specific embodiments, deceptions and/or protections are provided by intercepting and/or modifying operation of one or more system calls of an operating system.

  6. Genetically Informative Research on Adolescent Substance Use: Methods, Findings, and Challenges

    ERIC Educational Resources Information Center

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

    Objective: To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method: A selective review of genetically informative research strategies, their limitations, and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents is presented.…

  7. Blended Lessons of Teaching Method for Information Studies in Which Students Produce a Learning Guidance Plan

    ERIC Educational Resources Information Center

    Miyaji, Isao

    2013-01-01

    Adopting exercise-making and evaluation activities, we conducted a teaching method of Information Studies which is a teaching-training course subject. We surveyed the learners' recognition rate of terms related to lessons at both the beginning and the end of lessons. Then we tested the significance of the differences between both rates. Those…

  8. THE RELATIVE EFFECTIVENESS OF THE TRADITIONAL AND TWO MODIFIED METHODS OF ORGANIZING INFORMATION SHEETS.

    ERIC Educational Resources Information Center

    PUCEL, DAVID J.

The effectiveness of a typical method of organizing technical information sheets, used by vocational educators to provide up-to-date instruction to students, was compared to that of two newly developed organizations based on "The Subsumption Theory of Meaningful Verbal Learning and Retention" (Ausubel, 1962). An operational definition…

  9. Parenting Practices of Anxious and Nonanxious Mothers: A Multi-Method, Multi-Informant Approach

    ERIC Educational Resources Information Center

    Drake, Kelly L.; Ginsburg, Golda S.

    2011-01-01

    Anxious and nonanxious mothers were compared on theoretically derived parenting and family environment variables (i.e., overcontrol, warmth, criticism, anxious modeling) using multiple informants and methods. Mother-child dyads completed questionnaires about parenting and were observed during an interactional task. Findings reveal that, after…

  10. A Method for Analyzing Volunteered Geographic Information to Visualize Community Valuation of Ecosystem Services

    EPA Science Inventory

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of s...

  11. Genetically Informative Research on Adolescent Substance Use: Methods, Findings, and Challenges

    ERIC Educational Resources Information Center

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

    Objective: To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method: A selective review of genetically informative research strategies, their limitations, and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents is presented.…

  12. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  14. An Inquiry-Based Approach to Teaching Research Methods in Information Studies

    ERIC Educational Resources Information Center

    Albright, Kendra; Petrulis, Robert; Vasconcelos, Ana; Wood, Jamie

    2012-01-01

    This paper presents the results of a project that aimed at restructuring the delivery of research methods training at the Information School at the University of Sheffield, UK, based on an Inquiry-Based Learning (IBL) approach. The purpose of this research was to implement inquiry-based learning that would allow customization of research methods…

  15. Aligning Professional Skills and Active Learning Methods: An Application for Information and Communications Technology Engineering

    ERIC Educational Resources Information Center

    Llorens, Ariadna; Berbegal-Mirabent, Jasmina; Llinàs-Audet, Xavier

    2017-01-01

    Engineering education is facing new challenges to effectively provide the appropriate skills to future engineering professionals according to market demands. This study proposes a model based on active learning methods, which is expected to facilitate the acquisition of the professional skills most highly valued in the information and…

  16. Reduction in redundancy of multichannel telemetric information by the method of adaptive discretization with associative sorting

    NASA Technical Reports Server (NTRS)

    Kantor, A. V.; Timonin, V. G.; Azarova, Y. S.

    1974-01-01

    The method of adaptive discretization is the most promising for elimination of redundancy from telemetry messages characterized by signal shape. Adaptive discretization with associative sorting was considered as a way to avoid the shortcomings of adaptive discretization with buffer smoothing and adaptive discretization with logical switching in on-board information compression devices (OICD) in spacecraft. Mathematical investigations of OICD are presented.

  20. A Proposed Method of Measuring the Utility of Individual Information Retrieval Tools.

    ERIC Educational Resources Information Center

    Meadow, Charles T.

    1996-01-01

    Proposes a method of evaluating information retrieval systems by concentrating on individual tools (commands, their menus or graphic interface equivalents, or a move/stratagem). A user would assess the relative success of a small part of a search, and every tool used in that part would be credited with a contribution to the result. Cumulative…

  1. Development of the CODER System: A Testbed for Artificial Intelligence Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Fox, Edward A.

    1987-01-01

    Discusses the CODER system, which was developed to investigate the application of artificial intelligence methods to increase the effectiveness of information retrieval systems, particularly those involving heterogeneous documents. Highlights include the use of PROLOG programing, blackboard-based designs, knowledge engineering, lexicological…

  2. Moderators of Theory-Based Interventions to Promote Physical Activity in 77 Randomized Controlled Trials.

    PubMed

    Bernard, Paquito; Carayol, Marion; Gourlan, Mathieu; Boiché, Julie; Romain, Ahmed Jérôme; Bortolon, Catherine; Lareyre, Olivier; Ninot, Gregory

    2017-04-01

    A meta-analysis of randomized controlled trials (RCTs) recently showed that theory-based interventions designed to promote physical activity (PA) significantly increased PA behavior. The objective of the present study was to investigate the moderators of the efficacy of these theory-based interventions. Seventy-seven RCTs evaluating theory-based interventions were systematically identified. Sample, intervention, methodology, and theory implementation characteristics were extracted, coded by three duos of independent investigators, and tested as moderators of the intervention effect in a multiple meta-regression model. Three moderators were negatively associated with the efficacy of theory-based interventions on PA behavior: intervention length (≥14 weeks; β = -.22, p = .004), number of experimental patients (β = -.10, p = .002), and global methodological quality score (β = -.08, p = .04). Our findings suggest that the efficacy of theory-based interventions to promote PA may be overestimated because of methodological weaknesses of the RCTs, and that interventions shorter than 14 weeks could maximize the increase in PA behavior.
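The moderator analysis described above boils down to an inverse-variance weighted meta-regression of per-trial effect sizes on trial characteristics. A minimal sketch on synthetic data (the moderator values, coefficients, and standard errors below are illustrative, not the review's):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 77  # number of trials, as in the review

# Hypothetical moderators: long-intervention indicator (>= 14 weeks),
# standardized number of experimental patients, standardized quality score.
X = np.column_stack([
    np.ones(k),
    rng.integers(0, 2, k).astype(float),
    rng.normal(0.0, 1.0, k),
    rng.normal(0.0, 1.0, k),
])
beta_true = np.array([0.35, -0.22, -0.10, -0.08])  # signs as in the abstract
se = rng.uniform(0.05, 0.3, k)                     # per-trial standard errors
y = X @ beta_true + rng.normal(0.0, se)            # observed effect sizes

# Inverse-variance weighted least squares (fixed-effect meta-regression):
# beta_hat = (X' W X)^{-1} X' W y with W = diag(1/se^2).
W = np.diag(1.0 / se**2)
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

With well-separated weights and 77 trials, the recovered coefficients land close to the generating values, which is the sense in which the meta-regression attributes effect-size differences to moderators.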

  3. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign.

    PubMed

    Nishiuchi, Hiromu; Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

    The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior, using a new method to estimate the possible effect size of a small set of beliefs. Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior, and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated population attributable fractions for 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief "I could quit smoking if my husband or significant other recommended it" suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02-0.23). Messages targeting this belief could improve intention rates by up to 12% in this population. The analysis also suggested the potential for regulatory action. This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health communication strategies, and further research is needed.
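The population-attributable-fraction logic can be illustrated with a toy standardization over a single binary confounder. All probabilities below are invented for illustration; the paper's marginal structural model additionally adjusts over several socioeconomic covariates:

```python
# P(Z): distribution of a binary confounder (e.g., a socioeconomic stratum).
pZ = {1: 0.4, 0: 0.6}
# P(B=1|Z): prevalence of the targeted belief within each stratum.
pB_given_Z = {1: 0.5, 0: 0.3}
# P(Y=1|B,Z): probability of intending to quit, by belief and stratum.
pY = {(1, 1): 0.30, (1, 0): 0.20, (0, 1): 0.15, (0, 0): 0.08}

# Observed intention rate: average over belief status and stratum.
p_obs = sum(
    pZ[z] * (pB_given_Z[z] * pY[(1, z)] + (1 - pB_given_Z[z]) * pY[(0, z)])
    for z in (0, 1)
)

# Counterfactual rate if everyone held the belief, standardized over Z.
p_do1 = sum(pZ[z] * pY[(1, z)] for z in (0, 1))

# Fraction of the achievable intention rate attributable to the missing belief.
paf = (p_do1 - p_obs) / p_do1
```

Here `p_obs` = 0.1596 and `p_do1` = 0.24, so `paf` ≈ 0.335: about a third of the achievable intention rate is attributable to the belief not being held, which is the kind of quantity the campaign planners rank beliefs by.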

  4. A Comparison of Limited-Information and Full-Information Methods in Mplus for Estimating Item Response Theory Parameters for Nonnormal Populations

    ERIC Educational Resources Information Center

    DeMars, Christine E.

    2012-01-01

    In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…

  6. Methods study for the relocation of visual information in central scotoma cases

    NASA Astrophysics Data System (ADS)

    Scherlen, Anne-Catherine; Gautier, Vincent

    2005-03-01

    In this study we test the benefit to reading performance of different ways of relocating the visual information hidden under the scotoma. Relocation (or unmasking) compensates for the loss of information and prevents the patient from developing reading strategies that are poorly adapted to the task. Eight healthy subjects were tested on a reading task while central scotomas of various sizes were simulated. We then evaluated reading speed (words/min) under three relocation methods: all masked information relocated on both sides of the scotoma, all masked information relocated to the right of the scotoma, and only the letters essential for word recognition relocated to the right of the scotoma. These reading speeds were compared against the pathological condition, i.e., without relocating visual information. Our results show that unmasking improves reading speed when all the visual information is unmasked to the right of the scotoma, but only for large scotomas. Given word morphology, perceiving only certain letters outside the scotoma can be sufficient to improve reading speed. A deeper understanding of reading processes in the presence of a scotoma will then open new perspectives for visual information unmasking. The multidisciplinary competences of engineers, ophthalmologists, linguists, and clinicians should make it possible to optimize the reading benefit of unmasking.

  7. Development of a theory-based (PEN-3 and Health Belief Model), culturally relevant intervention on cervical cancer prevention among Latina immigrants using intervention mapping.

    PubMed

    Scarinci, Isabel C; Bandura, Lisa; Hidalgo, Bertha; Cherrington, Andrea

    2012-01-01

    The development of efficacious theory-based, culturally relevant interventions to promote cervical cancer prevention among underserved populations is crucial to the elimination of cancer disparities. The purpose of this article is to describe the development of a theory-based, culturally relevant intervention focusing on primary (sexual risk reduction) and secondary (Pap smear) prevention of cervical cancer among Latina immigrants using intervention mapping (IM). The PEN-3 and Health Belief Model provided theoretical guidance for the intervention development and implementation. IM provides a logical five-step framework in intervention development: delineating proximal program objectives, selecting theory-based intervention methods and strategies, developing a program plan, planning for adoption in implementation, and creating evaluation plans and instruments. We first conducted an extensive literature review and qualitatively examined the sociocultural factors associated with primary and secondary prevention of cervical cancer. We then proceeded to quantitatively validate the qualitative findings, which led to development matrices linking the theoretical constructs with intervention objectives and strategies as well as evaluation. IM was a helpful tool in the development of a theory-based, culturally relevant intervention addressing primary and secondary prevention among Latina immigrants.

  8. Development of a Theory-Based (PEN-3 and Health Belief Model), Culturally Relevant Intervention on Cervical Cancer Prevention Among Latina Immigrants Using Intervention Mapping

    PubMed Central

    Scarinci, Isabel C.; Bandura, Lisa; Hidalgo, Bertha; Cherrington, Andrea

    2014-01-01

    The development of efficacious theory-based, culturally relevant interventions to promote cervical cancer prevention among underserved populations is crucial to the elimination of cancer disparities. The purpose of this article is to describe the development of a theory-based, culturally relevant intervention focusing on primary (sexual risk reduction) and secondary (Pap smear) prevention of cervical cancer among Latina immigrants using intervention mapping (IM). The PEN-3 and Health Belief Model provided theoretical guidance for the intervention development and implementation. IM provides a logical five-step framework in intervention development: delineating proximal program objectives, selecting theory-based intervention methods and strategies, developing a program plan, planning for adoption in implementation, and creating evaluation plans and instruments. We first conducted an extensive literature review and qualitatively examined the socio-cultural factors associated with primary and secondary prevention of cervical cancer. We then proceeded to quantitatively validate the qualitative findings, which led to development matrices linking the theoretical constructs with intervention objectives and strategies as well as evaluation. IM was a helpful tool in the development of a theory-based, culturally relevant intervention addressing primary and secondary prevention among Latina immigrants. PMID:21422254

  9. Metaproteomics of soils from semiarid environment: functional and phylogenetic information obtained with different protein extraction methods.

    PubMed

    Bastida, F; Hernández, T; García, C

    2014-04-14

    Microbial populations fulfil a critical role in soil sustainability, and their functionality can be ascertained by proteomics based on high-performance mass spectrometry (MS) measurements. However, soil proteomics is compromised by methodological issues, among which extraction is a limiting factor, and it still has not been adequately applied in semiarid soils, which are usually nutrient limited. We aim to evaluate the functional and phylogenetic information retrieved from three semiarid soils with distinct edaphic properties and degradation levels. Three extraction methods with different physico-chemical bases were tested [1-3]. HPLC amino acid quantification of the extracted protein pellets revealed a tremendous inefficiency of the extraction methods, with at most 6.8% of the proteinaceous material being extracted, compared with the protein content of the bulk soil. The composition of the extracted proteomes was analysed after SDS-PAGE and liquid chromatography coupled with electrospray MS/MS. Chourey's method, based on boiling and DTT, yielded a high diversity of bacterial proteins and revealed differences in community composition at the phylum level among the three soils. The overall metabolic information obtained by both extraction methods was similar, but Chourey's method additionally provided valuable biogeochemical insights, which suggest an ecological adaptation of microbial communities of semiarid soils for carbon and nitrogen fixation. Microbial communities inhabiting the soil perform reactions critical to the sustainability of the planet. At the biochemical level, soil proteomics is starting to provide incipient insights into the microbial functionality of soils. However, methodological comparisons are needed to assess which methods are most suitable; precisely such information is missing for arid and semiarid environments. By using amino acid quantification of extracted proteomes and LC-MS/MS based proteomics, we provide a novel

  10. An innovative method for extracting isotopic information from low-resolution gamma spectra

    SciTech Connect

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-12-01

    A method is described for the extraction of isotopic information from attenuated gamma ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low burnup Pu, {sup 137}Cs, and {sup 133}Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS method results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied.
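The "nonlinear in only one variable" structure lends itself to a short sketch: for any candidate absorber thickness the two-isotope mixture is linear and can be solved by least squares, so only the thickness needs a line search. Everything below (basis spectra, attenuation curve, grid search in place of a proper line-optimization routine) is synthetic, standing in for the GC-MBS response functions:

```python
import numpy as np

E = np.linspace(50, 1500, 64)                  # hypothetical energy bins (keV)
S1 = np.exp(-0.5 * ((E - 186) / 40) ** 2)      # stand-in "basis" spectra for
S2 = np.exp(-0.5 * ((E - 662) / 40) ** 2)      # two isotopes (not real data)
mu = 1.0 / np.sqrt(E)                          # toy attenuation coefficient

def model(f, t):
    # Mixture of the two basis spectra behind an absorber of thickness t.
    return np.exp(-mu * t) * (f * S1 + (1 - f) * S2)

f_true, t_true = 0.3, 4.0
obs = model(f_true, t_true)                    # "measured" spectrum

def misfit(t):
    # For a fixed thickness the mixture amplitudes are linear: solve them by
    # least squares, then report the residual, so the search is 1-D in t.
    A = np.column_stack([np.exp(-mu * t) * S1, np.exp(-mu * t) * S2])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    return np.sum((A @ coef - obs) ** 2), coef

# Simple 1-D grid search standing in for a line-optimization routine.
ts = np.linspace(0.0, 10.0, 1001)
best_t = min(ts, key=lambda t: misfit(t)[0])
_, coef = misfit(best_t)
f_hat = coef[0] / coef.sum()                   # recovered isotopic fraction
```

On this noiseless toy problem the search recovers the thickness and the 30/70 isotopic split exactly, which is the essence of fitting a binary blend behind an unknown absorber.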

  11. Preferred Methods for Delivery of Technological Information by the North Carolina Agricultural Extension Service: Opinions of Agricultural Producers Who Use Extension Information.

    ERIC Educational Resources Information Center

    Richardson, John G.; Mustian, R. David

    The findings of a questionnaire survey of 702 North Carolina agricultural producers indicated that communication methods historically used by the North Carolina Agricultural Extension Service for information dissemination are accepted by state farmers and continue to be popular. Information delivery methods most frequently preferred are…

  12. An information processing method for acoustic emission signal inspired from musical staff

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Wu, Chunxian

    2016-01-01

    This study proposes a musical-staff-inspired signal processing method that provides a standardized description of discrete signals and captures the integrated characteristics of acoustic emission (AE) signals. The method maps AE signals acquired in complex environments into a normalized musical space. Four new indexes are proposed to comprehensively describe the signal. Several key features, such as contour, amplitude, and signal changing rate, are quantitatively expressed in the normalized musical space. The processed information requires only a small storage space while maintaining high fidelity. The method is illustrated with experiments on sandstones and computed tomography (CT) scanning to determine its validity for AE signal processing.

  13. A Method of Tomato Image Segmentation Based on Mutual Information and Threshold Iteration

    NASA Astrophysics Data System (ADS)

    Wu, Hongxia; Li, Mingxi

    Threshold segmentation is an important image segmentation method and one of the key preprocessing steps in image detection and recognition, with very broad application in computer vision research. Based on the internal relation between the segmented image and the original image, a tomato image automatic optimization segmentation method (MI-OPT), which combines mutual information with optimal threshold iteration, is presented. Simulation results show that the method achieves a better segmentation effect on images of mature-stage tomatoes with little background color difference or with different colors.
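The threshold-iteration half of such a scheme can be sketched in a few lines; the mutual-information criterion of MI-OPT is omitted here, and the image is a synthetic bright-object-on-dark-background stand-in for a tomato photo:

```python
import numpy as np

def iterative_threshold(img, tol=0.5):
    """Classic threshold iteration (Ridler-Calvard style): repeatedly set the
    threshold to the midpoint of the two class means until it stabilizes."""
    t = img.mean()
    while True:
        fg, bg = img[img > t], img[img <= t]
        t_new = 0.5 * (fg.mean() + bg.mean())
        if abs(t_new - t) < tol:
            return t_new
        t = t_new

# Toy "tomato" image: a bright object (~200) on a darker background (~60).
rng = np.random.default_rng(1)
img = rng.normal(60, 10, (64, 64))
img[20:44, 20:44] = rng.normal(200, 10, (24, 24))

t = iterative_threshold(img)
mask = img > t          # binary segmentation of the bright object
```

With two well-separated intensity modes the iteration settles near the midpoint of the class means (about 130 here), cleanly isolating the 24x24 object.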

  14. Development of a Method to Obtain More Accurate General and Oral Health Related Information Retrospectively

    PubMed Central

    Golkari, A; Sabokseir, A; Blane, D; Sheiham, A; Watt, RG

    2017-01-01

    Statement of Problem: Early childhood is a crucial period of life, as it affects one’s future health. However, precise data on adverse events during this period are usually hard to access or collect, especially in developing countries. Objectives: This paper first reviews the existing methods for retrospective data collection in health and social sciences, and then introduces a new method/tool for obtaining more accurate general and oral health related information from early childhood retrospectively. Materials and Methods: The Early Childhood Events Life-Grid (ECEL) was developed to collect information on the type and time of health-related adverse events during the early years of life by questioning the parents. The validity of ECEL and the accuracy of the information obtained by this method were assessed in a pilot study and in a main study of 30 parents of 8- to 11-year-old children from Shiraz, Iran. Responses obtained from parents using the final ECEL were compared with recorded health insurance documents. Results: There was an almost perfect agreement between the health insurance and ECEL data sets (kappa value = 0.95, p < 0.001). Interviewees remembered the important events more accurately (100% exact timing match in the case of hospitalization). Conclusions: The Early Childhood Events Life-Grid method proved to be highly accurate when compared with recorded medical documents. PMID:28959773
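The agreement statistic reported above (kappa = 0.95) treats the parent's ECEL recall and the insurance records as two raters and measures agreement beyond chance. A self-contained sketch with invented yes/no data for 30 parents (not the study's data):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two raters corrected for chance."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                   # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)  # chance level
    return (po - pe) / (1 - pe)

# Hypothetical data: event recorded in insurance documents vs. parental recall
# (1 = adverse event, 0 = none); one recorded event goes unrecalled.
records = [1] * 12 + [0] * 18
recall  = [1] * 11 + [0] * 1 + [0] * 18

k = cohens_kappa(records, recall)
```

With 29 of 30 matches, kappa comes out near 0.93: high but below raw agreement, since kappa discounts the matches expected by chance alone.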

  15. Real-time flood forecasts & risk assessment using a possibility-theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, U. T.

    2016-12-01

    Globally, floods are among the most devastating natural disasters, and improved flood forecasting methods are essential for better flood protection in urban areas. Given the availability of high-resolution real-time datasets for flood variables (e.g. streamflow and precipitation) in many urban areas, data-driven models have been used effectively to predict peak flow rates in rivers; however, the selection of input parameters for these types of models is often subjective. Additionally, the inherent uncertainty associated with data-driven models, along with errors in extreme event observations, means that uncertainty quantification is essential. Addressing these concerns will enable improved flood forecasting methods and provide more accurate flood risk assessments. In this research, a new type of data-driven model, a quasi-real-time updating fuzzy neural network, is developed to predict peak flow rates in urban riverine watersheds. A possibility-to-probability transformation is first used to convert observed data into fuzzy numbers. A possibility theory based training regime is then used to construct the fuzzy parameters and the outputs. A new entropy-based optimisation criterion is used to train the network. Two existing methods to select the optimum input parameters are modified to account for fuzzy number inputs, and compared. These methods are: Entropy-Wavelet-based Artificial Neural Network (EWANN) and Combined Neural Pathway Strength Analysis (CNPSA). Finally, an automated algorithm designed to select the optimum structure of the neural network is implemented. The overall effect of these components is to replace traditional ad hoc network configuration methods with ones based on objective criteria. Ten years of data from the Bow River in Calgary, Canada (including two major floods in 2005 and 2013) are used to calibrate and test the network. The EWANN method selected lagged peak flow as a candidate input, whereas the CNPSA method selected lagged

  16. Method for Bandwidth Compression and Transmission of Environmental Information in Bilateral Teleoperation

    NASA Astrophysics Data System (ADS)

    Kubo, Ryogo; Ohnishi, Kouhei

    In this paper, a novel method for bandwidth compression and transmission of environmental information is proposed for bilateral teleoperation systems with multiple degrees of freedom (MDOF). In this method, environmental information, i.e., the position of end-effectors and the reaction force exerted on them, is converted into environmental modes by using discrete Fourier transform (DFT) matrices. The environmental modes to be transmitted are then selected on the basis of the communication bandwidth between master and slave robots. Bilateral control is achieved in low-frequency modal spaces, and local position control is achieved in high-frequency modal spaces. The validity of the proposed method is confirmed by performing an experiment.
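The modal decomposition can be sketched with a unitary DFT matrix: transform the per-DOF values into "environmental modes", transmit only the low-frequency ones over the narrow link, and reconstruct on the other side. The DOF count and the choice of three retained modes below are arbitrary illustrations:

```python
import numpy as np

n = 8                                   # hypothetical number of DOF
rng = np.random.default_rng(2)
x = rng.normal(size=n)                  # end-effector positions (or forces)

# Unitary DFT matrix: maps joint-space values to environmental modes.
# (The DFT matrix is symmetric, so its inverse is its elementwise conjugate.)
W = np.fft.fft(np.eye(n), norm="ortho")
modes = W @ x

# Transmit only the DC and lowest +/- frequency modes over the narrow link;
# the remaining high-frequency modes stay under local control at each side.
mask = np.zeros(n, dtype=bool)
mask[[0, 1, n - 1]] = True
x_low = np.real(W.conj() @ np.where(mask, modes, 0))   # inverse DFT
```

Because the transform is unitary, keeping all modes reconstructs the signal exactly and the modal energies sum to the joint-space energy; dropping the high-frequency modes trades reconstruction error for bandwidth, which is the compression mechanism the abstract describes.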

  17. Research on the method of information system risk state estimation based on clustering particle filter

    NASA Astrophysics Data System (ADS)

    Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua

    2017-05-01

    To reinforce the correlation analysis of risk assessment threat factors, a dynamic safety-risk assessment method based on particle filtering, with threat analysis at its core, is proposed. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining this with state estimation theory. To improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: all particles are clustered and each cluster's centroid serves as its representative in subsequent operations, reducing the amount of computation. Empirical results indicate that the method reasonably captures the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management and control strategy.
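The particle-reduction idea can be sketched as a weighted k-means over a 1-D risk-state posterior, with cluster centroids standing in for the full particle cloud. The three risk levels and all numbers are invented, and the centroids are seeded from quantiles for determinism:

```python
import numpy as np

def kmeans_reduce(particles, weights, k, iters=20):
    """Cluster the particles and return weighted centroids, so the filter
    can propagate k representatives instead of the whole particle cloud."""
    # Deterministic init: spread initial centers over empirical quantiles.
    centers = np.quantile(particles, (np.arange(k) + 0.5) / k)
    for _ in range(iters):
        # Assign each particle to its nearest centroid.
        labels = np.abs(particles[:, None] - centers[None, :]).argmin(axis=1)
        # Move each centroid to the weighted mean of its particles.
        for j in range(k):
            in_j = labels == j
            if in_j.any():
                centers[j] = np.average(particles[in_j], weights=weights[in_j])
    cw = np.array([weights[labels == j].sum() for j in range(k)])
    return centers, cw / cw.sum()

# Hypothetical posterior over an information-system risk state with three
# well-separated risk levels (low / medium / high).
rng = np.random.default_rng(3)
particles = np.concatenate([rng.normal(m, 0.1, 300) for m in (1.0, 2.0, 3.0)])
weights = np.full(particles.size, 1.0 / particles.size)

centers, cw = kmeans_reduce(particles, weights, k=3)
```

The 900 particles collapse to three weighted centroids near the risk levels 1.0, 2.0, and 3.0, each carrying about a third of the posterior mass, which is the computational saving the abstract aims for.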

  18. Lost information during the handover of critically injured trauma patients: a mixed-methods study.

    PubMed

    Zakrison, Tanya Liv; Rosenbloom, Brittany; McFarlan, Amanda; Jovicic, Aleksandra; Soklaridis, Sophie; Allen, Casey; Schulman, Carl; Namias, Nicholas; Rizoli, Sandro

    2016-12-01

    Clinical information may be lost during the transfer of critically injured trauma patients from the emergency department (ED) to the intensive care unit (ICU). The aim of this study was to investigate the causes and frequency of information discrepancies during handover and to explore solutions for improving information transfer. A mixed-methods research approach was used at our level I trauma centre. Information discrepancies between the ED and the ICU were measured using chart audits. Descriptive, parametric and non-parametric statistics were applied, as appropriate. Six focus groups of 46 ED and ICU nurses and nine individual interviews of trauma team leaders were conducted to explore solutions to improve information transfer using thematic analysis. Chart audits demonstrated that injuries were missed in 24% of patients. Clinical information discrepancies occurred in 48% of patients. Patients with these discrepancies were more likely to have unknown medical histories (p<0.001) requiring information rescue (p<0.005). Close to one in three patients with information rescue had a change in clinical management (p<0.01). Participants identified challenges according to their disciplines, with some overlap. Physicians, in contrast to nurses, were perceived as less aware of interdisciplinary stress and of their role in handover variability. Standardising handover, increasing non-technical physician training and understanding unit cultures were proposed as solutions, with nurses as drivers of a culture of safety. Trauma patient information was lost during handover from the ED to the ICU for multiple reasons. An interprofessional approach to improving handover, through cross-unit familiarisation and the use of communication tools, was proposed. Going beyond traditional geographical and temporal boundaries was deemed important for improving patient safety during the ED to ICU handover. Published by the BMJ Publishing Group Limited.

  19. Improvements in recall and food choices using a graphical method to deliver information of select nutrients.

    PubMed

    Pratt, Nathan S; Ellison, Brenna D; Benjamin, Aaron S; Nakamura, Manabu T

    2016-01-01

    Consumers have difficulty using nutrition information. We hypothesized that graphically delivering information of select nutrients relative to a target would allow individuals to process information in time-constrained settings more effectively than numerical information. Objectives of the study were to determine the efficacy of the graphical method in (1) improving memory of nutrient information and (2) improving consumer purchasing behavior in a restaurant. Values of fiber and protein per calorie were 2-dimensionally plotted alongside a target box. First, a randomized cued recall experiment was conducted (n=63). Recall accuracy of nutrition information improved by up to 43% when shown graphically instead of numerically. Second, the impact of graphical nutrition signposting on diner choices was tested in a cafeteria. Saturated fat and sodium information was also presented using color coding. Nutrient content of meals (n=362) was compared between 3 signposting phases: graphical, nutrition facts panels (NFP), or no nutrition label. Graphical signposting improved nutrient content of purchases in the intended direction, whereas NFP had no effect compared with the baseline. Calories ordered from total meals, entrées, and sides were significantly less during graphical signposting than no-label and NFP periods. For total meal and entrées, protein per calorie purchased was significantly higher and saturated fat significantly lower during graphical signposting than the other phases. Graphical signposting remained a predictor of calories and protein per calorie purchased in regression modeling. These findings demonstrate that graphically presenting nutrition information makes that information more available for decision making and influences behavior change in a realistic setting. Copyright © 2016 Elsevier Inc. All rights reserved.
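The graphical signpost reduces each item to two per-calorie coordinates checked against a target box. A toy illustration of that mapping (the cutoffs and foods are invented, not those used in the study):

```python
# Hypothetical foods: (name, protein g, fiber g, kcal).
foods = [("oatmeal", 5, 4, 150), ("soda", 0, 0, 140), ("lentils", 9, 8, 115)]

# Assumed target box: minimum grams of protein and fiber per calorie.
TARGET = {"protein": 0.025, "fiber": 0.02}

def in_target(protein, fiber, kcal):
    # A food "hits the box" when both per-calorie coordinates clear the cutoffs.
    return (protein / kcal >= TARGET["protein"]
            and fiber / kcal >= TARGET["fiber"])

hits = [name for name, p, f, c in foods if in_target(p, f, c)]
```

Plotting these two coordinates against the box is what lets a diner judge an item at a glance, without reading numeric panels.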

  20. The Effect of Health Information Technology on Health Care Provider Communication: A Mixed-Method Protocol.

    PubMed

    Manojlovich, Milisa; Adler-Milstein, Julia; Harrod, Molly; Sales, Anne; Hofer, Timothy P; Saint, Sanjay; Krein, Sarah L

    2015-06-11

    Communication failures between physicians and nurses are one of the most common causes of adverse events for hospitalized patients, as well as a major root cause of all sentinel events. Communication technology (ie, the electronic medical record, computerized provider order entry, email, and pagers), which is a component of health information technology (HIT), may help reduce some communication failures but increase others because of an inadequate understanding of how communication technology is used. Increasing use of health information and communication technologies is likely to affect communication between nurses and physicians. The purpose of this study is to describe, in detail, how health information and communication technologies facilitate or hinder communication between nurses and physicians with the ultimate goal of identifying how we can optimize the use of these technologies to support effective communication. Effective communication is the process of developing shared understanding between communicators by establishing, testing, and maintaining relationships. Our theoretical model, based in communication and sociology theories, describes how health information and communication technologies affect communication through communication practices (ie, use of rich media; the location and availability of computers) and work relationships (ie, hierarchies and team stability). Therefore we seek to (1) identify the range of health information and communication technologies used in a national sample of medical-surgical acute care units, (2) describe communication practices and work relationships that may be influenced by health information and communication technologies in these same settings, and (3) explore how differences in health information and communication technologies, communication practices, and work relationships between physicians and nurses influence communication. This 4-year study uses a sequential mixed-methods design, beginning with a

  1. Identification of depth information with stereoscopic mammography using different display methods

    NASA Astrophysics Data System (ADS)

    Morikawa, Takamitsu; Kodera, Yoshie

    2013-03-01

    Stereoscopy in radiography was widely used in the late 80's because it could capture complex structures in the human body, proving beneficial for diagnosis and screening. When observing images stereoscopically, radiologists usually needed to train their eyes to perceive the stereoscopic effect. However, with the development of three-dimensional (3D) monitors and their adoption in the medical field, such eye training is no longer required. The question then arises as to whether there is any difference in recognizing depth information between conventional methods and a 3D monitor. We constructed a phantom and evaluated the difference in the capacity to identify depth information between the two methods. The phantom consists of acrylic steps with 3 mm diameter acrylic pillars on the top and bottom of each step. Seven observers viewed these images stereoscopically using the two display methods and were asked to judge the direction of the pillar that was on top. We compared the judged directions with the directions of the real pillars arranged on top, and calculated the percentage of correct answers (PCA). The results showed that the PCA obtained using the 3D monitor method was about 5% higher than that obtained using the naked-eye method. This indicates that people can view images stereoscopically more precisely using a 3D monitor than with conventional methods, such as crossed or parallel eye viewing. We were thus able to estimate the difference in the capacity to identify depth information between the two display methods.

  2. “Please Don’t Send Us Spam!” A Participative, Theory-Based Methodology for Developing an mHealth Intervention

    PubMed Central

    2016-01-01

    Background Mobile health solutions have the potential of reducing burdens on health systems and empowering patients with important information. However, there is a lack of theory-based mHealth interventions. Objective The purpose of our study was to develop a participative, theory-based, mobile phone, audio messaging intervention attractive to recently circumcised men at voluntary medical male circumcision (VMMC) clinics in the Cape Town area in South Africa. We aimed to shift some of the tasks related to postoperative counselling on wound management and goal setting on safe sex. We place an emphasis on describing the full method of message generation to allow for replication. Methods We developed an mHealth intervention using a staggered qualitative methodology: (1) focus group discussions with 52 recently circumcised men and their partners to develop initial voice messages they felt were relevant and appropriate, (2) thematic analysis and expert consultation to select the final messages for pilot testing, and (3) cognitive interviews with 12 recent VMMC patients to judge message comprehension and rank the messages. Message content and phasing were guided by the theory of planned behavior and the health action process approach. Results Patients and their partners came up with 245 messages they thought would help men during the wound-healing period. Thematic analysis revealed 42 different themes. Expert review and cognitive interviews with more patients resulted in 42 messages with a clear division in terms of needs and expectations between the initial wound-healing recovery phase (weeks 1–3) and the adjustment phase (weeks 4–6). Discussions with patients also revealed potential barriers to voice messaging, such as lack of technical knowledge of mobile phones and concerns about the invasive nature of the intervention. Patients’ own suggested messages confirmed Ajzen’s theory of planned behavior that if a health promotion intervention can build trust and be

  3. Application of Multi-Sensor Information Fusion Method Based on Rough Sets and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Xue, Jinxue; Wang, Guohu; Wang, Xiaoqiang; Cui, Fengkui

    In order to improve the precision and data processing speed of multi-sensor information fusion, a multi-sensor data fusion algorithm is studied in this paper. First, rough set theory (RS) is used to perform attribute reduction on the parameter set, exploiting its strengths in handling large amounts of data to eliminate redundant information. Then, the reduced data are trained and classified by a Support Vector Machine (SVM). Experimental results showed that this method can improve the speed and accuracy of a multi-sensor fusion system.
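The attribute-reduction step described in this abstract can be illustrated with a toy sketch (not the authors' algorithm; `find_reduct` and the decision table are hypothetical, and a full rough-set reduct would preserve the positive region of a decision attribute rather than the plain indiscernibility relation used here for brevity). It searches for the smallest attribute subset inducing the same indiscernibility classes as the full table; the reduced features would then be fed to an SVM classifier.

```python
from itertools import combinations

def partition(rows, attrs):
    """Group row indices into indiscernibility classes over the given attributes."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return set(frozenset(c) for c in classes.values())

def is_reduct(rows, attrs, full):
    # attrs suffice if they induce the same partition as the full attribute set
    return partition(rows, attrs) == partition(rows, full)

def find_reduct(rows):
    """Smallest attribute subset preserving the indiscernibility relation."""
    full = tuple(range(len(rows[0])))
    for k in range(1, len(full) + 1):
        for attrs in combinations(full, k):
            if is_reduct(rows, attrs, full):
                return attrs
    return full

# toy decision table: attribute 2 duplicates attribute 0, so it is redundant
rows = [(0, 1, 0), (1, 0, 1), (1, 1, 1), (0, 0, 0)]
print(find_reduct(rows))  # → (0, 1)
```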

  4. Iterative development of Vegethon: a theory-based mobile app intervention to increase vegetable consumption.

    PubMed

    Mummah, Sarah A; King, Abby C; Gardner, Christopher D; Sutton, Stephen

    2016-08-08

    ). Vegethon is a theory-based, user-informed mobile intervention that was systematically developed using the IDEAS framework. Vegethon targets increased vegetable consumption among overweight adults and is currently being evaluated in a randomized controlled efficacy trial. Clinical Trials.gov: NCT01826591.

  5. Revisited: The South Dakota Board of Nursing theory-based regulatory decisioning model.

    PubMed

    Damgaard, Gloria; Bunkers, Sandra Schmidt

    2012-07-01

    The authors of this column describe the South Dakota Board of Nursing's 11-year journey utilizing a humanbecoming theory-based regulatory decisioning model. The column revisits the model with an emphasis on the cocreation of a strategic plan guiding the work of the South Dakota Board of Nursing through 2014. The strategic plan was influenced by the latest refinements of the humanbecoming postulates and the humanbecoming community change concepts. A graphic picture of the decisioning model is presented along with future plans for the theory-based model.

  6. Provision of information about newborn screening antenatally: a sequential exploratory mixed-methods project.

    PubMed

    Ulph, Fiona; Wright, Stuart; Dharni, Nimarta; Payne, Katherine; Bennett, Rebecca; Roberts, Stephen; Walshe, Kieran; Lavender, Tina

    2017-10-01

    Participation in the UK Newborn Bloodspot Screening Programme (NBSP) requires parental consent but concerns exist about whether or not this happens in practice and the best methods and timing to obtain consent at reasonable cost. To collate all possible modes of prescreening communication and consent for newborn (neonatal) screening (NBS); examine midwives', screening professionals' and users' views about the feasibility, efficiency and impact on understanding of each; measure midwives' and parents' preferences for information provision; and identify key drivers of cost-effectiveness for alternative modes of information provision. Six study designs were used: (1) realist review - to generate alternative communication and consent models; (2) qualitative interviews with parents and health professionals - to examine the implications of current practice for understanding and views on alternative models; (3) survey and observation of midwives - to establish current costs; (4) stated preference surveys with midwives, parents and potential future parents - to establish preferences for information provision; (5) economic analysis - to identify cost-effectiveness drivers of alternative models; and (6) stakeholder validation focus groups and interviews - to examine the acceptability, views and broader impact of alternative communication and consent models. Providers and users of NBS in England. Study 2: 45 parents and 37 health professionals; study 3: 22 midwives and eight observations; study 4: 705 adults aged 18-45 years and 134 midwives; and study 6: 12 health-care professionals and five parents. The realist review identified low parental knowledge and evidence of coercive consent practices. Interview, focus group and stated preference data suggested a preference for full information, with some valuing this more than choice. 
Health professionals preferred informed-choice models, but parents and health professionals queried whether or not current consent was fully informed.

  7. An Improved Aerial Remote Sensing Image Defogging Method Based on Dark Channel Prior Information

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Feng, W.; Wang, T.; Zhang, Y.; Ding, L.

    2017-09-01

    Aerial remote sensing images are widely used due to their high resolution, abundant information and convenient processing. However, image quality is easily degraded by clouds and fog. In recent years, fog and haze air pollution has become increasingly serious in northern China, and its influence on aerial remote sensing image quality is especially obvious. Considering that aerial remote sensing images usually involve huge amounts of data and seldom cover sky areas, this paper proposes an improved aerial remote sensing image defogging method based on dark channel prior information. First, a 2% linear stretch is applied to eliminate the haze offset and provide a better initial value for subsequent defogging. Then the dark channel prior image is obtained by taking the minimum of the r, g, b channels of each pixel directly. Subsequently, given the particularities of aerial images, an adaptive threshold t0 is set to improve the defogging effect. Finally, to correct the color cast, an automatic color method is introduced to enhance the visual effect of the defogged image. Experiments are performed on ordinary foggy images and on foggy aerial remote sensing images. The results show that the proposed method obtains defogged images with better visual effect and image quality. Moreover, the improved method significantly balances the color information in the defogged image and efficiently avoids color cast.

  8. A Low-Storage-Consumption XML Labeling Method for Efficient Structural Information Extraction

    NASA Astrophysics Data System (ADS)

    Liang, Wenxin; Takahashi, Akihiro; Yokota, Haruo

    Recently, labeling methods that extract and reconstruct the structural information of XML data, which is important for many applications such as XPath query and keyword search, have become more attractive. To achieve efficient structural information extraction, in this paper we propose the C-DO-VLEI code, a novel update-friendly bit-vector encoding scheme based on register-length bit operations combined with the properties of Dewey Order numbers, which cannot be implemented in other relevant existing schemes such as ORDPATH. Meanwhile, the proposed method also achieves lower storage consumption because it requires neither a prefix schema nor any reserved codes for node insertion. We performed experiments to evaluate and compare the performance and storage consumption of the proposed method with those of the ORDPATH method. Experimental results show that the execution times for extracting depth information and parent node labels using the C-DO-VLEI code are about 25% and 15% less, respectively, and the average label size is about 24% smaller, compared with ORDPATH.
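The two structural queries benchmarked in this abstract, depth extraction and parent-label extraction, are easy to see on plain dotted Dewey Order labels (this toy uses string labels, not the paper's C-DO-VLEI bit-vector encoding; `depth` and `parent` are illustrative names):

```python
def depth(label):
    """Depth of a node = number of components in its Dewey label."""
    return label.count(".") + 1

def parent(label):
    """Label of the parent node, or None for the root."""
    return label.rsplit(".", 1)[0] if "." in label else None

# node 1.3.2 is the 2nd child of node 1.3, which is the 3rd child of root 1
print(depth("1.3.2"), parent("1.3.2"))  # → 3 1.3
```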

  9. A nonparametric statistical method for image segmentation using information theory and curve evolution.

    PubMed

    Kim, Junmo; Fisher, John W; Yezzi, Anthony; Cetin, Müjdat; Willsky, Alan S

    2005-10-01

    In this paper, we present a new information-theoretic approach to image segmentation. We cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities, subject to a constraint on the total length of the region boundaries. We assume that the probability densities associated with the image pixel intensities within each region are completely unknown a priori, and we formulate the problem based on nonparametric density estimates. Due to the nonparametric structure, our method does not require the image regions to have a particular type of probability distribution and does not require the extraction and use of a particular statistic. We solve the information-theoretic optimization problem by deriving the associated gradient flows and applying curve evolution techniques. We use level-set methods to implement the resulting evolution. The experimental results based on both synthetic and real images demonstrate that the proposed technique can solve a variety of challenging image segmentation problems. Furthermore, our method, which does not require any training, performs as well as methods based on training.
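The objective being maximized, the mutual information between region labels and pixel intensities, can be sketched with a simple histogram estimator (the paper itself uses nonparametric kernel density estimates and curve evolution; `mutual_information` below is a hypothetical simplification for a binary labeling):

```python
import numpy as np

def mutual_information(intensities, labels, bins=16):
    """Histogram estimate of the mutual information I(labels; intensities), in nats."""
    joint, _, _ = np.histogram2d(labels, intensities, bins=[2, bins])
    p = joint / joint.sum()
    pl = p.sum(axis=1, keepdims=True)   # marginal over labels
    pi = p.sum(axis=0, keepdims=True)   # marginal over intensity bins
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (pl @ pi)[nz])).sum())
```

A labeling that separates two intensity populations scores close to the maximum ln 2 nats, while a random labeling scores near zero, which is the gradient signal the curve evolution follows.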

  10. Development of a Method to Obtain More Accurate General and Oral Health Related Information Retrospectively.

    PubMed

    A, Golkari; A, Sabokseir; D, Blane; A, Sheiham; Rg, Watt

    2017-06-01

    Early childhood is a crucial period of life as it affects one's future health. However, precise data on adverse events during this period is usually hard to access or collect, especially in developing countries. This paper first reviews the existing methods for retrospective data collection in health and social sciences, and then introduces a new method/tool for obtaining more accurate general and oral health related information from early childhood retrospectively. The Early Childhood Events Life-Grid (ECEL) was developed to collect information on the type and time of health-related adverse events during the early years of life, by questioning the parents. The validity of ECEL and the accuracy of information obtained by this method were assessed in a pilot study and in a main study of 30 parents of 8 to 11 year old children from Shiraz (Iran). Responses obtained from parents using the final ECEL were compared with the recorded health insurance documents. There was an almost perfect agreement between the health insurance and ECEL data sets (Kappa value=0.95 and p < 0.001). Interviewees remembered the important events more accurately (100% exact timing match in case of hospitalization). The Early Childhood Events Life-Grid method proved to be highly accurate when compared with recorded medical documents.

  11. Preventing Postpartum Smoking Relapse Among Inner City Women: Development of a Theory-Based and Evidence-Guided Text Messaging Intervention

    PubMed Central

    Wen, Kuang-Yi; Kilby, Linda; Fleisher, Linda; Belton, Tanisha D; Roy, Gem; Hernandez, Enrique

    2014-01-01

    Background Underserved women are at high risk for smoking relapse after childbirth due to their unique socioeconomic and postpartum stressors and barriers. Mobile text messaging technology allows delivery of relapse prevention programs targeted to their personal needs over time. Objective To describe the development of a social-cognitive theory-based and evidence-guided text messaging intervention for preventing postpartum smoking relapse among inner city women. Methods Guided by the cognitive-social health information processing framework, user-centered design, and health communication best practices, the intervention was developed through a systematic process that included needs assessment, followed by an iterative cycling through message drafting, health literacy evaluation and rewriting, review by target community members and a scientific advisory panel, and message revision, concluding with usability testing. Results All message content was theory-grounded, derived by needs assessment analysis and evidence-based materials, reviewed and revised by the target population, health literacy experts, and scientific advisors. The final program, “Txt2Commit,” was developed as a fully automated system, designed to deliver 3 proactive messages per day for a 1-month postpartum smoking relapse intervention, with crave and lapse user-initiated message functions available when needed. Conclusions The developmental process suggests that the application of theory and best practices in the design of text messaging smoking cessation interventions is not only feasible but necessary for ensuring that the interventions are evidence based and user-centered. PMID:24698804

  12. Exploring racial/ethnic differences in substance use: a preliminary theory-based investigation with juvenile justice-involved youth

    PubMed Central

    2011-01-01

    Background Racial/ethnic differences in representation, substance use, and its correlates may be linked to differential long-term health outcomes for justice-involved youth. Determining the nature of these differences is critical to informing more efficacious health prevention and intervention efforts. In this study, we employed a theory-based approach to evaluate the nature of these potential differences. Specifically, we hypothesized that (1) racial/ethnic minority youth would be comparatively overrepresented in the juvenile justice system, (2) the rates of substance use would differ across racial/ethnic groups, and (3) individual-level risk factors would be better predictors of substance use for Caucasian youth than for youth of other racial/ethnic groups. Methods To evaluate these hypotheses, we recruited a large, diverse sample of justice-involved youth in the southwest (N = 651; M age = 15.7, SD = 1.05, range = 14-18 years; 66% male; 41% Hispanic, 24% African American, 15% Caucasian, 11% American Indian/Alaska Native). All youth were queried about their substance use behavior (alcohol, marijuana, tobacco, illicit hard drug use) and individual-level risk factors (school involvement, employment, self-esteem, level of externalizing behaviors). Results As predicted, racial/ethnic minority youth were significantly overrepresented in the juvenile justice system. Additionally, Caucasian youth reported the greatest rates of substance use and substance-related individual-level risk factors. In contrast, African American youth showed the lowest rates of substance use and individual risk factors. Contrary to predictions, a racial/ethnic group by risk factor interaction emerged for only one risk factor and one substance use category. 
Conclusions This research highlights the importance of more closely examining racial/ethnic differences in justice populations, as there are likely to be differing health needs, and subsequent treatment approaches, by racial/ethnic group

  13. An organizational model to distinguish between and integrate research and evaluation activities in a theory based evaluation.

    PubMed

    Sample McMeeking, Laura B; Basile, Carole; Brian Cobb, R

    2012-11-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its well-implemented use, probably due to the time and money required for planning and confusion over the definitions of research and evaluation functions and roles. In this paper, we describe the development of a theory-based evaluation design in a Math and Science Partnership (MSP) research project funded by the U.S. National Science Foundation (NSF). Through this work we developed an organizational model distinguishing between and integrating evaluation and research functions, explicating personnel roles and responsibilities, and highlighting connections between research and evaluation work. Although the research and evaluation components operated with independent budgeting, staffing, and implementation activities, we were able to combine datasets across activities, allowing us to assess the integrity of the program theory, not just the hypothesized connections within it. This model has since been used for proposal development and has been invaluable, as it creates a research and evaluation plan that is seamless from the beginning.

  14. Accuracy of two geocoding methods for geographic information system-based exposure assessment in epidemiological studies.

    PubMed

    Faure, Elodie; Danjou, Aurélie M N; Clavel-Chapelon, Françoise; Boutron-Ruault, Marie-Christine; Dossus, Laure; Fervers, Béatrice

    2017-02-24

    Environmental exposure assessment based on Geographic Information Systems (GIS) and study participants' residential proximity to environmental exposure sources relies on the positional accuracy of subjects' residences to avoid misclassification bias. Our study compared the positional accuracy of two automatic geocoding methods to a manual reference method. We geocoded 4,247 address records representing the residential history (1990-2008) of 1,685 women from the French national E3N cohort living in the Rhône-Alpes region. We compared two automatic geocoding methods, a free online geocoding service (method A) and an in-house geocoder (method B), to a reference layer created by manually relocating addresses from method A (method R). For each automatic geocoding method, positional accuracy levels were compared according to the urban/rural status of addresses and time periods (1990-2000, 2001-2008), using chi-square tests. Kappa statistics were computed to assess the agreement of the positional accuracy of methods A and B with the reference method, overall, by time period and by urban/rural status of addresses. With methods A and B respectively, 81.4% and 84.4% of addresses were geocoded to the exact address (65.1% and 61.4%) or to the street segment (16.3% and 23.0%). In the reference layer, geocoding accuracy was higher in urban areas than in rural areas (74.4% vs. 10.5% of addresses geocoded to the address or interpolated address level, p < 0.0001); no difference was observed according to the period of residence. Compared to the reference method, median positional errors were 0.0 m (IQR = 0.0-37.2 m) and 26.5 m (8.0-134.8 m), with positional errors <100 m for 82.5% and 71.3% of addresses, for methods A and B respectively. Positional agreement of methods A and B with method R was 'substantial' for both methods, with kappa coefficients of 0.60 and 0.61, respectively. Our study demonstrates the feasibility of geocoding
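The kappa statistic this abstract uses to quantify agreement between each automatic method and the reference layer is Cohen's kappa, which can be computed directly (a generic sketch, not tied to the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)            # chance agreement
    return (po - pe) / (1 - pe)
```

Values around 0.60, as reported for both methods, fall in the conventional 'substantial' agreement band (0.61-0.80 by the common Landis-Koch labels, with 0.41-0.60 'moderate').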

  15. Extracting important information from Chinese Operation Notes with natural language processing methods.

    PubMed

    Wang, Hui; Zhang, Weide; Zeng, Qiang; Li, Zuofeng; Feng, Kaiyan; Liu, Lei

    2014-04-01

    Extracting information from unstructured clinical narratives is valuable for many clinical applications. Although natural language processing (NLP) methods have been studied extensively for electronic medical records (EMR), few studies have explored NLP for extracting information from Chinese clinical narratives. In this study, we report the development and evaluation of methods for extracting tumor-related information from operation notes of hepatic carcinomas written in Chinese. Using 86 operation notes manually annotated by physicians as the training set, we explored both rule-based and supervised machine-learning approaches. Evaluated on 29 unseen operation notes, our best approach yielded 69.6% precision, 58.3% recall and a 63.5% F-score. Copyright © 2014 Elsevier Inc. All rights reserved.
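A rule-based approach of the kind this abstract explores can be as simple as regular expressions over the note text. The pattern and example below are hypothetical (the paper's actual rules and annotation schema are not public in this abstract); they only illustrate extracting a tumor-size mention from a Chinese sentence:

```python
import re

# hypothetical rule: capture "A×Bcm"-style tumor-size mentions
SIZE_RE = re.compile(r'(\d+(?:\.\d+)?)\s*[×x*]\s*(\d+(?:\.\d+)?)\s*cm')

def extract_tumor_size(note):
    """Return the first (length, width) tumor size in cm, or None if absent."""
    m = SIZE_RE.search(note)
    return (float(m.group(1)), float(m.group(2))) if m else None

print(extract_tumor_size("肝右叶见肿瘤一枚，大小约3.5×2.0cm"))  # → (3.5, 2.0)
```

Precision/recall of such rules are then measured against the physician annotations, as in the reported 69.6%/58.3% figures.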

  16. Study on information fusion method by contamination analysis in hydraulic system

    NASA Astrophysics Data System (ADS)

    Zheng, Changsong; Ma, Biao; Shen, Rongwei

    2005-12-01

    Oil monitoring road tests on the power-shift steering transmission (PSST) of an armored tracked vehicle were carried out to control contamination in the hydraulic system and to avoid fatal PSST faults caused by oil contamination. The oil was analyzed with the Portable Oil Diagnosis System (PODS), which reports the count for every particle size. The information is fused by principal component analysis, which develops the idea of dimension reduction from multivariate analysis. The results show that this method can extract the key information from a large volume of data, offering theoretical and experimental support for designers in selecting the filtration ratio and controlling system contamination during product design.
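The dimension-reduction step named in this abstract, principal component analysis over per-size particle counts, can be sketched in a few lines via the SVD (a generic sketch; the study's actual particle-size channels and loadings are not given in the abstract):

```python
import numpy as np

def principal_components(X, k=1):
    """Project samples (rows of X) onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                      # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s ** 2 / (s ** 2).sum()          # variance ratio per component
    return Xc @ Vt[:k].T, explained[:k]
```

When the particle-size channels are strongly correlated, as wear-debris counts typically are, the first one or two components carry most of the variance, which is the "key information" the fusion retains.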

  17. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT.

    PubMed

    Chun, Se Young; Fessler, Jeffrey A; Dewaraja, Yuni K

    2013-09-07

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose–response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation–maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved −2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower
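The idea of steering denoising weights by CT side information, which this abstract evaluates in several variants (NLM CT-B/M/S/H), can be shown in a deliberately tiny form. The sketch below is not any of the paper's methods: it uses single-pixel similarities on the guide image (real NLM compares whole patches, making this closer to a cross/joint bilateral filter), and `h` and `radius` are illustrative parameters:

```python
import numpy as np

def nlm_filter(img, guide, h=0.1, radius=3):
    """Toy side-information-weighted smoothing: weights come from the guide
    image (e.g. CT), so edges present in the guide are preserved in img."""
    H, W = img.shape
    out = np.empty_like(img)
    for i in range(H):
        for j in range(W):
            i0, i1 = max(0, i - radius), min(H, i + radius + 1)
            j0, j1 = max(0, j - radius), min(W, j + radius + 1)
            d = (guide[i0:i1, j0:j1] - guide[i, j]) ** 2   # guide dissimilarity
            w = np.exp(-d / (h * h))                       # similar guide -> weight ~1
            out[i, j] = (w * img[i0:i1, j0:j1]).sum() / w.sum()
    return out
```

Because the weights vanish across guide (CT) boundaries, noise inside each anatomical region is averaged away without blurring the region edges, the behavior behind the reported RMSE reduction at preserved RC.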

  18. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs

  19. The method providing fault-tolerance for information and control systems of the industrial mechatronic objects

    NASA Astrophysics Data System (ADS)

    Melnik, E. V.; Klimenko, A. B.; Korobkin, V. V.

    2017-02-01

    The paper deals with providing fault tolerance in information and control systems. Nowadays, many industrial mechatronic objects operate in hazardous environments where humans are not supposed to be, so the design and development of fault-tolerant information and control systems has become a cornerstone for a large number of industrial mechatronic objects. This paper presents a new complex method for providing fault tolerance in reconfigurable systems. It is based on the principles of performance redundancy and decentralized dispatching. The key term in the presented method is a ‘configuration’, so a model of the configuration-forming problem is also presented, and simulation results are given and discussed briefly.

  20. Patent information retrieval: approaching a method and analysing nanotechnology patent collaborations.

    PubMed

    Ozcan, Sercan; Islam, Nazrul

    2017-01-01

    Many challenges still remain in the processing of explicit technological knowledge documents such as patents. Given the limitations and drawbacks of the existing approaches, this research sets out to develop an improved method for searching patent databases and extracting patent information, to increase the efficiency and reliability of the nanotechnology patent information retrieval process and to empirically analyse patent collaboration. A tech-mining method was applied and the subsequent analysis was performed using Thomson Data Analyser software. The findings show that nations such as Korea and Japan are highly collaborative in sharing technological knowledge across academic and corporate organisations within their national boundaries, and that China presents, in some cases, a great illustration of effective patent collaboration and co-inventorship. This study also analyses key patent strengths by country, organisation and technology.

  1. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA and describe.

  2. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    A framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is given using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
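
    Steps (2) through (5) amount to a chain of lookups from quality needs to criteria to information products. A schematic sketch, with entirely hypothetical tables (the real method derives them from interviews and accepted practice):

```python
# Hypothetical lookup tables standing in for steps (2)-(5).
needs_to_criteria = {
    "high reliability": ["fault tolerance", "testability"],
    "ease of change": ["modularity", "documentation"],
}
criteria_to_products = {
    "fault tolerance": ["FMEA report"],
    "testability": ["unit test plan"],
    "modularity": ["module design spec"],
    "documentation": ["user manual outline"],
}

def tailor(quality_needs):
    """Map a quality-needs profile to the information products that the
    tailored life cycle must produce (steps 3-5 of the method)."""
    criteria = {c for need in quality_needs for c in needs_to_criteria[need]}
    return sorted({p for c in criteria for p in criteria_to_products[c]})

products = tailor(["high reliability", "ease of change"])
```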

  4. NIST method for determining model-independent structural information by X-ray reflectometry

    SciTech Connect

    Windover, D.; Cline, J. P.; Henins, A.; Gil, D. L.; Armstrong, N.; Hung, P. Y.; Song, S. C.; Jammy, R.; Diebold, A.

    2007-09-26

    This work provides a method for determining when X-ray reflectometry (XRR) data provide useful information about structural model parameters. State-of-the-art analysis approaches for XRR data emphasize fitting measured data to a single structural model using fast optimization methods, such as genetic algorithms (GA). Though such optimization may find the best solution for a given model, it does not adequately map the parameter space to provide uncertainty estimates or test structural model validity. We present here two approaches for determining which structural parameters convey accurate information about the physical reality. First, using GA refinement, we repeatedly fit the data to several structural models. By comparing the maximum-likelihood estimates of the parameters in each model, we identify model-independent information. Second, we perform a Markov chain Monte Carlo (MCMC) analysis using the most self-consistent structural model to provide uncertainty estimates for structural parameters. This two-step approach uses fast, optimized refinement to search a range of models to locate structural information, and a more detailed MCMC sampling to estimate parameter uncertainties. Here we present an example of this approach on a ZrN/TiN/Si structure, concentrating on thickness.
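
    The MCMC step can be illustrated with a toy random-walk Metropolis sampler for a single parameter. The "layer thickness" data, Gaussian likelihood, and flat prior below are invented for illustration and do not come from the paper:

```python
import math, random

def metropolis(logpost, x0, step, n, burn=1000, seed=1):
    """Random-walk Metropolis sampler for a one-dimensional posterior."""
    random.seed(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for k in range(n + burn):
        xp = x + random.gauss(0.0, step)          # propose a move
        lpp = logpost(xp)
        if math.log(random.random()) < lpp - lp:  # accept or reject
            x, lp = xp, lpp
        if k >= burn:
            samples.append(x)
    return samples

# Hypothetical layer thickness t (nm): Gaussian likelihood, flat prior.
data = [31.8, 32.4, 32.1, 31.9, 32.2]
sigma = 0.3
logpost = lambda t: -sum((d - t) ** 2 for d in data) / (2 * sigma ** 2)
s = metropolis(logpost, x0=30.0, step=0.2, n=20000)
mean = sum(s) / len(s)                                   # posterior mean
sd = (sum((v - mean) ** 2 for v in s) / len(s)) ** 0.5   # uncertainty estimate
```

    The posterior spread `sd` is the parameter uncertainty that a single best-fit optimization would not report.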

  5. A Method for Evaluating Information Security Governance (ISG) Components in Banking Environment

    NASA Astrophysics Data System (ADS)

    Ula, M.; Ula, M.; Fuadi, W.

    2017-02-01

    As modern banking increasingly relies on the internet and computer technologies to operate businesses and market interactions, threats and security breaches have greatly increased in recent years. Insider and outsider attacks have caused global businesses to lose trillions of dollars a year. Therefore, there is a need for a proper framework to govern information security in the banking system. The aim of this research is to propose and design an enhanced method to evaluate information security governance (ISG) implementation in the banking environment. This research examines and compares elements from commonly used information security governance frameworks, standards and best practices, and considers the strengths and weaknesses of their approaches. An initial framework for governing information security in the banking system was constructed from a document review. The framework was categorized into three levels: the governance level, the managerial level and the technical level. The study further conducted an online survey of banking security professionals to obtain their professional judgment about the most critical ISG components and the importance of each ISG component that should be implemented in the banking environment. Data from the survey were used to construct a mathematical model for ISG evaluation, with component importance data used as weighting coefficients for the related components in the model. The research further develops a method for evaluating ISG implementation in banking based on the mathematical model. The proposed method was tested through a real case study in an Indonesian local bank. The study shows that the proposed method has sufficient coverage of ISG in the banking environment and effectively evaluates ISG implementation.
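
    A weighted-sum evaluation of this kind can be sketched as follows. The level names match the paper's three categories, but the weights and scores are invented placeholders, not the survey-derived coefficients:

```python
# Hypothetical weights; the paper derives its weighting coefficients
# from a survey of banking security professionals.
weights = {"governance": 0.40, "managerial": 0.35, "technical": 0.25}

def isg_score(scores, weights):
    """Weighted-average ISG evaluation (component scores on a 0-5 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[k] * scores[k] for k in weights)

bank = {"governance": 4.0, "managerial": 3.5, "technical": 4.5}
score = isg_score(bank, weights)  # 0.40*4.0 + 0.35*3.5 + 0.25*4.5 = 3.95
```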

  6. Electron crystallography as an informative method for studying the structure of nanoparticles

    SciTech Connect

    Avilov, A. S.; Gubin, S. P.; Zaporozhets, M. A.

    2013-11-15

    The overwhelming majority of modern nanotechnologies deal with nanoparticles owing to the great variety of their unusual properties, which make them irreplaceable in various fields of science and technology. Since the physical properties of nanoparticles depend on their composition, structure, and shape, the problem of monitoring these parameters both after and during formation of nanoparticles is very important. Methods of electron crystallography are most informative and appropriate for studying and monitoring nanoparticle parameters. In this review, we briefly report the main modern methods based on the use of electron diffraction and electron microscopy, along with examples of their applications for nanoparticles, to solve a number of urgent structural problems of nanomaterials science.

  7. A theory-based online health behavior intervention for new university students: study protocol

    PubMed Central

    2013-01-01

    Background Too few young people engage in behaviors that reduce the risk of morbidity and premature mortality, such as eating healthily, being physically active, drinking sensibly and not smoking. The present research developed an online intervention to target these health behaviors during the significant life transition from school to university, when health beliefs and behaviors may be more open to change. This paper describes the intervention and the proposed approach to its evaluation. Methods/design Potential participants (all undergraduates about to enter the University of Sheffield) will be emailed an online questionnaire two weeks before starting university. On completion of the questionnaire, respondents will be randomly assigned to receive either an online health behavior intervention (U@Uni) or a control condition. The intervention employs three behavior change techniques (self-affirmation, theory-based messages, and implementation intentions) to target four health behaviors (alcohol consumption, physical activity, fruit and vegetable intake, and smoking). Subsequently, all participants will be emailed follow-up questionnaires approximately one and six months after starting university. The questionnaires will assess the four targeted behaviors and associated cognitions (e.g., intentions, self-efficacy) as well as socio-demographic variables, health status, Body Mass Index (BMI), health service use and recreational drug use. A sub-sample of participants will provide a sample of hair to assess changes in biochemical markers of health behavior. A health economic evaluation of the cost effectiveness of the intervention will also be conducted. Discussion The findings will provide evidence on the effectiveness of online interventions as well as the potential for intervening during significant life transitions, such as the move from school to university. If successful, the intervention could be employed at other universities to promote healthy behaviors among new

  8. Increasing smoke alarm operability through theory-based health education: a randomised trial

    PubMed Central

    Miller, Ted R; Bergen, Gwen; Ballesteros, Michael F; Bhattacharya, Soma; Gielen, Andrea Carlson; Sheppard, Monique S

    2015-01-01

    Background Although working smoke alarms halve deaths in residential fires, many households do not keep alarms operational. We tested whether theory-based education increases alarm operability. Methods Randomised multiarm trial, with a single arm randomly selected for use each day, in low-income neighbourhoods in Maryland, USA. Intervention arms: (1) Full Education, combining a health belief module with a social-cognitive theory module that provided hands-on practice installing alarm batteries and using the alarm’s hush button; (2) Hands-on Practice, a social-cognitive module supplemented by typical fire department education; (3) Current Norm, receiving typical fire department education only. Four hundred and thirty-six homes were recruited through churches or by knocking on doors in 2005–2008. Follow-up visits checked alarm operability in 370 homes (85%) 1–3.5 years after installation. Main outcome measures: number of homes with working alarms, defined as alarms with working batteries or hard-wired, and number of working alarms per home. Regressions controlled for preintervention alarm status, demographics, and beliefs about fire risks and alarm effectiveness. Results Homes in the Full Education and Practice arms were more likely to have a functioning smoke alarm at follow-up (OR=2.77, 95% CI 1.09 to 7.03) and had an average of 0.32 more working alarms per home (95% CI 0.09 to 0.56). Working alarms per home rose 16%. Full Education and Practice had similar effectiveness (p=0.97 on both outcome measures). Conclusions Without exceeding typical fire department installation time, installers can achieve greater smoke alarm operability. Hands-on practice is key. Two years after installation, for every three homes that received hands-on practice, one had an additional working alarm. Trial registration number http://www.clinicaltrials.gov number NCT00139126. PMID:25165090

  9. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    PubMed

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    Our aim was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  10. Evaluation of optimization methods for nonrigid medical image registration using mutual information and B-splines.

    PubMed

    Klein, Stefan; Staring, Marius; Pluim, Josien P W

    2007-12-01

    A popular technique for nonrigid registration of medical images is based on the maximization of their mutual information, in combination with a deformation field parameterized by cubic B-splines. The coordinate mapping that relates the two images is found using an iterative optimization procedure. This work compares the performance of eight optimization methods: gradient descent (with two different step size selection algorithms), quasi-Newton, nonlinear conjugate gradient, Kiefer-Wolfowitz, simultaneous perturbation, Robbins-Monro, and evolution strategy. Special attention is paid to reducing computation time by using fewer voxels to calculate the cost function and its derivatives. The optimization methods are tested on manually deformed CT images of the heart, on follow-up CT chest scans, and on MR scans of the prostate acquired using a BFFE, T1, and T2 protocol. Registration accuracy is assessed by computing the overlap of segmented edges. Precision and convergence properties are studied by comparing deformation fields. The results show that the Robbins-Monro method is the best choice in most applications. With this approach, the computation time per iteration can be reduced by a factor of approximately 500, without affecting the rate of convergence, by using a small subset of the image, randomly selected in every iteration, to compute the derivative of the mutual information. Of the other methods, the quasi-Newton and nonlinear conjugate gradient methods achieve a slightly higher precision, at the price of larger computation times.
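
    The Robbins-Monro idea of a decaying gain sequence combined with a fresh random subsample per iteration can be sketched on a toy quadratic cost standing in for the (negated) mutual information. The gain-schedule constants below are illustrative, not the paper's settings:

```python
import random

def robbins_monro(data, x0, a=1.0, A=20.0, alpha=0.602, iters=500, m=10, seed=0):
    """Minimise f(x) = mean_i (x - d_i)^2 using a noisy gradient computed
    from a random subset of m samples per iteration, with a decaying
    Robbins-Monro gain a / (A + k + 1)^alpha."""
    random.seed(seed)
    x = x0
    for k in range(iters):
        batch = random.sample(data, m)               # new random subset
        grad = sum(2 * (x - d) for d in batch) / m   # noisy derivative
        x -= a / (A + k + 1) ** alpha * grad
    return x

data = [float(i) for i in range(100)]  # minimiser is the mean, 49.5
xhat = robbins_monro(data, x0=0.0)
```

    The subsampling makes each iteration cheap, while the decaying gain averages out the gradient noise, which is the mechanism behind the 500-fold per-iteration speed-up reported above.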

  11. Closing the digital divide in HIV/AIDS care: development of a theory-based intervention to increase Internet access.

    PubMed

    Kalichman, S C; Weinhardt, L; Benotsch, E; Cherry, C

    2002-08-01

    Advances in information technology are revolutionizing medical patient education and the Internet is becoming a major source of information for people with chronic medical conditions, including HIV/AIDS. However, many AIDS patients do not have equal access to the Internet and are therefore at an information disadvantage, particularly minorities, persons of low-income levels and individuals with limited education. This paper describes the development and pilot testing of a workshop-style intervention designed to close the digital divide in AIDS care. Grounded in the Information-Motivation-Behavioral Skills (IMB) model of health behaviour change, we developed an intervention for persons with no prior history of using the Internet. The intervention included instruction in using hardware and search engines, motivational enhancement to increase interest and perceived relevance of the Internet, and skills for critically evaluating and using health information accessed via the Internet. Participants were also introduced to communication and support functions of the Internet including e-mail, newsgroups and chat groups. Pilot testing demonstrated feasibility, acceptability and promise for closing the digital divide in HIV/AIDS care using a relatively brief and intensive theory-based intervention that could be implemented in community settings.

  12. Genetic algorithm and graph theory based matrix factorization method for online friend recommendation.

    PubMed

    Li, Qu; Yao, Min; Yang, Jianhua; Xu, Ning

    2014-01-01

    Online friend recommendation is a fast-developing topic in web mining. In this paper, we used SVD matrix factorization to model user and item feature vectors and used stochastic gradient descent to update the parameters and improve accuracy. To tackle the cold-start problem and data sparsity, we used a KNN model to influence the user feature vectors. At the same time, we used graph theory to partition communities with fairly low time and space complexity. What is more, matrix factorization can combine online and offline recommendation. Experiments showed that the hybrid recommendation algorithm is able to recommend online friends with good accuracy.
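
    A minimal sketch of the matrix-factorization-with-SGD core (toy data and hyperparameters; the paper's hybrid additionally layers KNN smoothing and graph-based community partitioning on top):

```python
import random

def mf_sgd(ratings, n_users, n_items, k=2, lr=0.02, reg=0.02,
           epochs=500, seed=0):
    """Factorise a sparse rating matrix R ~ P @ Q^T by stochastic
    gradient descent over the observed (user, item, rating) triples."""
    rnd = random.Random(seed)
    P = [[rnd.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rnd.gauss(0, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            e = r - sum(P[u][f] * Q[i][f] for f in range(k))  # prediction error
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (e * qi - reg * pu)
                Q[i][f] += lr * (e * pu - reg * qi)
    return P, Q

# Toy (user, item, interaction-strength) triples.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0), (2, 2, 2.0)]
P, Q = mf_sgd(ratings, n_users=3, n_items=3)
```

    Unobserved entries of the reconstructed P @ Q^T then serve as recommendation scores.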

  13. a Registration Method of Point Clouds Collected by Mobile LIDAR Using Solely Standard Las Files Information

    NASA Astrophysics Data System (ADS)

    Gézero, L.; Antunes, C.

    2017-05-01

    In the last few years, LiDAR sensors installed in terrestrial vehicles have proven to be an efficient way to collect very dense 3D georeferenced information. The possibility of creating very dense point clouds representing the surface surrounding the sensor at a given moment, in a very fast, detailed and easy way, shows the potential of this technology for cartography and digital terrain model production at large scale. However, there are still some limitations associated with the use of this technology. When several acquisitions of the same area are made with the same device, differences between the clouds can be observed. Those differences can range from a few centimetres to several tens of centimetres, mainly in urban and high-vegetation areas where obstruction of the GNSS signal degrades the georeferenced trajectory. In this article, a different point cloud registration method is proposed. In addition to its efficiency and speed of execution, the main advantage of the method is that the adjustment is made continuously along the trajectory, based on GPS time. The process is fully automatic, and only information recorded in standard LAS files is used, without the need for any auxiliary information, in particular regarding the trajectory.

  14. ROI-preserving 3D video compression method utilizing depth information

    NASA Astrophysics Data System (ADS)

    Ti, Chunli; Xu, Guodong; Guan, Yudong; Teng, Yidan

    2015-09-01

    Efficiently transmitting the extra information of three-dimensional (3D) video is becoming a key issue in the development of 3DTV. The 2D-plus-depth format not only occupies less bandwidth and remains compatible with transmission over existing channels, but can also, to some extent, provide technical support for advanced 3D video compression. This paper proposes an ROI-preserving compression scheme to further improve visual quality at a limited bit rate. Exploiting the connection between the focus of the Human Visual System (HVS) and depth information, regions of interest (ROI) can be automatically selected via depth map processing. The main improvement over common methods is that a mean-shift-based segmentation is applied to the depth map before foreground ROI selection, to keep the integrity of the scene. Besides, the sensitive areas along edges are also protected. Spatio-temporal filtering adapted to H.264 is applied to the non-ROI regions of both the 2D video and the depth map before compression. Experiments indicate that the ROI extracted by this method is more intact and more consistent with subjective perception, and that the proposed method keeps the key high-frequency information more effectively while the bit rate is reduced.
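
    A toy stand-in for the depth-based ROI step, using simple quantile thresholding plus edge protection in place of the paper's mean-shift segmentation; all thresholds are illustrative:

```python
import numpy as np

def roi_from_depth(depth, fg_quantile=0.05, edge_margin=1):
    """Mark the nearest fg_quantile of pixels as foreground ROI and also
    protect pixels lying on strong depth edges."""
    thr = np.quantile(depth, fg_quantile)
    roi = depth <= thr                      # nearer objects = smaller depth
    gy, gx = np.gradient(depth.astype(float))
    edges = np.hypot(gx, gy) > 0.1 * np.ptp(depth)   # strong depth edges
    for _ in range(edge_margin):            # grow the edge mask (crude dilation)
        e = edges.copy()
        e[1:, :] |= edges[:-1, :]; e[:-1, :] |= edges[1:, :]
        e[:, 1:] |= edges[:, :-1]; e[:, :-1] |= edges[:, 1:]
        edges = e
    return roi | edges

depth = np.full((10, 10), 10.0)
depth[2:5, 2:5] = 2.0                       # one nearby object
mask = roi_from_depth(depth)
```

    Everything outside `mask` would then receive the stronger spatio-temporal pre-filtering before H.264 encoding.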

  15. Review of methods for handling confounding by cluster and informative cluster size in clustered data

    PubMed Central

    Seaman, Shaun; Pavlou, Menelaos; Copas, Andrew

    2014-01-01

    Clustered data are common in medical research. Typically, one is interested in a regression model for the association between an outcome and covariates. Two complications that can arise when analysing clustered data are informative cluster size (ICS) and confounding by cluster (CBC). ICS and CBC mean that the outcome of a member given its covariates is associated with, respectively, the number of members in the cluster and the covariate values of other members in the cluster. Standard generalised linear mixed models for cluster-specific inference and standard generalised estimating equations for population-average inference assume, in general, the absence of ICS and CBC. Modifications of these approaches have been proposed to account for CBC or ICS. This article is a review of these methods. We express their assumptions in a common format, thus providing greater clarity about the assumptions that methods proposed for handling CBC make about ICS and vice versa, and about when different methods can be used in practice. We report relative efficiencies of methods where available, describe how methods are related, identify a previously unreported equivalence between two key methods, and propose some simple additional methods. Unnecessarily using a method that allows for ICS/CBC has an efficiency cost when ICS and CBC are absent. We review tools for identifying ICS/CBC. A strategy for analysis when CBC and ICS are suspected is demonstrated by examining the association between socio-economic deprivation and preterm neonatal death in Scotland. PMID:25087978

  16. Liminality in cultural transition: applying ID-EA to advance a concept into theory-based practice.

    PubMed

    Baird, Martha B; Reed, Pamela G

    2015-01-01

    As global migration increases worldwide, nursing interventions are needed to address the effects of migration on health. The concept of liminality emerged as a pivotal concept in the situation-specific theory of well-being in refugee women experiencing cultural transition. As a relatively new concept in the discipline of nursing, liminality is explored using a method, called ID-EA, which we developed to advance a theoretical concept for application to nursing practice. Liminality in the context of cultural transition is further developed using the five steps of inquiry of the ID-EA method. The five steps are as follows: (1) inductive inquiry: qualitative research, (2) deductive inquiry: literature review, (3) synthesis of inductive and deductive inquiry, (4) evaluation inquiry, and (5) application-to-practice inquiry. The overall goal of this particular work was to develop situation-specific, theory-based interventions that facilitate cultural transitions for immigrants and refugees.

  17. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  18. Advancing the Development and Application of Theory-Based Evaluation in the Practice of Public Health.

    ERIC Educational Resources Information Center

    Cole, Galen E.

    1999-01-01

    Provides strategies for constructing theories of theory-based evaluation and provides examples in the field of public health. Techniques are designed to systematize and bring objectivity to the process of theory construction. Also introduces a framework of program theory. (SLD)

  19. Lessons Learnt from Employing van Hiele Theory Based Instruction in Senior Secondary School Geometry Classrooms

    ERIC Educational Resources Information Center

    Alex, Jogymol Kalariparambil; Mammen, Kuttickattu John

    2016-01-01

    This paper reports on a part of a study which was conducted to determine the effect of van Hiele theory based instruction in the teaching of geometry to Grade 10 learners. The sample consisted of 359 participants from five conveniently selected schools from Mthatha District in the Eastern Cape Province in South Africa. There were 195 learners in…

  20. Effects of a Theory-Based, Peer-Focused Drug Education Course.

    ERIC Educational Resources Information Center

    Gonzalez, Gerardo M.

    1990-01-01

    Describes innovative, theory-based, peer-focused college drug education academic course and its effect on perceived levels of risk associated with the use of alcohol, marijuana, and cocaine. Evaluation of the effects of the course indicated the significant effect on perceived risk of cocaine, but not alcohol or marijuana. (Author/ABL)

  1. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  2. Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course

    ERIC Educational Resources Information Center

    McGowan, Ian S.

    2016-01-01

    Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…

  3. Theory-Based Evaluation of a Comprehensive Latino Education Initiative: An Interactive Evaluation Approach

    ERIC Educational Resources Information Center

    Nesman, Teresa M.; Batsche, Catherine; Hernandez, Mario

    2007-01-01

    Latino student access to higher education has received significant national attention in recent years. This article describes a theory-based evaluation approach used with ENLACE of Hillsborough, a 5-year project funded by the W.K. Kellogg Foundation for the purpose of increasing Latino student graduation from high school and college. Theory-based…

  4. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  5. Assessing Instructional Reform in San Diego: A Theory-Based Approach

    ERIC Educational Resources Information Center

    O'Day, Jennifer; Quick, Heather E.

    2009-01-01

    This article provides an overview of the approach, methodology, and key findings from a theory-based evaluation of the district-led instructional reform effort in San Diego City Schools, under the leadership of Alan Bersin and Anthony Alvarado, that began in 1998. Beginning with an analysis of the achievement trends in San Diego relative to other…

  6. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  8. A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.

    ERIC Educational Resources Information Center

    Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.

    Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…

  9. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  10. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands lacks reliable empirical data on the development of the birth and population prevalence of Down syndrome. For the UK and Ireland, more historical empirical data are available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  11. Schema Theory-Based Pre-Reading Tasks: A Neglected Essential in the ESL Reading Class.

    ERIC Educational Resources Information Center

    Ajideh, Parviz

    2003-01-01

    Describes a study in which an English-as-a-Second-Language reading instructor worked with a group of intermediate students, focusing on schema theory-based pre-reading activities. Highlights the students' impressions of the strategies covered during the term. (Author/VWL)

  12. Ninter-Networked Interaction: Theory-based Cases in Teaching and Learning.

    ERIC Educational Resources Information Center

    Saarenkunnas, Maarit; Jarvela, Sanna; Hakkinen, Paivi; Kuure, Leena; Taalas, Peppi; Kunelius, Esa

    2000-01-01

    Describes the pedagogical framework of an interdisciplinary, international project entitled NINTER (Networked Interaction: Theory-Based Cases in Teaching and Learning). Discusses a pedagogical model for teacher and staff development programs in a networked environment; distributed cognition; cognitive apprenticeship; challenges for educational…

  13. Assessing Instructional Reform in San Diego: A Theory-Based Approach

    ERIC Educational Resources Information Center

    O'Day, Jennifer; Quick, Heather E.

    2009-01-01

    This article provides an overview of the approach, methodology, and key findings from a theory-based evaluation of the district-led instructional reform effort in San Diego City Schools, under the leadership of Alan Bersin and Anthony Alvarado, that began in 1998. Beginning with an analysis of the achievement trends in San Diego relative to other…

  14. Lessons Learnt from Employing van Hiele Theory Based Instruction in Senior Secondary School Geometry Classrooms

    ERIC Educational Resources Information Center

    Alex, Jogymol Kalariparambil; Mammen, Kuttickattu John

    2016-01-01

    This paper reports on a part of a study which was conducted to determine the effect of van Hiele theory based instruction in the teaching of geometry to Grade 10 learners. The sample consisted of 359 participants from five conveniently selected schools from Mthatha District in the Eastern Cape Province in South Africa. There were 195 learners in…

  15. Aphasic speech with and without SentenceShaper: Two methods for assessing informativeness.

    PubMed

    Fink, Ruth B; Bartlett, Megan R; Lowery, Jennifer S; Linebarger, Marcia C; Schwartz, Myrna F

    2008-01-01

    BACKGROUND: SentenceShaper® (SSR) is a computer program that is for speech what a word-processing program is for written text; it allows the user to record words and phrases, play them back, and manipulate them on-screen to build sentences and narratives. A recent study demonstrated that when listeners rated the informativeness of functional narratives produced by chronic aphasic speakers with and without the program, they gave higher informativeness ratings to the language produced with the aid of the program (Bartlett, Fink, Schwartz, & Linebarger, 2007). Bartlett et al. (2007) also compared unaided (spontaneous) narratives produced before and after the aided version of the narrative was obtained. In a subset of comparisons, the sample created after was judged to be more informative; they called this "topic-specific carryover". AIMS: (1) To determine whether the differences in informativeness that Bartlett et al.'s listeners perceived are also revealed by Correct Information Unit (CIU) analysis (Nicholas & Brookshire, 1993), a well-studied, objective method for measuring informativeness; and (2) to demonstrate the usefulness of CIU analysis for samples of this type. METHODS & PROCEDURES: A modified version of the CIU analysis was applied to the speech samples obtained by Bartlett et al. (2007). They had asked five individuals with chronic aphasia to create functional narratives on two topics, under three conditions: Unaided ("U"), Aided ("SSR"), and Post-SSR Unaided ("Post-U"). Here, these samples were analysed for differences in % CIUs across conditions. Linear associations between listener judgements and CIU measures were evaluated with bivariate correlations and multiple regression analysis. OUTCOMES & RESULTS: (1) The aided effect was confirmed: samples produced with SentenceShaper had higher % CIUs, in most cases exceeding 90%. (2) There was little… CONCLUSIONS: That the percentage of CIUs was higher in SSR-aided samples than in…

  16. The use of qualitative methods to inform Delphi surveys in core outcome set development.

    PubMed

    Keeley, T; Williamson, P; Callery, P; Jones, L L; Mathers, J; Jones, J; Young, B; Calvert, M

    2016-05-04

    Core outcome sets (COS) help to minimise bias in trials and facilitate evidence synthesis. Delphi surveys are increasingly being used as part of a wider process to reach consensus about what outcomes should be included in a COS. Qualitative research can be used to inform the development of Delphi surveys. This is an advance in the field of COS development and one which is potentially valuable; however, little guidance exists for COS developers on how best to use qualitative methods and what the challenges are. This paper aims to provide early guidance on the potential role and contribution of qualitative research in this area. We hope the ideas we present will be challenged, critiqued and built upon by others exploring the role of qualitative research in COS development. This paper draws upon the experiences of using qualitative methods in the pre-Delphi stage of the development of three different COS. Using these studies as examples, we identify some of the ways that qualitative research might contribute to COS development, the challenges in using such methods and areas where future research is required. Qualitative research can help to identify what outcomes are important to stakeholders; facilitate understanding of why some outcomes may be more important than others, determine the scope of outcomes; identify appropriate language for use in the Delphi survey and inform comparisons between stakeholder data and other sources, such as systematic reviews. Developers need to consider a number of methodological points when using qualitative research: specifically, which stakeholders to involve, how to sample participants, which data collection methods are most appropriate, how to consider outcomes with stakeholders and how to analyse these data. A number of areas for future research are identified. Qualitative research has the potential to increase the research community's confidence in COS, although this will be dependent upon using rigorous and appropriate

  17. Optical methods for molecular sensing: Supplementing imaging of tissue microstructure with molecular information

    NASA Astrophysics Data System (ADS)

    Winkler, Amy Marie

    More and more researchers and clinicians are looking to molecular sensing to predict how cells will behave, seeking the answers to questions like "will these tumor cells become malignant?" or "how will these cells respond to chemotherapy?" Optical methods are attractive for answering these questions because optical radiation is safer and less expensive than alternative methods such as CT, which uses X-ray radiation; PET/SPECT, which use gamma radiation; or MRI, which is expensive and only available in a hospital setting. In this dissertation, three distinct optical methods are explored for detection at the molecular level: optical coherence tomography (OCT), laser-induced fluorescence (LIF), and optical polarimetry. OCT has the capability to simultaneously capture anatomical information as well as molecular information using targeted contrast agents such as gold nanoshells. LIF is less useful for capturing anatomical information, but it can achieve significantly better molecular sensitivity with the use of targeted fluorescent dyes. Optical polarimetry has the potential to detect the concentration of helical molecules, such as glucose. All of these methods are noninvasive or minimally invasive. The work is organized into four specific aims. The first is the design and implementation of a fast, high-resolution, endoscopic OCT system to facilitate minimally invasive mouse colon imaging. The second aim is to demonstrate the utility of this system for automatically identifying tumor lesions based on tissue microstructure. The third is to demonstrate the use of contrast agents to detect molecular expression using OCT and LIF. The last aim is to demonstrate a new method based on optical polarimetry for noninvasive glucose sensing.

  18. A Method to Quantify Visual Information Processing in Children Using Eye Tracking.

    PubMed

    Kooiker, Marlou J G; Pel, Johan J M; van der Steen-Kant, Sanny P; van der Steen, Johannes

    2016-07-09

    Visual problems that occur early in life can have a major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four-choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii) construct an individual visual profile for each child.
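    One of the output parameters above, the reaction time to a stimulus, can be sketched as the time of the first gaze sample that lands inside the stimulus quadrant. This is an illustrative reconstruction, not the authors' implementation; the gaze-sample format and quadrant bounds are assumptions.

```python
def reaction_time_to_stimulus(gaze_samples, quadrant):
    """Return the time of the first gaze sample inside the stimulus
    quadrant, or None if the child never looks at it.

    gaze_samples: iterable of (t, x, y) tuples, t in seconds,
                  x and y in normalized screen coordinates (assumed).
    quadrant: (xmin, xmax, ymin, ymax) bounds of the stimulus quadrant.
    """
    xmin, xmax, ymin, ymax = quadrant
    for t, x, y in gaze_samples:
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return t
    return None
```

Fixation accuracy and duration would be computed analogously from the samples that remain inside the quadrant.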

  19. A Method to Quantify Visual Information Processing in Children Using Eye Tracking

    PubMed Central

    Kooiker, Marlou J.G.; Pel, Johan J.M.; van der Steen-Kant, Sanny P.; van der Steen, Johannes

    2016-01-01

    Visual problems that occur early in life can have a major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four-choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii) construct an individual visual profile for each child. PMID:27500922

  20. 30 CFR 48.23 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...; information required; time for approval; method for disapproval; commencement of training; approval of... filed; information required; time for approval; method for disapproval; commencement of training... miners as a normal method of operation by the operator. The operator to be so excepted shall maintain...

  1. Improved method for calculating the respiratory line length in the Concealed Information Test.

    PubMed

    Matsuda, Izumi; Ogawa, Tokihiro

    2011-08-01

    The Concealed Information Test (CIT) assesses an examinee's knowledge about a crime based on response differences between crime-relevant and crime-irrelevant items. One effective measure in the CIT is the respiration line length, which is the average of the moving distances of the respiration curve in a specified time interval after the item onset. However, the moving distance differs between parts of a respiratory cycle. As a result, the calculated respiration line length is biased by how the parts of the respiratory cycles are included in the time interval. To resolve this problem, we propose a weighted average method, which calculates the respiration line length per cycle and weights it with the proportion that the cycle occupies in the time interval. Simulation results indicated that the weighted average method removes the bias of respiration line lengths compared to the original method. The results of experimental CIT data demonstrated that the weighted average method significantly increased the discrimination performance as compared with the original method. The weighted average method is a promising method for assessing respiration changes in response to question items more accurately, which improves the respiration-based discrimination performance of the CIT.
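    The weighted-average computation described above can be sketched numerically. The sketch below assumes a uniformly sampled respiration signal and cycles given as sample-index ranges; function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def weighted_resp_line_length(resp, cycles, start, end):
    """Weighted-average respiration line length over [start, end).

    resp: 1-D array of respiration samples.
    cycles: list of (c0, c1) sample-index ranges, one per respiratory cycle.
    Each cycle's per-sample line length is weighted by the proportion of
    the analysis interval that the cycle occupies, so partial cycles at
    the interval edges no longer bias the result.
    """
    total, weights = 0.0, 0.0
    for c0, c1 in cycles:
        overlap = max(0, min(c1, end) - max(c0, start))
        if overlap == 0:
            continue
        # line length of the whole cycle, normalized per sample
        ll = np.sum(np.abs(np.diff(resp[c0:c1]))) / (c1 - c0)
        w = overlap / (end - start)  # share of the interval in this cycle
        total += ll * w
        weights += w
    return total / weights if weights else 0.0
```

The original method instead sums the moving distance of whatever cycle fragments happen to fall in the interval, which is the bias this weighting removes.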

  2. Data Delivery Method Based on Neighbor Nodes' Information in a Mobile Ad Hoc Network

    PubMed Central

    Hayashi, Takuma; Taenaka, Yuzo; Okuda, Takeshi; Yamaguchi, Suguru

    2014-01-01

    This paper proposes a data delivery method based on neighbor nodes' information to achieve reliable communication in a mobile ad hoc network (MANET). In a MANET, it is difficult to deliver data reliably due to instabilities in network topology and wireless network conditions, which result from node movement. To overcome such unstable communication, opportunistic routing and network coding schemes have lately attracted considerable attention. Although an existing method that employs such schemes, MAC-independent opportunistic routing and encoding (MORE), Chachulski et al. (2007), improves the efficiency of data delivery in an unstable wireless mesh network, it does not address node movement. To efficiently deliver data in a MANET, the method proposed in this paper thus first employs the same opportunistic routing and network coding used in MORE and also uses the location information and transmission probabilities of neighbor nodes to adapt to changeable network topology and wireless network conditions. The simulation experiments showed that the proposed method can achieve efficient data delivery with low network load when the movement speed is relatively slow. PMID:24672371
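    As a rough illustration of how neighbor location and transmission probability might drive forwarder selection (a generic sketch in the spirit of the abstract, not the authors' protocol), one can rank neighbors by expected progress toward the destination, i.e., transmission probability times the distance saved.

```python
import math

def select_forwarders(my_pos, dest_pos, neighbors, k=2):
    """Rank neighbor nodes as forwarding candidates.

    neighbors: dict of node_id -> ((x, y), tx_prob), where tx_prob is the
    estimated probability of a successful transmission to that node.
    Score = tx_prob * geographic progress toward the destination; nodes
    farther from the destination than we are get excluded.
    Returns up to k node_ids, best first.
    """
    d_me = math.dist(my_pos, dest_pos)
    scored = []
    for node, (pos, p) in neighbors.items():
        progress = d_me - math.dist(pos, dest_pos)
        if progress > 0:  # only neighbors closer to the destination
            scored.append((p * progress, node))
    scored.sort(reverse=True)
    return [node for _, node in scored[:k]]
```

A scheme like this favors a reliable link with decent progress over a lossy link with maximal progress, which is the trade-off the abstract alludes to.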

  3. Aircraft target onboard detecting technology via Circular Information Matching method for remote sensing satellite

    NASA Astrophysics Data System (ADS)

    Xiao, Huachao; Zhou, Quan; Li, Li

    2015-10-01

    Onboard image information processing is an important technology for rapidly deriving intelligence on remote sensing satellites. As a typical target, aircraft have been attracting increasing attention for onboard detection. In this paper, we propose an efficient aircraft detection method for onboard processing on remote sensing satellites. Based on how aircraft appear in remote sensing imagery, the detection algorithm consists of two steps. First, Salient Object Detection (SOD) is employed to reduce the amount of calculation on the large remote sensing image. SOD uses Gabor filtering and a simple binary test between pixels in a filtered image; white points are connected into regions, and aircraft candidate regions are screened from the white regions by the area, length, and width of each connected region. Next, a new algorithm, called the Circumferential Information Matching method, is used to detect aircraft in the candidate regions. Tests show that the circumference curve around the plane center is a stable shape, so candidate regions can be accurately classified with this feature. For rotation invariance, we use a circular matched filter to detect the target, and a discrete fast Fourier transform (DFFT) is used to accelerate and reduce the calculation. Experiments show that the detection accuracy of the proposed algorithm is 90% with less than 0.5 s of processing time. In addition, quantitative analysis shows that the computational cost of the proposed method is very small. Experimental results and theoretical analysis show that the proposed method is reasonable and highly efficient.
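    The rotation-invariant matching step, a circular matched filter accelerated by the FFT, can be illustrated with circular cross-correlation of a circumferential intensity profile against a template. This is a generic sketch of the technique, not the authors' code.

```python
import numpy as np

def circular_match(profile, template):
    """Match a circumferential intensity profile against a template,
    invariant to rotation (a rotation of the object only circularly
    shifts the profile).

    Uses the FFT identity corr = IFFT(FFT(profile) * conj(FFT(template)))
    to compute all circular shifts at once in O(N log N).
    Returns (best_shift, peak_correlation).
    """
    f = np.fft.fft(profile)
    g = np.fft.fft(template)
    corr = np.fft.ifft(f * np.conj(g)).real
    best = int(np.argmax(corr))
    return best, corr[best]
```

Because rotation only shifts the profile, the peak value is rotation-invariant and the peak position recovers the rotation angle.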

  4. Sex education and contraceptive methods: knowledge and sources of information among the Estonian population.

    PubMed

    Kalda, R; Sarapuu, H; Pikk, A; Lember, M

    1998-06-01

    A survey on sex education and contraceptive methods was carried out within a monthly EMOR Omnibus Survey. Using a questionnaire, knowledge and attitudes, as well as the main sources of information on contraceptive methods and sex education, among the Estonian adult population (n = 618) were investigated. Of the respondents, 68% were female and 32% male; the mean age was 34 years. Almost all respondents expressed the opinion that sex education should start at school and that education on contraceptive methods would reduce the number of abortions. The majority of the respondents believed that it would be more convenient to visit a family doctor than a gynecologist for family planning. The main sources of information on contraception were literature, doctors, and journals, as rated by females, and literature, partners, and television, as rated by males. The roles of the school nurse, father, and siblings were rated as comparatively small. The respondents' level of knowledge of contraceptive methods was not very high. It is concluded that the prerequisites for changing sexual behavior and knowledge over a short time are wider use of the mass media and better sex education at schools. It is also necessary to prepare family doctors to offer family planning services to their patients.

  5. Quantifying the informativeness for biomedical literature summarization: An itemset mining method.

    PubMed

    Moradi, Milad; Ghadiri, Nasser

    2017-07-01

    Automatic text summarization tools can help users in the biomedical domain to access information efficiently from a large volume of scientific literature and other sources of text documents. In this paper, we propose a summarization method that combines itemset mining and domain knowledge to construct a concept-based model and to extract the main subtopics from an input document. Our summarizer quantifies the informativeness of each sentence using the support values of itemsets appearing in the sentence. To address the concept-level analysis of text, our method initially maps the original document to biomedical concepts using the Unified Medical Language System (UMLS). Then, it discovers the essential subtopics of the text using a data mining technique, namely itemset mining, and constructs the summarization model. The employed itemset mining algorithm extracts a set of frequent itemsets containing correlated and recurrent concepts of the input document. The summarizer selects the most related and informative sentences and generates the final summary. We evaluate the performance of our itemset-based summarizer using the Recall-Oriented Understudy for Gisting Evaluation (ROUGE) metrics, performing a set of experiments. We compare the proposed method with GraphSum, TexLexAn, SweSum, SUMMA, AutoSummarize, the term-based version of the itemset-based summarizer, and two baselines. The results show that the itemset-based summarizer performs better than the compared methods. The itemset-based summarizer achieves the best scores for all the assessed ROUGE metrics (R-1: 0.7583, R-2: 0.3381, R-W-1.2: 0.0934, and R-SU4: 0.3889). We also perform a set of preliminary experiments to specify the best value for the minimum support threshold used in the itemset mining algorithm. 
The results demonstrate that the value of this threshold directly affects the accuracy of the summarization model, such that a significant decrease can be observed in the performance of summarization due to
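    The core scoring idea, weighting each sentence by the support values of the frequent itemsets it contains, can be sketched on toy data. Plain strings stand in for UMLS concepts here, and all names and thresholds are illustrative rather than the paper's.

```python
from itertools import combinations

def score_sentences(sentences, min_support=0.5, max_size=2):
    """Score sentences by the support of frequent concept itemsets.

    sentences: list of sets of concept IDs (stand-ins for UMLS concepts).
    Support of an itemset = fraction of sentences containing it; only
    itemsets with support >= min_support contribute to a sentence's score.
    """
    n = len(sentences)
    support = {}
    for size in range(1, max_size + 1):
        counts = {}
        for s in sentences:
            for combo in combinations(sorted(s), size):
                counts[combo] = counts.get(combo, 0) + 1
        for combo, c in counts.items():
            if c / n >= min_support:
                support[combo] = c / n
    # a sentence's informativeness = sum of supports of the frequent
    # itemsets it contains
    return [sum(v for items, v in support.items() if set(items) <= s)
            for s in sentences]
```

The summarizer would then keep the top-scoring sentences up to the desired summary length.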

  6. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    PubMed Central

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. To address this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects. PMID:23533352

  7. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. To address this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
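    The information-entropy ingredient can be illustrated with the standard entropy-weight calculation (a generic formulation consistent with the abstract, not necessarily the authors' exact one): an index whose values barely differ across schemes carries little discriminating information and therefore receives a small weight.

```python
import math

def entropy_weights(index_matrix):
    """Entropy weights for evaluation indexes.

    index_matrix: rows = candidate schemes, columns = positive,
    already-normalized index values (e.g., cost, progress, quality,
    safety scores). Returns one weight per column, summing to 1.
    """
    m, n = len(index_matrix), len(index_matrix[0])
    divergences = []
    for j in range(n):
        col = [row[j] for row in index_matrix]
        s = sum(col)
        p = [v / s for v in col]
        # Shannon entropy of column j, normalized to [0, 1] by log(m)
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergences.append(1.0 - e)  # degree of divergence of index j
    total = sum(divergences)
    return [d / total for d in divergences]
```

An index that is identical across all schemes gets normalized entropy 1 and hence weight 0, matching the intuition that it cannot help rank the schemes.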

  8. A robust medical image segmentation method using KL distance and local neighborhood information.

    PubMed

    Zheng, Qian; Lu, Zhentai; Yang, Wei; Zhang, Minghui; Feng, Qianjin; Chen, Wufan

    2013-06-01

    In this paper, we propose an improved Chan-Vese (CV) model that uses Kullback-Leibler (KL) distances and local neighborhood information (LNI). Due to the effects of heterogeneity and complex constructions, the performance of level set segmentation is subject to confounding by the presence of nearby structures of similar intensity, preventing it from discerning the exact boundary of the object. Moreover, the CV model cannot usually obtain accurate results in medical image segmentation in cases of optimal configuration of controlling parameters, which requires substantial manual intervention. To overcome the above deficiency, we improve the segmentation accuracy by the usage of KL distance and LNI, thereby introducing the image local characteristics. Performance evaluation of the present method was achieved through experiments on the synthetic images and a series of real medical images. The extensive experimental results showed the superior performance of the proposed method over the state-of-the-art methods, in terms of both robustness and efficiency.
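    As a generic illustration of the KL ingredient (the paper's exact formulation may differ), a symmetrised Kullback-Leibler distance between two normalized intensity histograms can be computed as follows; the eps smoothing is our assumption to handle empty bins.

```python
import math

def symmetric_kl(p, q, eps=1e-12):
    """Symmetrised KL distance between two normalized histograms p and q.

    A small eps keeps empty histogram bins from producing log(0); the
    symmetrised form KL(p||q) + KL(q||p) is a common choice when a
    distance-like quantity is needed.
    """
    def kl(a, b):
        return sum(ai * math.log((ai + eps) / (bi + eps))
                   for ai, bi in zip(a, b))
    return kl(p, q) + kl(q, p)
```

In a segmentation energy, a term like this can penalize assigning a pixel neighborhood to a region whose intensity distribution it does not match.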

  9. Bilateral Teleoperation Method Using an Autonomous Control Based on Information on Contact Environment

    NASA Astrophysics Data System (ADS)

    Taguchi, Keiichi; Ohnishi, Kouhei

    In procedures that involve remote control, such as remote surgery, it is necessary to operate a robot in a remote location in a sensitive environment; the treatment of internal organs is an example of such a procedure. In this paper, we propose a method for autonomous hazard avoidance control that is based on information on the contact environment. The proposed method involves the use of bilateral control. During safe operations, systems are controlled by bilateral control. During dangerous operations, a slave system is controlled autonomously so as to avoid dangerous operations. In order to determine the degree of operation risk, fuzzy set theory is applied to the force exerted on the environment. Further, variable compliance control based on the force exerted on the environment is utilized to avoid the risk. The effectiveness of the proposed method is confirmed by experimental results.
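    The risk-grading step can be sketched with a simple fuzzy membership function on the contact force, feeding a variable compliance gain; the thresholds and gain schedule below are invented for illustration and are not the authors' parameters.

```python
def risk_degree(force, safe=2.0, danger=5.0):
    """Fuzzy membership of 'dangerous operation' from contact force (N).

    0 below the safe threshold, 1 above the danger threshold, with a
    linear ramp between them (one edge of a triangular membership set).
    Thresholds are illustrative assumptions.
    """
    if force <= safe:
        return 0.0
    if force >= danger:
        return 1.0
    return (force - safe) / (danger - safe)

def compliance_gain(force, base=1.0, max_boost=4.0):
    """Variable compliance: the slave arm softens as the risk rises,
    so dangerous forces on the environment are absorbed rather than
    transmitted."""
    return base * (1.0 + max_boost * risk_degree(force))
```

During safe operation (risk 0) the system would stay under ordinary bilateral control; as the risk degree approaches 1, the slave's autonomous compliant behavior takes over.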

  10. Modeling of environmental and genetic interactions with AMBROSIA, an information-theoretic model synthesis method.

    PubMed

    Chanda, P; Zhang, A; Ramanathan, M

    2011-10-01

    To develop a model synthesis method for parsimoniously modeling gene-environmental interactions (GEI) associated with clinical outcomes and phenotypes. The AMBROSIA model synthesis approach utilizes the k-way interaction information (KWII), an information-theoretic metric capable of identifying variable combinations associated with GEI. For model synthesis, AMBROSIA considers relevance of combinations to the phenotype, it precludes entry of combinations with redundant information, and penalizes for unjustifiable complexity; each step is KWII based. The performance and power of AMBROSIA were evaluated with simulations and Genetic Association Workshop 15 (GAW15) data sets of rheumatoid arthritis (RA). AMBROSIA identified parsimonious models in data sets containing multiple interactions with linkage disequilibrium present. For the GAW15 data set containing 9187 single-nucleotide polymorphisms, the parsimonious AMBROSIA model identified nine RA-associated combinations with power >90%. AMBROSIA was compared with multifactor dimensionality reduction across several diverse models and had satisfactory power. Software source code is available from http://www.cse.buffalo.edu/DBGROUP/bioinformatics/resources.html. AMBROSIA is a promising method for GEI model synthesis.
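    The KWII itself has a standard information-theoretic definition: an alternating sum of joint entropies over all non-empty subsets of the chosen variables, which for two variables reduces to mutual information. The sketch below follows that textbook definition and is not the AMBROSIA source code.

```python
from itertools import combinations
from math import log2

def joint_entropy(samples, idxs):
    """Shannon entropy (bits) of the joint distribution of columns idxs,
    estimated from the empirical frequencies in samples (rows)."""
    counts = {}
    for row in samples:
        key = tuple(row[i] for i in idxs)
        counts[key] = counts.get(key, 0) + 1
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def kwii(samples, idxs):
    """K-way interaction information over the variable set S = idxs:

        KWII(S) = -sum over non-empty T subset of S of (-1)^(|S|-|T|) H(T)

    For |S| = 2 this is exactly the mutual information I(X; Y)."""
    k = len(idxs)
    total = 0.0
    for r in range(1, k + 1):
        for sub in combinations(idxs, r):
            total += ((-1) ** (k - r)) * joint_entropy(samples, sub)
    return -total
```

A positive KWII for a set of genetic and environmental variables plus the phenotype indicates information about the phenotype that only the full combination carries, which is what AMBROSIA exploits when admitting combinations into the model.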

  11. Modeling of environmental and genetic interactions with AMBROSIA, an information-theoretic model synthesis method

    PubMed Central

    Chanda, P; Zhang, A; Ramanathan, M

    2011-01-01

    To develop a model synthesis method for parsimoniously modeling gene–environmental interactions (GEI) associated with clinical outcomes and phenotypes. The AMBROSIA model synthesis approach utilizes the k-way interaction information (KWII), an information-theoretic metric capable of identifying variable combinations associated with GEI. For model synthesis, AMBROSIA considers relevance of combinations to the phenotype, it precludes entry of combinations with redundant information, and penalizes for unjustifiable complexity; each step is KWII based. The performance and power of AMBROSIA were evaluated with simulations and Genetic Association Workshop 15 (GAW15) data sets of rheumatoid arthritis (RA). AMBROSIA identified parsimonious models in data sets containing multiple interactions with linkage disequilibrium present. For the GAW15 data set containing 9187 single-nucleotide polymorphisms, the parsimonious AMBROSIA model identified nine RA-associated combinations with power >90%. AMBROSIA was compared with multifactor dimensionality reduction across several diverse models and had satisfactory power. Software source code is available from http://www.cse.buffalo.edu/DBGROUP/bioinformatics/resources.html. AMBROSIA is a promising method for GEI model synthesis. PMID:21427755

  12. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivors' screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  13. Informed consent recall and comprehension in orthodontics: traditional vs improved readability and processability methods.

    PubMed

    Kang, Edith Y; Fields, Henry W; Kiyak, Asuman; Beck, F Michael; Firestone, Allen R

    2009-10-01

    Low general and health literacy in the United States means that informed consent documents are not well understood by most adults. Methods to improve recall and comprehension of informed consent have not been tested in orthodontics. The purposes of this study were to evaluate (1) recall and comprehension among patients and parents using the American Association of Orthodontists' (AAO) informed consent form and new forms incorporating improved readability and processability; (2) the association between reading ability, anxiety, sociodemographic variables, and recall and comprehension; and (3) how the forms affect various domains (treatment, risk, and responsibility) of information. Three treatment groups (30 patient-parent pairs in each) received an orthodontic case presentation and either the AAO form, an improved readability form (MIC), or an improved readability and processability (pairing audio and visual cues) form (MIC + SS). Structured interviews were transcribed and coded to evaluate recall and comprehension. Significant relationships between patient-related variables and recall and comprehension explained little of the variance. The MIC + SS form significantly improved patient recall and parent recall and comprehension. Recall was better than comprehension, and parents performed better than patients. The MIC + SS form significantly improved patient treatment comprehension and risk recall, as well as parent treatment recall and comprehension. Both patients and parents overestimated their understanding of the materials. Improving the readability of the consent materials alone made little difference, but combining improved readability and processability benefited patients' recall and parents' recall and comprehension compared with the AAO form.

  14. Support of Wheelchairs Using Pheromone Information with Two Types of Communication Methods

    NASA Astrophysics Data System (ADS)

    Yamamoto, Koji; Nitta, Katsumi

    In this paper, we propose a communication framework that combines two types of communication among wheelchairs and mobile devices. Because their range of activity is restricted, wheelchair users tend to stay confined to their homes. We previously developed a navigational wheelchair carrying a system that displays information on a web-based map; however, this wheelchair is expensive because it requires a full PC, a precise GPS receiver, a battery, and other hardware. We therefore introduce mobile devices and use this framework to provide information to wheelchair users and to encourage them to go out. When a user encounters other users, they exchange their stored messages over short-distance wireless communication. Once a message reaches a navigational wheelchair, the wheelchair uploads it to the system. We use two types of pheromone information, representing trends in users' movement and the presence of a crowd of users. First, when users gather, a ``crowd of people pheromone'' is emitted virtually. Users do not deposit these pheromones in the environment but carry them; if the density exceeds a threshold, a message stating that ``people gathered'' is generated automatically. The other type is the ``movement trend pheromone'', which is used to improve the probability of successful transmission. From the experimental results, we conclude that our method can deliver information about where wheelchair users gathered to other wheelchairs.
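    The crowd-pheromone mechanism amounts to an accumulate-and-decay counter with a trigger. A minimal sketch under stated assumptions (class name, decay rate, and threshold are all hypothetical, not from the paper):

```python
class PheromoneCarrier:
    # Sketch of a carried "crowd of people" pheromone: each encounter
    # deposits pheromone, the carried amount decays each time step,
    # and crossing a density threshold auto-generates a message.
    def __init__(self, decay=0.9, threshold=2.5):
        self.decay = decay          # fraction retained per time step
        self.threshold = threshold  # density that signals a crowd
        self.density = 0.0

    def step(self, encounters):
        # Call once per time step with the number of users met.
        self.density = self.density * self.decay + encounters
        if self.density > self.threshold:
            return "people gathered"  # auto-generated message
        return None
```

    With these illustrative parameters, three consecutive single encounters (1, then 1.9, then 2.71 units of pheromone) are enough to cross the threshold and emit the message.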

  15. Spatial modelling of periglacial phenomena in Deception Island (Maritime Antarctic): logistic regression and informative value method.

    NASA Astrophysics Data System (ADS)

    Melo, Raquel; Vieira, Gonçalo; Caselli, Alberto; Ramos, Miguel

    2010-05-01

    Field surveying during the austral summer of 2007/08 and the analysis of a QuickBird satellite image resulted in the production of a detailed geomorphological map of the Irizar and Crater Lake area on Deception Island (South Shetlands, Maritime Antarctic - 1:10 000) and allowed the analysis and spatial modelling of the geomorphological phenomena. The present study focuses on the spatial distribution and characteristics of hummocky terrains, lag surfaces and nivation hollows, complemented by GIS spatial modelling aimed at identifying the relevant controlling geographical factors. Models of the susceptibility of occurrence of these phenomena were created using two statistical methods: logistic regression, as a multivariate method; and the informative value, as a bivariate method. Success and prediction rate curves were used for model validation. The Area Under the Curve (AUC) was used to quantify the performance and predictive capability of the models and to allow comparison between the two methods. For the logistic regression method, the AUC showed a success rate of 71% for the lag surfaces, 81% for the hummocky terrains and 78% for the nivation hollows; the prediction rates were 72%, 68% and 71%, respectively. For the informative value method, the success rates were 69% for the lag surfaces, 84% for the hummocky terrains and 78% for the nivation hollows, with corresponding prediction rates of 71%, 66% and 69%. The results were of very good quality and demonstrate both the potential of the models to predict the influence of the independent variables on the occurrence of the geomorphological phenomena and the reliability of the data. Keywords: present-day geomorphological dynamics, detailed geomorphological mapping, GIS, spatial modelling, Deception Island, Antarctic.
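    The bivariate informative value scores each class of a conditioning factor by the log-ratio of its conditional event density to the prior density over the whole study area, and per-cell susceptibility is the sum of the scores of the classes the cell falls in. A minimal sketch (function and variable names are illustrative):

```python
from math import log

def informative_value(class_cells, class_events, total_cells, total_events):
    # Informative value of one class of a conditioning factor:
    # ln( (events in class / cells in class) / (total events / total cells) ).
    # Positive values mark classes where the phenomenon is
    # over-represented relative to the whole study area.
    prior = total_events / total_cells
    conditional = class_events / class_cells
    return log(conditional / prior)

def susceptibility(cell_class_ivs):
    # Per-cell susceptibility: sum of the informative values of the
    # classes the cell belongs to, one value per conditioning factor.
    return sum(cell_class_ivs)
```

    For example, a class holding 10 of the 50 mapped occurrences in 100 of 1000 cells has twice the prior density, so its informative value is ln 2; a class matching the prior density scores exactly zero.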

  16. A Scalable Bayesian Method for Integrating Functional Information in Genome-wide Association Studies.

    PubMed

    Yang, Jingjing; Fritsche, Lars G; Zhou, Xiang; Abecasis, Gonçalo

    2017-09-07

    Genome-wide association studies (GWASs) have identified many loci associated with complex traits. However, most loci reside in noncoding regions and have unknown biological functions. Integrative analysis that incorporates known functional information into GWASs can help elucidate the underlying biological mechanisms and prioritize important functional variants. Hence, we develop a flexible Bayesian variable selection model with efficient computational techniques for such integrative analysis. Unlike previous approaches, our method models the effect-size distribution and probability of causality for variants with different annotations and jointly models genome-wide variants to account for linkage disequilibrium (LD), thus prioritizing associations based on the quantification of the annotations and allowing for multiple associated variants per locus. Our method dramatically improves both computational speed and posterior sampling convergence by taking advantage of the block-wise LD structure of human genomes. In simulations, our method accurately quantifies the functional enrichment and is more powerful for prioritizing true associations than alternative methods; the power gain is especially apparent when multiple associated variants in LD reside in the same locus. We applied our method to an in-depth GWAS of age-related macular degeneration with 33,976 individuals and 9,857,286 variants. We find the strongest enrichment for causality among non-synonymous variants (54× more likely to be causal, 1.4× larger effect sizes) and variants in transcription, repressed Polycomb, and enhancer regions, and identify five additional candidate loci beyond the 32 known AMD risk loci. In conclusion, our method efficiently integrates functional information into GWASs, helping to identify functionally associated variants and the underlying biology. Published by Elsevier Inc.

  17. Fast and robust brain tumor segmentation using level set method with multiple image information.

    PubMed

    Lok, Ka Hei; Shi, Lin; Zhu, Xianlun; Wang, Defeng

    2017-01-01

    Brain tumor segmentation is a challenging task because of variation in intensity, caused by the inhomogeneous content of tumor tissue and the choice of imaging modality. In 2010, Zhang developed the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) model, which combined the merits of edge-based and region-based segmentation. Our aims were to improve the SBGFRLS method by modifying the signed pressure force (SPF) term with multiple image information and to demonstrate the effectiveness of the proposed method on clinical images. In the original SBGFRLS model, the direction of contour evolution depends mainly on the SPF. By introducing a directional term into the SPF, the metric can control the evolution direction. The SPF is altered by statistics of the regions enclosed by the contour, and this concept can be extended to jointly incorporate multiple image information. The new SPF term is expected to address the blurred-edge problem in brain tumor segmentation. The proposed method was validated on clinical images, including pre- and post-contrast magnetic resonance images. Accuracy and robustness were compared using sensitivity, specificity, the DICE similarity coefficient, and the Jaccard similarity index. Experimental results show improvement, in particular an increase in sensitivity at the same specificity, in segmenting all types of tumors except diffuse tumors. The novel brain tumor segmentation method is clinically oriented, with a fast, robust, and accurate implementation and minimal user interaction. The method effectively segmented homogeneously enhanced, non-enhanced, heterogeneously enhanced, and ring-enhanced tumors in MR images. Although the method is limited in identifying edema and diffuse tumors, several possible solutions are suggested to turn the curve evolution into a fully functional clinical diagnosis tool.
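    In the SBGFRLS formulation the SPF is built from the mean intensities inside and outside the current contour and normalised to [-1, 1]. A minimal NumPy sketch of the single-image case (extending it to multiple images would combine such statistics from each image, as the abstract proposes):

```python
import numpy as np

def spf(image, inside_mask):
    # Signed pressure force of the SBGFRLS model:
    # spf(x) = (I(x) - (c1 + c2)/2) / max|I(x) - (c1 + c2)/2|,
    # where c1 and c2 are the mean intensities inside and outside
    # the current contour. The sign drives the contour to expand in
    # regions brighter than the midpoint and shrink elsewhere.
    c1 = image[inside_mask].mean()    # mean intensity inside
    c2 = image[~inside_mask].mean()   # mean intensity outside
    force = image - (c1 + c2) / 2.0
    return force / (np.abs(force).max() + 1e-12)  # normalise to [-1, 1]
```

    A directional or multi-image variant, as proposed, would modify `force` before normalisation; the normalisation step is what keeps the curve evolution numerically stable.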

  18. A novel concealed information test method based on independent component analysis and support vector machine.

    PubMed

    Gao, Junfeng; Lu, Liang; Yang, Yong; Yu, Gang; Na, Liantao; Rao, NiNi

    2012-01-01

    The concealed information test (CIT) has drawn much attention and has been widely investigated in recent years. In this study, a novel CIT method based on denoised P3 and machine learning was proposed to improve the accuracy of lie detection. Thirty participants were assigned as guilty and innocent participants and performed paradigms with 3 types of stimuli. The electroencephalogram (EEG) signals were recorded and separated into single trials. To enhance the signal-to-noise ratio (SNR) of the P3 components, independent component analysis (ICA) was adopted to separate non-P3 components (i.e., artifacts) from every single trial, and a new method based on a topography template was proposed to automatically identify the P3 independent components (ICs). The P3 waveforms with high SNR were then reconstructed at the Pz electrode. Second, 3 groups of features based on time, frequency, and wavelets were extracted from the reconstructed P3 waveforms. Finally, 2 classes of feature samples were used to train a support vector machine (SVM) classifier, chosen for its higher performance compared with several other classifiers. The optimal number of P3 ICs and the other parameter values of the classifiers were determined by cross-validation. The presented method achieved a balanced test accuracy of 84.29% in detecting P3 components for the guilty and innocent participants, improving the efficiency of the CIT in comparison with previously reported methods.
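    The topography-template idea can be sketched as correlating each IC's scalp map (a column of the ICA mixing matrix) with a P3 topography template and keeping the best-matching components. A minimal NumPy sketch (function name, threshold, and array shapes are illustrative, not from the paper):

```python
import numpy as np

def pick_p3_components(mixing, template, threshold=0.8):
    # Identify P3 independent components: correlate each IC's scalp
    # topography (a column of the n_channels x n_components ICA
    # mixing matrix) with a P3 topography template and keep ICs
    # whose |Pearson correlation| meets the threshold.
    picked = []
    t = (template - template.mean()) / template.std()
    for j in range(mixing.shape[1]):
        col = mixing[:, j]
        c = (col - col.mean()) / (col.std() + 1e-12)
        r = float((t * c).mean())  # Pearson correlation of z-scores
        if abs(r) >= threshold:
            picked.append(j)
    return picked
```

    The selected columns would then be used to reconstruct the artifact-free P3 waveform at Pz by back-projecting only those ICs.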

  19. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions, and one of the most efficient methods for this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), a BFA benchmark. The performance of the methods is evaluated by means of information gain. Studying the results obtained in solving BPs of different levels of complexity has allowed us to reveal the strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method in solving the BP in many cases. The efficacy of LANNIA is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the R52 text data set (from Reuters-21578), typically used for label categorization.
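    Information gain here plays the usual Shannon role: how much knowing a recovered factor's on/off state reduces uncertainty about the data. A generic sketch of that quantity (not the paper's exact evaluation protocol):

```python
from collections import Counter
from math import log2

def entropy(values):
    # Shannon entropy (in bits) of a sequence of hashable values.
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def information_gain(target, factor):
    # IG(target; factor) = H(target) - H(target | factor):
    # the reduction in uncertainty about `target` once the Boolean
    # factor's state is known. Zero means the factor is uninformative.
    n = len(target)
    cond = 0.0
    for level in set(factor):
        subset = [t for t, f in zip(target, factor) if f == level]
        cond += len(subset) / n * entropy(subset)
    return entropy(target) - cond
```

    A factor that perfectly predicts a balanced binary target yields a gain of 1 bit; an independent factor yields 0, which is why the measure can rank competing BFA solutions.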

  20. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We examined the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four aspects are analyzed: the social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed to examine the feasibility of co-design along these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for carrying it out.