Science.gov

Sample records for information theory-based methods

  1. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

    2016-06-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
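
    As a rough illustration of the selection principle described above, a greedy subset search can score candidate cross sections by the joint entropy of their discretized observations, so that a redundant location adds nothing to the score. This is a hedged sketch, not the authors' procedure; the station names and observations are invented:

```python
from collections import Counter
from math import log2

def entropy(symbols):
    """Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(symbols)
    return -sum(c / n * log2(c / n) for c in Counter(symbols).values())

def greedy_select(series, k):
    """Greedily pick k station ids whose joint entropy is maximal.

    `series` maps station id -> discretized observation sequence.
    Maximizing joint entropy favors informative, non-redundant stations.
    """
    chosen = []
    while len(chosen) < k:
        best = max((s for s in series if s not in chosen),
                   key=lambda s: entropy(list(zip(*(series[i] for i in chosen + [s])))))
        chosen.append(best)
    return chosen

# Station B duplicates A, so once one of them is picked the greedy
# step prefers the independent station C.
obs = {
    "A": [0, 0, 1, 1, 0, 1, 0, 1],
    "B": [0, 0, 1, 1, 0, 1, 0, 1],   # redundant copy of A
    "C": [0, 1, 0, 1, 1, 0, 1, 0],   # independent signal
}
print(greedy_select(obs, 2))   # → ['A', 'C']
```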

  2. Evaluating hydrological model performance using information theory-based metrics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  3. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks such as dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
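
    Two of the supervised measures named in this abstract, information gain and symmetrical uncertainty, are standard and can be sketched together with the equal-interval discretization the abstract mentions. This is an independent illustration with invented data, not IMMAN code:

```python
from collections import Counter
from math import log2

def H(xs):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def discretize(values, bins=3):
    """Equal-interval discretization of a continuous feature."""
    lo, hi = min(values), max(values)
    w = (hi - lo) / bins or 1.0
    return [min(int((v - lo) / w), bins - 1) for v in values]

def info_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y | X)."""
    n = len(labels)
    hy_given_x = 0.0
    for x, cx in Counter(feature).items():
        ys = [y for f, y in zip(feature, labels) if f == x]
        hy_given_x += cx / n * H(ys)
    return H(labels) - hy_given_x

def symmetrical_uncertainty(feature, labels):
    """SU = 2 * IG / (H(X) + H(Y)), normalized to [0, 1]."""
    return 2 * info_gain(feature, labels) / (H(feature) + H(labels))

feat = [1.0, 1.1, 5.0, 5.2]   # invented continuous feature
lab = [0, 0, 1, 1]
print(symmetrical_uncertainty(discretize(feat, 2), lab))   # → 1.0 (perfect predictor)
```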

  4. Correlation theory-based signal processing method for CMF signals

    NASA Astrophysics Data System (ADS)

    Shen, Yan-lin; Tu, Ya-qing

    2016-06-01

    Signal processing precision of Coriolis mass flowmeter (CMF) signals directly affects the measurement accuracy of Coriolis mass flowmeters. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, which comprises a correlation theory-based frequency estimation method and a phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral period sampling signals on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation and has better estimation performance than the adaptive notch filter, discrete Fourier transform and autocorrelation methods in terms of frequency estimation and the data extension-based correlation, Hilbert transform, quadrature delay estimator and discrete Fourier transform methods in terms of phase difference estimation, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
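
    Correlation-based phase difference estimation can be illustrated for two noise-free sinusoids: at zero lag, the normalized cross-correlation of two equal-frequency signals equals the cosine of their phase difference. A minimal sketch with invented sample rate, frequency, and amplitudes; sampling a whole number of periods avoids exactly the bias the abstract discusses:

```python
from math import sin, pi, acos, sqrt

def phase_difference(x, y):
    """Estimate the phase difference of two equal-frequency sinusoids
    from zero-lag correlations: cos(phi) ~ Rxy / sqrt(Rxx * Ryy)."""
    rxx = sum(a * a for a in x)
    ryy = sum(b * b for b in y)
    rxy = sum(a * b for a, b in zip(x, y))
    return acos(max(-1.0, min(1.0, rxy / sqrt(rxx * ryy))))

fs, f, phi = 5000.0, 100.0, 0.6   # sample rate (Hz), tone (Hz), true phase (rad)
n = int(fs / f) * 40              # an integral number of periods
x = [sin(2 * pi * f * t / fs) for t in range(n)]
y = [0.7 * sin(2 * pi * f * t / fs + phi) for t in range(n)]
print(round(phase_difference(x, y), 3))   # → 0.6
```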

  5. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.

  6. Trends in information theory-based chemical structure codification.

    PubMed

    Barigye, Stephen J; Marrero-Ponce, Yovani; Pérez-Giménez, Facundo; Bonchev, Danail

    2014-08-01

    This report offers a chronological review of the most relevant applications of information theory in the codification of chemical structure information, through the so-called information indices. Basically, these are derived from the analysis of the statistical patterns of molecular structure representations, which include primitive global chemical formulae, chemical graphs, or matrix representations. Finally, new approaches that attempt to go "back to the roots" of information theory, in order to integrate other information-theoretic measures in chemical structure coding are discussed.

  7. Kinetic theory based new upwind methods for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, S. M.

    1986-01-01

    Two new upwind methods, called the Kinetic Numerical Method (KNM) and the Kinetic Flux Vector Splitting (KFVS) method, for the solution of the Euler equations are presented. Both of these methods can be regarded as suitable moments of an upwind scheme for the solution of the Boltzmann equation, provided the distribution function is Maxwellian. This moment-method strategy leads to a unification of the Riemann approach and the pseudo-particle approach used earlier in the development of upwind methods for the Euler equations. A very important aspect of the moment-method strategy is that the new upwind methods satisfy the entropy condition because of the Boltzmann H-Theorem, and it suggests a possible way of extending the Total Variation Diminishing (TVD) principle within the framework of the H-Theorem. The ability of these methods to obtain accurate, wiggle-free solutions is demonstrated by applying them to two test problems.

  8. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A.; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  9. Density functional theory based generalized effective fragment potential method

    NASA Astrophysics Data System (ADS)

    Nguyen, Kiet A.; Pachter, Ruth; Day, Paul N.

    2014-06-01

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  10. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  11. An information theory based search for homogeneity on the largest accessible scale

    NASA Astrophysics Data System (ADS)

    Sarkar, Suman; Pandey, Biswajit

    2016-11-01

    We analyze the SDSS DR12 quasar catalogue to test the large-scale smoothness of the quasar distribution. We quantify the degree of inhomogeneity in the quasar distribution using information theory based measures and find that the degree of inhomogeneity diminishes with increasing length scale, finally reaching a plateau at $\sim 250\, h^{-1}\,\mathrm{Mpc}$. The residual inhomogeneity at the plateau is consistent with that expected for a Poisson point process. Our results indicate that the quasar distribution is homogeneous beyond length scales of $250\, h^{-1}\,\mathrm{Mpc}$.
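
    The abstract does not spell out its measures; a toy counts-in-cells sketch shows how Shannon entropy can quantify the degree of inhomogeneity of a point distribution (one-dimensional positions on [0, 1) here, purely illustrative):

```python
from collections import Counter
from math import log

def inhomogeneity(points, cells):
    """Toy counts-in-cells measure on positions in [0, 1): 1 - H/H_max,
    ~0 for uniform occupancy and 1 when all points share one cell."""
    counts = Counter(min(int(p * cells), cells - 1) for p in points)
    n = len(points)
    h = -sum(c / n * log(c / n) for c in counts.values())
    return 1 - h / log(cells)

uniform = [(i + 0.5) / 100 for i in range(100)]   # 10 points per cell
clustered = [0.001 * i for i in range(100)]       # all mass in the first cell
print(inhomogeneity(uniform, 10), inhomogeneity(clustered, 10))
```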

  12. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  13. Information-theory-based band selection and utility evaluation for reflective spectral systems

    NASA Astrophysics Data System (ADS)

    Shen, Sylvia S.; Bassett, Edward M., III

    2002-08-01

    We have developed a methodology for wavelength band selection. This methodology can be used in system design studies to provide an optimal sensor cost, data reduction, and data utility trade-off relative to a specific application. The methodology combines an information theory-based criterion for band selection with a genetic algorithm to search for a near-optimal solution. We have applied this methodology to 612 material spectra from a combined database to determine the band locations for 6, 9, 15, 30, and 60-band sets in the 0.42 to 2.5 micron spectral region that permit the best material separation. These optimal band sets were then evaluated in terms of their utility for anomaly detection and material identification, using multi-band data cubes generated from two HYDICE cubes. The optimal band locations and their corresponding entropies are given in this paper. Our optimal band locations for the 6, 9, and 15-band sets are compared to the bands of existing multi-band systems such as Landsat 7, Multispectral Thermal Imager, Advanced Land Imager, Daedalus, and M7. Also presented are the anomaly detection and material identification results obtained from our generated multi-band data cubes. Comparisons are made between these exploitation results and those obtained from the original 210-band HYDICE data cubes.

  14. An efficient graph theory based method to identify every minimal reaction set in a metabolic network

    PubMed Central

    2014-01-01

    Background Development of cells with minimal metabolic functionality is gaining importance due to their efficiency in producing chemicals and fuels. Existing computational methods to identify minimal reaction sets in metabolic networks are computationally expensive. Further, they identify only one of the several possible minimal reaction sets. Results In this paper, we propose an efficient graph theory based recursive optimization approach to identify all minimal reaction sets. Graph theoretical insights offer systematic methods to not only reduce the number of variables in math programming and increase its computational efficiency, but also provide efficient ways to find multiple optimal solutions. The efficacy of the proposed approach is demonstrated using case studies from Escherichia coli and Saccharomyces cerevisiae. In case study 1, the proposed method identified three minimal reaction sets, each containing 38 reactions, in the Escherichia coli central metabolic network with 77 reactions. Analysis of these three minimal reaction sets revealed that one of them is more suitable for developing a minimal-metabolism cell than the other two, due to a practically achievable internal flux distribution. In case study 2, the proposed method identified 256 minimal reaction sets from the Saccharomyces cerevisiae genome-scale metabolic network with 620 reactions. The proposed method required only 4.5 hours to identify all 256 minimal reaction sets and showed a significant reduction (approximately 80%) in solution time compared to existing methods for finding minimal reaction sets. Conclusions Identification of all minimal reaction sets in metabolic networks is essential, since different minimal reaction sets have different properties that affect bioprocess development. The proposed method correctly identified all minimal reaction sets in both case studies.
    The proposed method is computationally efficient compared to other methods for finding minimal
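
    As a toy illustration of the idea (not the authors' recursive optimization), in a small network of substrate-to-product reactions every simple path sustaining a target conversion is one minimal reaction set, and there can be several; the metabolite names below are invented:

```python
def minimal_reaction_sets(edges, source, target):
    """Toy version: each reaction is a (substrate, product) edge, and every
    simple path from `source` to `target` is a minimal reaction set
    sustaining that conversion. Depth-first enumeration of all of them."""
    paths, stack = [], [(source, [])]
    while stack:
        node, used = stack.pop()
        if node == target:
            paths.append(used)
            continue
        for e in edges:
            if e[0] == node and e not in used:
                stack.append((e[1], used + [e]))
    return paths

edges = [("glc", "g6p"), ("g6p", "pyr"), ("g6p", "6pg"), ("6pg", "pyr")]
print(len(minimal_reaction_sets(edges, "glc", "pyr")))   # → 2 alternative sets
```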

  15. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    NASA Astrophysics Data System (ADS)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparisons via alignment are widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences of non-uniform length and with significant shuffling of identical substrings. An alternative approach is one based on information theory distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each one has a different origin and expression. Our comparison recasts the different concepts in a single formalism, so that a phylogenetic tree could be constructed for each of them. The trees produced via these metrics are compared to the ones widely accepted as biologically validated. In general, the results provide further evidence of the reliability of the alignment-free distance models. We also observe that one of the metrics appears to be more robust than the other three. We believe that this result can be the object of further research. Many of the experimental results, graphics, and tables are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO
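
    One well-known member of this family of information-theoretic, alignment-free distances (not necessarily among the four compared in the paper) is the normalized compression distance, which approximates the uncomputable normalized information distance with a real compressor:

```python
import zlib

def ncd(x, y):
    """Normalized compression distance between two byte strings:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), with C = compressed size."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"ACGTACGTACGT" * 50
b = b"ACGTACGTACGT" * 50          # same repeat structure as a
c = b"TTGGCCAATTGGCCAA" * 40      # different repeat structure
print(ncd(a, b), ncd(a, c))      # similar strings compress well together
```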

  16. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. The objectives of this work were (a) to characterize the spatial and temporal patterns of streamflow using information the...

  17. An information theory based framework for the measurement of population health.

    PubMed

    Nesson, Erik T; Robinson, Joshua J

    2015-04-01

    This paper proposes a new framework for the measurement of population health and the ranking of the health of different geographies. Since population health is a latent variable, studies which measure and rank the health of different geographies must aggregate observable health attributes into one summary measure. We show that the methods used in nearly all the literature to date implicitly assume that all attributes are infinitely substitutable. Our method, based on the measurement of multidimensional welfare and inequality, minimizes the entropic distance between the summary measure of population health and the distribution of the underlying attributes. This summary function coincides with the constant elasticity of substitution and Cobb-Douglas production functions and naturally allows different assumptions regarding attribute substitutability or complementarity. To compare methodologies, we examine a well-known ranking of the population health of U.S. states, America's Health Rankings. We find that states' rankings are somewhat sensitive to changes in the weight given to each attribute, but very sensitive to changes in aggregation methodology. Our results have broad implications for well-known health rankings such as the 2000 World Health Report, as well as other measurements of population and individual health levels and the measurement and decomposition of health inequality.
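
    The CES/Cobb-Douglas aggregation the abstract refers to can be sketched as a weighted generalized mean over normalized health attributes; the attribute values and weights below are invented:

```python
def aggregate(attributes, weights, beta):
    """Generalized-mean (CES) aggregator: beta = 1 is the usual weighted
    arithmetic mean (perfect substitutes), beta -> 0 approaches the
    Cobb-Douglas form, and beta < 1 treats attributes as complements."""
    if beta == 0:
        out = 1.0
        for a, w in zip(attributes, weights):
            out *= a ** w
        return out
    return sum(w * a ** beta for a, w in zip(attributes, weights)) ** (1 / beta)

x = [0.9, 0.3]   # two normalized health attributes
w = [0.5, 0.5]
print(round(aggregate(x, w, 1), 3))   # linear index
print(round(aggregate(x, w, 0), 3))   # Cobb-Douglas index
```

Because the Cobb-Douglas index penalizes a weak attribute more than the linear index does, rankings built on the two aggregators can disagree, which is the sensitivity the paper reports.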

  18. Novel information theory-based measures for quantifying incongruence among phylogenetic trees.

    PubMed

    Salichos, Leonidas; Stamatakis, Alexandros; Rokas, Antonis

    2014-05-01

    Phylogenies inferred from different data matrices often conflict with each other, necessitating the development of measures that quantify this incongruence. Here, we introduce novel measures that use information theory to quantify the degree of conflict or incongruence among all nontrivial bipartitions present in a set of trees. The first measure, internode certainty (IC), calculates the degree of certainty for a given internode by considering the frequency of the bipartition defined by the internode (internal branch) in a given set of trees jointly with that of the most prevalent conflicting bipartition in the same tree set. The second measure, IC All (ICA), calculates the degree of certainty for a given internode by considering the frequency of the bipartition defined by the internode in a given set of trees in conjunction with that of all conflicting bipartitions in the same underlying tree set. Finally, the tree certainty (TC) and TC All (TCA) measures are the sum of IC and ICA values across all internodes of a phylogeny, respectively. IC, ICA, TC, and TCA can be calculated from different types of data that contain nontrivial bipartitions, ranging from bootstrap replicate trees to gene trees or individual characters. Given a set of phylogenetic trees, the IC and ICA values of a given internode reflect its specific degree of incongruence, and the TC and TCA values describe the global degree of incongruence between trees in the set. All four measures are implemented and freely available in version 8.0.0 and subsequent versions of the widely used program RAxML.
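
    As defined in the abstract, IC depends only on the frequency of a bipartition and that of its most prevalent conflicting bipartition; a direct sketch of that entropy-based formula (illustrative counts, not RAxML code):

```python
from math import log2

def internode_certainty(n_bip, n_conflict):
    """IC = 1 - H(p1, p2), where p1 and p2 are the relative frequencies of an
    internode's bipartition and of its most prevalent conflicting bipartition."""
    total = n_bip + n_conflict
    ic = 1.0
    for n in (n_bip, n_conflict):
        p = n / total
        if p > 0:
            ic += p * log2(p)
    return ic

print(internode_certainty(100, 0))            # no conflict in the tree set: 1.0
print(internode_certainty(50, 50))            # evenly split support: 0.0
print(round(internode_certainty(80, 20), 3))  # strong but contested support
```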

  19. A second-order accurate kinetic-theory-based method for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, Suresh M.

    1986-01-01

    An upwind method for the numerical solution of the Euler equations is presented. This method, called the kinetic numerical method (KNM), is based on the fact that the Euler equations are moments of the Boltzmann equation of the kinetic theory of gases when the distribution function is Maxwellian. The KNM consists of two phases, the convection phase and the collision phase. The method is unconditionally stable and explicit. It is highly vectorizable and can be easily made total variation diminishing for the distribution function by a suitable choice of the interpolation strategy. The method is applied to a one-dimensional shock-propagation problem and to a two-dimensional shock-reflection problem.

  20. Analytic Gradient for Density Functional Theory Based on the Fragment Molecular Orbital Method.

    PubMed

    Brorsen, Kurt R; Zahariev, Federico; Nakata, Hiroya; Fedorov, Dmitri G; Gordon, Mark S

    2014-12-01

    The equations for the response terms for the fragment molecular orbital (FMO) method interfaced with the density functional theory (DFT) gradient are derived and implemented. Compared to the previous FMO-DFT gradient, which lacks response terms, the FMO-DFT analytic gradient has improved accuracy for a variety of functionals, when compared to numerical gradients. The FMO-DFT gradient agrees with the fully ab initio DFT gradient in which no fragmentation is performed, while reducing the nonlinear scaling associated with standard DFT. Solving for the response terms requires the solution of the coupled perturbed Kohn-Sham (CPKS) equations, where the CPKS equations are solved through a decoupled Z-vector procedure called the self-consistent Z-vector method. FMO-DFT is a nonvariational method, and the FMO-DFT gradient is unique compared to standard DFT gradients in that it requires terms from both DFT and time-dependent DFT (TDDFT).

  1. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and the accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on the gas-kinetic theory is presented. The flux construction strategy may shed some light on the possible modification of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as the development of new schemes for a non-strictly hyperbolic system.

  2. Practical application of game theory based production flow planning method in virtual manufacturing networks

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2016-08-01

    Modern enterprises have to react quickly to dynamic changes in the market, due to changing customer requirements and expectations. One of the key areas of production management that must continuously evolve, by searching for new methods and tools to increase the efficiency of manufacturing systems, is production flow planning and control. These aspects are closely connected with the ability to implement the concepts of the Virtual Enterprise (VE) and the Virtual Manufacturing Network (VMN), in which an integrated infrastructure of flexible resources is created. In the proposed approach, the role of the players is performed by objects associated with the objective functions, which allows multiobjective production flow planning problems to be solved using game theory, grounded in the theory of strategic situations. For defined production system and production order models, ways of solving the production route planning problem in a VMN are presented through computational examples for different variants of production flow. A possible decision strategy, together with an analysis of the calculation results, is shown.
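
    One toy reading of game-theoretic route choice (invented payoffs; not the authors' model) is a maximin strategy over normalized objective scores, i.e. a game against nature where each route is judged by its worst objective:

```python
def maximin_route(payoffs):
    """Pick the route whose worst objective score is best (maximin)."""
    return max(payoffs, key=lambda r: min(payoffs[r]))

# Hypothetical normalized scores: (throughput, cost, due-date reliability).
payoffs = {
    "route_A": (0.9, 0.2, 0.8),
    "route_B": (0.6, 0.6, 0.7),
    "route_C": (0.8, 0.4, 0.3),
}
print(maximin_route(payoffs))   # → route_B (its worst score, 0.6, is highest)
```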

  3. Information theory-based scoring function for the structure-based prediction of protein-ligand binding affinity.

    PubMed

    Kulharia, Mahesh; Goody, Roger S; Jackson, Richard M

    2008-10-01

    The development and validation of a new knowledge-based scoring function (SIScoreJE) to predict binding energy between proteins and ligands is presented. SIScoreJE efficiently predicts the binding energy between a small molecule and its protein receptor. Protein-ligand atomic contact information was derived from a Non-Redundant Data set (NRD) of over 3000 X-ray crystal structures of protein-ligand complexes. This information was classified for individual "atom contact pairs" (ACPs), which are used to calculate the atomic contact preferences. In addition to the two schemes generated in this study, we have assessed a number of other common atom-type classification schemes. The preferences were calculated using an information theoretic relationship of joint entropy. Among 18 different atom-type classification schemes, "ScoreJE Atom Type set2" (SATs2) was found to be the most suitable for our approach. To test the sensitivity of the method to the inclusion of solvent, Single-body Solvation Potentials (SSP) were also derived from the atomic contacts between the protein atom types and water molecules modeled using AQUARIUS2. Validation was carried out using an evaluation data set of 100 protein-ligand complexes with known binding energies to test the ability of the scoring functions to reproduce known binding affinities. In summary, it was found that a combined SSP/ScoreJE (SIScoreJE) performed significantly better than ScoreJE alone, and SIScoreJE and ScoreJE performed better than GOLD::GoldScore, GOLD::ChemScore, and XScore.
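
    The paper's joint-entropy formulation is not reproduced here; a generic knowledge-based contact preference, scoring observed against expected atom-type pair frequencies, conveys the flavor (atom types and counts are invented):

```python
from collections import Counter
from math import log

def contact_preferences(contacts):
    """Preference = log(p_obs / p_exp) for each atom-type contact pair, with
    p_exp built from the marginal atom-type frequencies. Positive values mark
    contacts seen more often than chance (a generic knowledge-based potential)."""
    pair_counts = Counter(tuple(sorted(c)) for c in contacts)
    atom_counts = Counter(a for c in contacts for a in c)
    n_pairs = sum(pair_counts.values())
    n_atoms = sum(atom_counts.values())
    prefs = {}
    for (a, b), c in pair_counts.items():
        p_obs = c / n_pairs
        p_exp = (atom_counts[a] / n_atoms) * (atom_counts[b] / n_atoms)
        if a != b:
            p_exp *= 2   # unordered heterogeneous pairs occur two ways
        prefs[(a, b)] = log(p_obs / p_exp)
    return prefs

contacts = [("O", "N")] * 6 + [("C", "C")] * 3 + [("C", "O")] * 1
prefs = contact_preferences(contacts)
print(prefs[("N", "O")] > 0 > prefs[("C", "O")])   # O-N contacts are enriched
```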

  4. Battling the challenges of training nurses to use information systems through theory-based training material design.

    PubMed

    Galani, Malatsi; Yu, Ping; Paas, Fred; Chandler, Paul

    2014-01-01

    The attempts to train nurses to effectively use information systems have had mixed results. One problem is that training materials are not adequately designed to guide trainees to gradually learn to use a system without experiencing a heavy cognitive load. This is because training design often does not take into consideration a learner's cognitive ability to absorb new information in a short training period. Given the high cost and difficulty of organising training in healthcare organisations, there is an urgent need for information system trainers to be aware of how cognitive overload or information overload affect a trainee's capability to acquire new knowledge and skills, and what instructional techniques can be used to facilitate effective learning. This paper introduces the concept of cognitive load and how it affects nurses when learning to use a new health information system. This is followed by the relevant strategies for instructional design, underpinned by the principles of cognitive load theory, which may be helpful for the development of effective instructional materials and activities for training nurses to use information systems. PMID:25087524

  6. Nonlinear gyrokinetic theory based on a new method and computation of the guiding-center orbit in tokamaks

    SciTech Connect

    Xu, Yingfeng Dai, Zongliang; Wang, Shaojie

    2014-04-15

    The nonlinear gyrokinetic theory in the tokamak configuration based on the two-step transform is developed; in the first step, we transform the magnetic potential perturbation to the Hamiltonian part, and in the second step, we transform away the gyroangle-dependent part of the perturbed Hamiltonian. Then the I-transform method is used to decouple the perturbation part of the motion from the unperturbed motion. The application of the I-transform method to the computation of the guiding-center orbit and the guiding-center distribution function in tokamaks is presented. It is demonstrated that the I-transform method of orbit computation, which involves integrating only along the unperturbed orbit, agrees with the conventional method, which integrates along the full orbit. A numerical code based on the I-transform method is developed and two numerical examples are given to verify the new method.

  7. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety.

    PubMed

    Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith

    2013-01-01

    Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the promoting action on research implementation of health services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The usefulness of the evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts. PMID:23498710

  8. A third-generation density-functional-theory-based method for calculating canonical molecular orbitals of large molecules.

    PubMed

    Hirano, Toshiyuki; Sato, Fumitoshi

    2014-07-28

    We used grid-free modified Cholesky decomposition (CD) to develop a density-functional-theory (DFT)-based method for calculating the canonical molecular orbitals (CMOs) of large molecules. Our method can be used to calculate standard CMOs, analytically compute exchange-correlation terms, and maximise the capacity of next-generation supercomputers. Cholesky vectors were first analytically downscaled using low-rank pivoted CD and CD with adaptive metric (CDAM). The obtained Cholesky vectors were distributed and stored on each computer node in a parallel computer, and the Coulomb, Fock exchange, and pure exchange-correlation terms were calculated by multiplying the Cholesky vectors without evaluating molecular integrals in self-consistent field iterations. Our method enables DFT and massively distributed memory parallel computers to be used in order to very efficiently calculate the CMOs of large molecules. PMID:24622472
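The abstract above relies on low-rank pivoted Cholesky decomposition to downscale the Cholesky vectors. As a rough illustration of that building block only (not the authors' implementation, and without the grid-free or CDAM machinery), a pivoted Cholesky factorization of a symmetric positive semidefinite matrix can be sketched with NumPy:

```python
import numpy as np

def pivoted_cholesky(A, tol=1e-8):
    """Low-rank pivoted Cholesky of a symmetric PSD matrix A.

    Returns L (n x k) with A ~= L @ L.T, stopping when the largest
    remaining diagonal element of the residual falls below tol.
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    d = np.diag(A).copy()        # running diagonal of the residual
    piv = np.arange(n)           # pivot ordering
    L = np.zeros((n, n))
    k = 0
    while k < n and d[piv[k:]].max() > tol:
        # choose the pivot with the largest remaining diagonal element
        j = k + np.argmax(d[piv[k:]])
        piv[k], piv[j] = piv[j], piv[k]
        p = piv[k]
        L[p, k] = np.sqrt(d[p])
        rest = piv[k + 1:]
        L[rest, k] = (A[rest, p] - L[rest, :k] @ L[p, :k]) / L[p, k]
        d[rest] -= L[rest, k] ** 2
        k += 1
    return L[:, :k]
```

For a (numerically) rank-deficient matrix the factor has far fewer columns than the matrix dimension, which is the source of the storage and work savings that low-rank CD approaches exploit.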

  9. A third-generation density-functional-theory-based method for calculating canonical molecular orbitals of large molecules.

    PubMed

    Hirano, Toshiyuki; Sato, Fumitoshi

    2014-07-28

    We used grid-free modified Cholesky decomposition (CD) to develop a density-functional-theory (DFT)-based method for calculating the canonical molecular orbitals (CMOs) of large molecules. Our method can be used to calculate standard CMOs, analytically compute exchange-correlation terms, and maximise the capacity of next-generation supercomputers. Cholesky vectors were first analytically downscaled using low-rank pivoted CD and CD with adaptive metric (CDAM). The obtained Cholesky vectors were distributed and stored on each computer node in a parallel computer, and the Coulomb, Fock exchange, and pure exchange-correlation terms were calculated by multiplying the Cholesky vectors without evaluating molecular integrals in self-consistent field iterations. Our method enables DFT and massively distributed memory parallel computers to be used in order to very efficiently calculate the CMOs of large molecules.

  10. Fuzzy theory based control method for an in-pipe robot to move in variable resistance environment

    NASA Astrophysics Data System (ADS)

    Li, Te; Ma, Shugen; Li, Bin; Wang, Minghui; Wang, Yuechao

    2015-11-01

    Most existing screw-drive in-pipe robots cannot actively adjust their maximum traction capacity, which limits adaptability to a wide range of variable environment resistance, especially in curved pipes. To solve this problem, a screw-drive in-pipe robot based on an adaptive linkage mechanism is proposed. The differential property of the adaptive linkage mechanism allows the robot to move without motion interference in straight and varied curved pipes by self-adaptively adjusting the inclining angles of its rollers. The maximum traction capacity of the robot can be changed by actively adjusting the inclining angles of the rollers. To improve adaptability to variable resistance, a torque control method based on a fuzzy controller is proposed. Under variable environment resistance, the proposed control method not only ensures sufficient traction force but also limits the output torque to a feasible region. In simulations, the robot with the proposed control method is compared to the robot with fixed roller inclining angles. The results show that combining the torque control method with the proposed robot achieves better adaptability to variable resistance in straight and curved pipes.
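The paper's fuzzy controller itself is not reproduced here, but the general idea of a Mamdani-style rule base mapping a resistance error to a torque adjustment can be sketched as follows; the membership functions, rule set, and normalized scaling are illustrative assumptions, not the authors' design:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_torque_adjust(resistance_error):
    """Map a normalized resistance error (-1..1) to a torque adjustment
    (-1..1) using three Mamdani-style rules and centroid defuzzification."""
    out = np.linspace(-1, 1, 201)  # candidate torque adjustments
    rules = [
        # (rule firing strength, consequent membership over `out`)
        (tri(resistance_error, -1.5, -1, 0), tri(out, -1.5, -1, 0)),  # low  -> decrease torque
        (tri(resistance_error, -1, 0, 1),    tri(out, -1, 0, 1)),     # ok   -> hold
        (tri(resistance_error, 0, 1, 1.5),   tri(out, 0, 1, 1.5)),    # high -> increase torque
    ]
    agg = np.zeros_like(out)
    for strength, consequent in rules:
        agg = np.maximum(agg, np.minimum(strength, consequent))  # clip and aggregate
    return float((out * agg).sum() / agg.sum()) if agg.sum() else 0.0
```

A zero error yields no adjustment, while a large positive (resp. negative) error pushes the torque up (resp. down), bounded by the output universe, which is the qualitative behavior the abstract describes.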

  11. Evolution of graph theory-based QSAR methods and their applications to the search for new antibacterial agents.

    PubMed

    Speck-Planche, Alejandro; Cordeiro, M N D S

    2013-01-01

    Resistance of bacteria to current antibiotics has increased worldwide and is one of the leading unresolved situations in public health. Due to negligence regarding the treatment of community-acquired diseases, even healthcare facilities have been highly impacted by an emerging problem: nosocomial infections. Moreover, infectious diseases, including nosocomial infections, have been found to depend on multiple pathogenicity factors, confirming the need to discover multi-target antibacterial agents. Drug discovery is a very complex, expensive, and time-consuming process. In this sense, Quantitative Structure-Activity Relationship (QSAR) methods have become complementary tools for medicinal chemistry, permitting the efficient screening of potential drugs and, consequently, rationalizing the organic synthesis as well as the biological evaluation of compounds. In the consolidation of QSAR methods as important components of chemoinformatics, the use of mathematical chemistry, and more specifically of graph-theoretical approaches, has played a vital role. Here, we focus our attention on the evolution of QSAR methods, citing the most relevant works devoted to the development of promising graph-theoretical approaches in the last 8 years, and their applications to the prediction of antibacterial activities of chemicals against pathogens causing both community-acquired and nosocomial infections.
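As a toy illustration of the kind of graph-theoretical descriptor these QSAR methods build on (the review covers far more elaborate approaches), the classical Randić connectivity index can be computed directly from an adjacency matrix of a hydrogen-suppressed molecular graph; the n-butane graph below is just an example:

```python
import numpy as np

def randic_index(adj):
    """Randić connectivity index of a molecular graph.

    adj: symmetric 0/1 adjacency matrix of the hydrogen-suppressed graph.
    The index sums 1/sqrt(deg(u) * deg(v)) over all edges (u, v).
    """
    adj = np.asarray(adj)
    deg = adj.sum(axis=1)
    total = 0.0
    n = adj.shape[0]
    for u in range(n):
        for v in range(u + 1, n):
            if adj[u, v]:
                total += 1.0 / np.sqrt(deg[u] * deg[v])
    return total

# n-butane as a 4-carbon chain: edges 0-1, 1-2, 2-3
butane = np.zeros((4, 4), dtype=int)
for u, v in [(0, 1), (1, 2), (2, 3)]:
    butane[u, v] = butane[v, u] = 1
```

Descriptors of this type, computed over large compound libraries, are the inputs that graph-theoretical QSAR models correlate with measured antibacterial activity.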

  12. Evolution of graph theory-based QSAR methods and their applications to the search for new antibacterial agents.

    PubMed

    Speck-Planche, Alejandro; Cordeiro, M N D S

    2013-01-01

    Resistance of bacteria to current antibiotics has increased worldwide and is one of the leading unresolved situations in public health. Due to negligence regarding the treatment of community-acquired diseases, even healthcare facilities have been highly impacted by an emerging problem: nosocomial infections. Moreover, infectious diseases, including nosocomial infections, have been found to depend on multiple pathogenicity factors, confirming the need to discover multi-target antibacterial agents. Drug discovery is a very complex, expensive, and time-consuming process. In this sense, Quantitative Structure-Activity Relationship (QSAR) methods have become complementary tools for medicinal chemistry, permitting the efficient screening of potential drugs and, consequently, rationalizing the organic synthesis as well as the biological evaluation of compounds. In the consolidation of QSAR methods as important components of chemoinformatics, the use of mathematical chemistry, and more specifically of graph-theoretical approaches, has played a vital role. Here, we focus our attention on the evolution of QSAR methods, citing the most relevant works devoted to the development of promising graph-theoretical approaches in the last 8 years, and their applications to the prediction of antibacterial activities of chemicals against pathogens causing both community-acquired and nosocomial infections. PMID:24200354

  13. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Gordon, Mark S.; Kitaura, Kazuo; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  14. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Zahariev, Federico; Schmidt, Michael W; Kitaura, Kazuo; Gordon, Mark S; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  15. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Kitaura, Kazuo; Gordon, Mark S.; Nakamura, Shinichiro

    2015-03-01

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  16. A comparison of item response theory-based methods for examining differential item functioning in object naming test by language of assessment among older Latinos

    PubMed Central

    Yang, Frances M.; Heslin, Kevin C.; Mehta, Kala M.; Yang, Cheng-Wu; Ocepek-Welikson, Katja; Kleinman, Marjorie; Morales, Leo S.; Hays, Ron D.; Stewart, Anita L.; Mungas, Dan; Jones, Richard N.; Teresi, Jeanne A.

    2012-01-01

    Object naming tests are commonly included in neuropsychological test batteries. Differential item functioning (DIF) in these tests due to cultural and language differences may compromise the validity of cognitive measures in diverse populations. We evaluated 26 object naming items for DIF due to Spanish and English language translations among Latinos (n=1,159), mean age of 70.5 years (standard deviation (SD) ± 7.2), using the following four item response theory-based approaches: Mplus/Multiple Indicator, Multiple Causes (Mplus/MIMIC; Muthén & Muthén, 1998–2011), Item Response Theory Likelihood Ratio Differential Item Functioning (IRTLRDIF/MULTILOG; Thissen, 1991, 2001), difwithpar/Parscale (Crane, Gibbons, Jolley, & van Belle, 2006; Muraki & Bock, 2003), and Differential Functioning of Items and Tests/MULTILOG (DFIT/MULTILOG; Flowers, Oshima, & Raju, 1999; Thissen, 1991). Overall, there was moderate to near perfect agreement across methods. Fourteen items were found to exhibit DIF, and 5 items were flagged consistently across all methods; these items were more likely to be answered correctly by individuals tested in Spanish after controlling for overall ability. PMID:23471423
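For readers unfamiliar with how item response theory exposes DIF, the idea can be sketched with the two-parameter logistic (2PL) model: if the same item receives different difficulty estimates in the two language groups, then respondents of equal ability have different probabilities of answering correctly. The parameter values below are purely illustrative assumptions, not estimates from this study:

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL item response model: probability of a correct answer at
    ability theta, with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical parameters for one naming item, estimated separately per
# language of assessment (illustrative numbers only).
theta = 0.0                                 # a respondent of average ability
p_english = p_2pl(theta, a=1.2, b=0.5)      # harder when tested in English
p_spanish = p_2pl(theta, a=1.2, b=-0.3)     # easier when tested in Spanish
dif_gap = p_spanish - p_english             # uniform DIF favoring Spanish
```

A nonzero gap at equal ability is exactly the kind of item-level effect the four methods in the abstract are designed to detect and test for significance.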

  17. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  18. NbIT--a new information theory-based analysis of allosteric mechanisms reveals residues that underlie function in the leucine transporter LeuT.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2014-05-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i) channels for long-distance information sharing between functional sites, and (ii) coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005
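NbIT itself builds on configurational-entropy measures computed over MD trajectories; as a much-simplified stand-in, the information shared between two fluctuating coordinates can be estimated with a histogram-based mutual information, shown here on synthetic data (the coupling model is an assumption for illustration, not taken from the paper):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information I(X;Y), in bits,
    between two 1-D samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(1)
x = rng.standard_normal(20000)                     # "input site" coordinate
coupled = x + 0.3 * rng.standard_normal(20000)     # shares information with x
independent = rng.standard_normal(20000)           # shares essentially none
```

In the NbIT setting, analogous (multivariate) quantities computed between functional sites are what distinguish allosteric channel residues from residues that merely stabilize a site.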

  19. NbIT--a new information theory-based analysis of allosteric mechanisms reveals residues that underlie function in the leucine transporter LeuT.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2014-05-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i) channels for long-distance information sharing between functional sites, and (ii) coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems.

  20. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.
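The generalized method of averaging replaces fast periodic forcing by its mean to obtain slowly varying equations of motion. A minimal first-order example, unrelated to the satellite problem and purely to show the principle: for dx/dt = ε sin²(t) x, averaging sin² over one period gives dx/dt = εx/2, and the two solutions stay within O(ε) of each other:

```python
import numpy as np

def integrate(f, x0, t_end, dt=1e-3):
    """Fixed-step RK4 integration of dx/dt = f(t, x) from t = 0 to t_end."""
    n = int(round(t_end / dt))
    x = x0
    for i in range(n):
        t = i * dt
        k1 = f(t, x)
        k2 = f(t + dt / 2, x + dt * k1 / 2)
        k3 = f(t + dt / 2, x + dt * k2 / 2)
        k4 = f(t + dt, x + dt * k3)
        x += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return x

eps = 0.01
# full equation with fast periodic forcing, and its first-order average
full = integrate(lambda t, x: eps * np.sin(t) ** 2 * x, 1.0, 50.0)
averaged = integrate(lambda t, x: eps * x / 2, 1.0, 50.0)  # <sin^2> = 1/2
```

At t = 50 the full and averaged solutions differ by roughly ε, consistent with first-order averaging; the semianalytic satellite theory applies the same idea, recursively, to the far richer perturbations of orbital motion.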

  1. Methods of Organizational Information Security

    NASA Astrophysics Data System (ADS)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the organizational level. Some of the principal problems are identified, and a first group of relevant dimensions is presented for efficient management of information security. The study is based on a literature review of some of the most relevant articles on this theme, on international reports, and on the principal standards for information security management. From these readings, we identified methods oriented toward risk management, certification standards, and good practices for information security. Some of the standards are oriented toward certification of the product or system, and others toward business processes. There are also studies proposing frameworks that integrate different approaches, grounded in standards focused on technologies and processes and taking into consideration the organizational and human environment of organizations. In our perspective, the biggest contribution to information security would be the development of an information security method for an organization in a conflicting environment. Such a method should provide security against the possible dimensions of attack that threats could exploit through the vulnerabilities of organizational assets. It should also support the concepts of "network-centric warfare," "information superiority," and "information warfare" developed especially in the last decade, in which information is seen simultaneously as a weapon and as a target.

  2. Derivation of a measure of systolic blood pressure mutability: a novel information theory-based metric from ambulatory blood pressure tests.

    PubMed

    Contreras, Danitza J; Vogel, Eugenio E; Saravia, Gonzalo; Stockins, Benjamin

    2016-03-01

    We provide ambulatory blood pressure (BP) exams with tools based on information theory to quantify fluctuations, thereby increasing the capture of dynamic test components. Data from 515 ambulatory 24-hour BP exams were considered. Average age was 54 years, 54% were women, and 53% were under BP treatment. The average systolic pressure (SP) was 127 ± 8 mm Hg. A data compressor (wlzip) designed to recognize meaningful information is invoked to measure mutability, which is a form of dynamical variability. For patients with the same average SP, different mutability values are obtained, reflecting differences in dynamical variability. In unadjusted linear regression models, mutability had a low association with mean systolic BP (R(2) = 0.056; P < .000001) but a larger association with the SP deviation (R(2) = 0.761; P < .001). Wlzip allows the detection of levels of variability in SP that could be hazardous. This new indicator can be easily added to 24-hour BP monitors, improving the information available for diagnosis.
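wlzip is a purpose-built compressor, but the underlying idea — that a record with more dynamical variability compresses less well — can be imitated with a general-purpose compressor such as zlib. The sketch below is a stand-in under that assumption, not the study's actual metric:

```python
import zlib
import numpy as np

def mutability(series, precision=0):
    """Compression-based proxy for the 'mutability' of a time series:
    compressed size of the digitized record divided by its raw size.
    zlib stands in here for the wlzip compressor used in the study."""
    digitized = np.round(np.asarray(series, dtype=float), precision)
    raw = ",".join(f"{v:g}" for v in digitized).encode()
    return len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(2)
steady = 127 + np.zeros(288)              # flat 24-h systolic record (5-min samples)
variable = 127 + rng.normal(0, 8, 288)    # same mean SP, fluctuating
```

Two records with the same mean systolic pressure can thus receive very different scores, which is the dynamic component the abstract argues conventional averages miss.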

  3. Information storage media and method

    DOEpatents

    Miller, Steven D.; Endres, George W.

    1999-01-01

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  4. Levels of Reconstruction as Complementarity in Mixed Methods Research: A Social Theory-Based Conceptual Framework for Integrating Qualitative and Quantitative Research

    PubMed Central

    Carroll, Linda J.; Rothe, J. Peter

    2010-01-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed-methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed-methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson’s metaphysical work on the ‘ways of knowing’. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions. PMID:20948937

  5. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed-methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed-methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  6. High-resolution wave-theory-based ultrasound reflection imaging using the split-step Fourier and globally optimized Fourier finite-difference methods

    DOEpatents

    Huang, Lianjie

    2013-10-29

    Methods for enhancing ultrasonic reflection imaging are taught utilizing a split-step Fourier propagator in which the reconstruction is based on recursive inward continuation of ultrasonic wavefields in the frequency-space and frequency-wave number domains. The inward continuation within each extrapolation interval consists of two steps. In the first step, a phase-shift term is applied to the data in the frequency-wave number domain for propagation in a reference medium. The second step consists of applying another phase-shift term to data in the frequency-space domain to approximately compensate for ultrasonic scattering effects of heterogeneities within the tissue being imaged (e.g., breast tissue). Results from various data input to the method indicate significant improvements are provided in both image quality and resolution.
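The patent's two-step extrapolation can be illustrated schematically: a wavenumber-domain phase shift through a reference medium, followed by a space-domain phase screen for the slowness perturbation. The sketch below (a 1-D transverse line, with evanescent components simply zeroed) is a generic split-step Fourier step under those stated assumptions, not the patented reconstruction code:

```python
import numpy as np

def split_step(u, dz, freq, c_ref, c_x, dx):
    """One split-step Fourier extrapolation of a monochromatic wavefield u.

    u: complex wavefield on a transverse line; dz: extrapolation step;
    freq: temporal frequency (Hz); c_ref: reference sound speed;
    c_x: sound speed at each transverse sample; dx: sample spacing.
    """
    omega = 2 * np.pi * freq
    kx = 2 * np.pi * np.fft.fftfreq(u.size, d=dx)
    # Step 1: phase shift in the frequency-wavenumber domain for the
    # reference medium; evanescent components (|kx| > omega/c_ref) are zeroed.
    kz2 = (omega / c_ref) ** 2 - kx ** 2
    kz = np.sqrt(np.maximum(kz2, 0.0))
    u_ref = np.fft.ifft(np.fft.fft(u) * np.exp(1j * kz * dz) * (kz2 > 0))
    # Step 2: phase screen in the frequency-space domain compensating for
    # the slowness perturbation of the heterogeneous medium (e.g., tissue).
    screen = np.exp(1j * omega * (1.0 / np.asarray(c_x) - 1.0 / c_ref) * dz)
    return u_ref * screen
```

In a homogeneous medium (c_x equal to c_ref everywhere) the screen is the identity and a plane wave just accumulates the reference phase, which makes the split between the two steps easy to verify.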

  7. Unrestricted density functional theory based on the fragment molecular orbital method for the ground and excited state calculations of large systems

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Yokojima, Satoshi; Kitaura, Kazuo; Sakurai, Minoru; Nakamura, Shinichiro

    2014-04-14

    We extended the fragment molecular orbital (FMO) method interfaced with density functional theory (DFT) into spin unrestricted formalism (UDFT) and developed energy gradients for the ground state and single point excited state energies based on time-dependent DFT. The accuracy of FMO is evaluated in comparison to the full calculations without fragmentation. Electronic excitations in solvated organic radicals and in the blue copper protein, plastocyanin (PDB code: 1BXV), are reported. The contributions of solvent molecules to the electronic excitations are analyzed in terms of the fragment polarization and quantum effects such as interfragment charge transfer.

  8. Unrestricted density functional theory based on the fragment molecular orbital method for the ground and excited state calculations of large systems.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Yokojima, Satoshi; Kitaura, Kazuo; Sakurai, Minoru; Nakamura, Shinichiro

    2014-04-14

    We extended the fragment molecular orbital (FMO) method interfaced with density functional theory (DFT) into spin unrestricted formalism (UDFT) and developed energy gradients for the ground state and single point excited state energies based on time-dependent DFT. The accuracy of FMO is evaluated in comparison to the full calculations without fragmentation. Electronic excitations in solvated organic radicals and in the blue copper protein, plastocyanin (PDB code: 1BXV), are reported. The contributions of solvent molecules to the electronic excitations are analyzed in terms of the fragment polarization and quantum effects such as interfragment charge transfer.

  9. Calculations of Lamb wave band gaps and dispersions for piezoelectric phononic plates using mindlin's theory-based plane wave expansion method.

    PubMed

    Hsu, Jin-Chen; Wu, Tsung-Tsong

    2008-02-01

    Based on Mindlin's piezoelectric plate theory and the plane wave expansion method, a formulation is proposed to study the frequency band gaps and dispersion relations of the lower-order Lamb waves in two-dimensional piezoelectric phononic plates. The method is applied to analyze phononic plates composed of solid-solid and air-solid constituents with square and triangular lattices, respectively. Factors that influence the opening and width of the complete Lamb wave gaps are identified and discussed. For solid/solid phononic plates, it is suggested that the filling material be chosen with larger mass density, proper stiffness, and a weak anisotropy factor, embedded in a soft matrix, in order to obtain wider complete band gaps of the lower-order Lamb waves. By comparison with results calculated without piezoelectricity, the influence of the piezoelectric effect on Lamb waves is analyzed as well. On the other hand, for air/solid phononic plates, a background material with proper anisotropy and a high filling fraction of air may favor the opening of the complete Lamb wave gaps.

  10. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.
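The flux-based relaxation dynamics described above can be illustrated with a minimal sketch. The 1-D ring lattice, parameter values, and the symmetric detailed-balance rate convention below are assumptions chosen for illustration, not the formulation used in the paper:

```python
import math

def dmft_relax(rho, eps=1.0, beta=1.0, dt=0.01, steps=5000):
    # Evolve mean site densities rho_i on a periodic 1-D lattice via the
    # net flux  J_ij = w_ij*rho_i*(1 - rho_j) - w_ji*rho_j*(1 - rho_i),
    # with rates obeying detailed balance w_ij/w_ji = exp(-beta*(E_j - E_i)),
    # where E_i = -eps * (sum of neighbor densities) is a mean-field site
    # energy (an illustrative convention, not the paper's exact one).
    n = len(rho)
    for _ in range(steps):
        E = [-eps * (rho[(i - 1) % n] + rho[(i + 1) % n]) for i in range(n)]
        new = rho[:]
        for i in range(n):
            for j in ((i - 1) % n, (i + 1) % n):
                w_ij = math.exp(-beta * (E[j] - E[i]) / 2.0)
                w_ji = math.exp(-beta * (E[i] - E[j]) / 2.0)
                flux = w_ij * rho[i] * (1 - rho[j]) - w_ji * rho[j] * (1 - rho[i])
                # Mass leaving i reappears at j when the loop visits (j, i),
                # so total density is conserved.
                new[i] -= dt * flux
        rho = [min(1.0, max(0.0, r)) for r in new]
    return rho

# Relax an initially non-uniform density profile; at this supercritical
# temperature the profile should flatten toward the mean density.
rho = dmft_relax([0.9, 0.1, 0.5, 0.5, 0.2, 0.8])
print(max(rho) - min(rho))
```

The conserved-density (Kawasaki-like) flux form is what distinguishes this kind of kinetics from simple single-site relaxation; the PPM discussed in the abstract refines the thermodynamics underlying the rates rather than this basic flux structure.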

  11. Information technology equipment cooling method

    SciTech Connect

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  12. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  13. Theory-Based Evaluation: Reflections Ten Years On. Theory-Based Evaluation: Past, Present, and Future

    ERIC Educational Resources Information Center

    Rogers, Patricia J.; Weiss, Carol H.

    2007-01-01

    This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly…

  14. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of the diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied the information-motivation-behavioral skills (IMB) model, and interventions were targeted to 3 determinants to change health behaviors. Specific methods were selected to achieve each objective guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed to achieve the behavior objectives in each session. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior outcome-focused health education programs.

  16. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Elmore, Mark Thomas [Oak Ridge, TN; Reed, Joel Wesley [Knoxville, TN; Treadwell, Jim N; Samatova, Nagiza Faridovna [Oak Ridge, TN

    2008-01-01

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.
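The pipeline in the abstract (collect, convert, search, display as a similarity tree) can be sketched at toy scale. The token-based cosine similarity and the link-each-document-to-its-most-similar-predecessor rule below are illustrative assumptions, not the patented implementation:

```python
import math
from collections import Counter

def tf_vector(text):
    # Term-frequency vector over lowercase word tokens.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def similarity_tree(docs):
    # Link every document (a node) to its most similar earlier document,
    # yielding tree links that indicate similarity of documents to each other.
    vecs = [tf_vector(d) for d in docs]
    links = {}  # child index -> parent index
    for i in range(1, len(docs)):
        links[i] = max(range(i), key=lambda j: cosine(vecs[i], vecs[j]))
    return links

docs = [
    "river model hydraulic cross section",
    "hydraulic model of river flooding",
    "xml document storage and search",
    "searching xml documents with a query term",
]
print(similarity_tree(docs))  # {1: 0, 2: 0, 3: 2}
```

The two topic clusters (river modelling, XML search) end up linked within themselves, which is the behavior the tree display in the patent abstract is meant to convey.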

  17. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  18. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  19. Research Investigation of Information Access Methods

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  20. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  1. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Ingber, Donald E. (Inventor); Huang, Sui (Inventor); Eichler, Gabriel (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.
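The tiling and mapping steps can be sketched with numeric feature vectors standing in for the reference elements. The nearest-neighbor assignment and hit-count tile values below are illustrative assumptions, not the patented apparatus:

```python
import math

def nearest_tile(element, tiles):
    # Return the index of the reference tile whose feature vector is
    # closest (Euclidean distance) to the observed element.
    return min(range(len(tiles)),
               key=lambda i: math.dist(element, tiles[i]))

def map_elements(elements, tiles):
    # Map observed elements onto the spatial layout of reference tiles,
    # assigning each tile a value (here, its hit count) for display.
    values = [0] * len(tiles)
    for e in elements:
        values[nearest_tile(e, tiles)] += 1
    return values

# A 2x2 spatial layout of reference tiles, each defined by a 2-D
# feature vector (the "atomic attributes" of the representative element).
tiles = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
observed = [(0.1, 0.1), (0.2, 0.0), (0.9, 1.1)]
print(map_elements(observed, tiles))  # [2, 0, 0, 1]
```

An image of the layout would then color each tile by its value; any rendering step is omitted here to keep the sketch self-contained.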

  2. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  3. Information Work Analysis: An Approach to Research on Information Interactions and Information Behaviour in Context

    ERIC Educational Resources Information Center

    Huvila, Isto

    2008-01-01

    Introduction: A work roles and role theory-based approach to conceptualising human information activity, denoted information work analysis, is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…

  4. Methods of Cost Reduction in Information Retrieval.

    ERIC Educational Resources Information Center

    Wilmoth, James Noel

    Cost effectiveness of the QUERY program for searching the Educational Resources Information Center (ERIC) data base has been an important issue at Auburn University. At least two broad categories of costs are associated with information retrieval from a data base such as ERIC: fixed costs or overhead and data base associated costs. The concern at…

  5. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

    2014-09-01

    Due to the sparse multipath property of the channel, a channel estimation method based on a partial superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves the spectral efficiency but also reduces the influence on information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory-based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
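Sparse reconstruction of a multipath channel can be sketched with orthogonal matching pursuit, one common compressed sensing recovery algorithm; the paper does not specify which algorithm it uses, and the random Gaussian observation matrix here is a stand-in for one built from the training sequence:

```python
import numpy as np

def omp(A, y, k):
    # Orthogonal Matching Pursuit: greedily recover a k-sparse vector x
    # from measurements y = A @ x.
    residual = y.astype(float)
    support = []
    for _ in range(k):
        # Select the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit by least squares on the enlarged support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((48, 64)) / np.sqrt(48)  # stand-in observation matrix
h_true = np.zeros(64)
h_true[[3, 17, 42]] = [1.0, -0.5, 0.8]           # sparse multipath channel taps
h_est = omp(A, A @ h_true, k=3)
print(float(np.max(np.abs(h_est - h_true))))     # small in the noiseless case
```

With only 48 measurements of a 64-tap channel, least squares is underdetermined, whereas the sparsity prior makes exact noiseless recovery possible; this is the accuracy advantage the abstract claims over LS and LMMSE estimators.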

  6. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  7. Governance Methods Used in Externalizing Information Technology

    ERIC Educational Resources Information Center

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  8. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website

    PubMed Central

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O’Riordan, Tim; White, Peter; Yardley, Lucy

    2016-01-01

    Background According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. Objective We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Methods Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative ‘think aloud’ study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. Results The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients’ stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants’ experiences of using the website. Conclusions We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials

  9. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Methods of providing information. 1640.6 Section 1640.6 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD PERIODIC PARTICIPANT STATEMENTS § 1640.6 Methods of providing information. The TSP will furnish the information described in...

  10. Cable Television: A Method for Delivering Information.

    ERIC Educational Resources Information Center

    Nebraska Univ., Lincoln. Cooperative Extension Service.

    This report presents the recommendations of a committee that was formed to explore the possibility of using cable television networks as a method of delivering extension education programs to urban audiences. After developing and testing a pilot project that used cable television as a mode to disseminate horticulture and 4-H leader training…

  11. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  12. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multi dimensional scale to measure twelve step program affiliation. The process included interviewing fourteen addicted persons while in twelve step focused treatment about specific pros (things they like or would miss out on by not being…

  13. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  14. Application of geo-information science methods in ecotourism exploitation

    NASA Astrophysics Data System (ADS)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    Application of geo-information science methods in ecotourism development was discussed in the article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resources reconnaissance, data management, environment monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data. The application of 3S methods is helpful to sustainable development in tourism. Various assignments are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, and dealing with mass data, as well as tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  15. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    ERIC Educational Resources Information Center

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  16. Discourse and Practice in Information Literacy and Information Seeking: Gaps and Opportunities

    ERIC Educational Resources Information Center

    Julien, H.; Williamson, K.

    2010-01-01

    Introduction: This paper argues for increased research consideration of the conceptual overlap between information seeking and information literacy, and for scholarly attention to theory-based empirical research that has potential value to practitioners. Method: The paper reviews information seeking and information literacy research, and…

  17. Axiomatic Evaluation Method and Content Structure for Information Appliances

    ERIC Educational Resources Information Center

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  18. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  19. Economic valuation of informal care: the contingent valuation method applied to informal caregiving.

    PubMed

    van den Berg, Bernard; Brouwer, Werner; van Exel, Job; Koopmanschap, Marc

    2005-02-01

    This paper reports the results of the application of the contingent valuation method (CVM) to determine a monetary value of informal care. We discuss the current practice in valuing informal care and a theoretical model of the costs and benefits related to the provision of informal care. In addition, we developed a survey in which informal caregivers' willingness to accept (WTA) to provide an additional hour of informal care was elicited. This method is better able than the normally recommended valuation methods to capture the heterogeneity and dynamics of informal care. Data were obtained from postal surveys. A total of 153 informal caregivers and 149 care recipients with rheumatoid arthritis returned a completed survey. Informal caregivers reported a mean WTA to provide a hypothetical additional hour of informal care of 9.52 Euro (n=124). Many hypotheses derived from the theoretical model and the literature were supported by the data. CVM is a promising alternative to existing methods, like the opportunity cost method and the proxy good method, for determining a monetary value of informal care that can be incorporated in the numerator of any economic evaluation.

  20. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    The design of modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. After analyzing the role of design knowledge and the features of information management for mechatronic products, a unified XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
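A toy version of an XML representation of function elements, structure elements, and their mapping might look as follows; all tag and attribute names are hypothetical, since the article's actual schema is not reproduced here:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: a product holds function elements, structure
# elements, and the mapping relationships between them.
product = ET.Element("product", name="parallel_friction_roller")

functions = ET.SubElement(product, "functions")
ET.SubElement(functions, "function", id="F1", desc="transmit torque")

structures = ET.SubElement(product, "structures")
ET.SubElement(structures, "structure", id="S1", desc="roller pair")

# Mapping relationship between function and structure elements.
mappings = ET.SubElement(product, "mappings")
ET.SubElement(mappings, "map", function="F1", structure="S1")

xml_text = ET.tostring(product, encoding="unicode")
print(xml_text)
```

Keeping function, structure, and mapping in one XML document is what lets a knowledge-based design system query and transform all three uniformly, which is the practical point of the unified model.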

  2. Self-informant Agreement for Personality and Evaluative Person Descriptors: Comparing Methods for Creating Informant Measures

    PubMed Central

    Simms, Leonard J.; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F.

    2011-01-01

    Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets’ self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits. PMID:21541262

  3. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection

    PubMed Central

    Aas, I. H. Monrad

    2014-01-01

    Introduction: Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information relevant for rating GAF, and the gaps in knowledge where further development would be likely to improve scoring. Methods: A literature search was conducted combining a thorough hand search with searches in the bibliographic databases PubMed, PsycINFO, Google Scholar, and the Campbell Collaboration Library of Systematic Reviews. Results: Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral, and police records about violence and substance abuse. Methods for information collection include the many different types of interview – unstructured, semi-structured, structured, interviews for Axis I and II disorders, semi-structured interviews for rating GAF, and interviews of informants – as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the collected information. The variation in collected information, and the lack of a generally accepted algorithm for combining it, is likely to affect rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Conclusions: Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these

  4. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  5. Opinion: Clarifying Two Controversies about Information Mapping's Method.

    ERIC Educational Resources Information Center

    Horn, Robert E.

    1992-01-01

    Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…

  6. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR 2905.101 (Federal Acquisition Regulations System, Title 48, Vol. 7, 2010-10-01): Methods of disseminating information. Section 2905.101, DEPARTMENT OF LABOR ACQUISITION... dissemination of information concerning procurement actions. The Division of Acquisition Management...

  7. Information theory in living systems, methods, applications, and challenges.

    PubMed

    Gatenby, Robert A; Frieden, B Roy

    2007-02-01

    Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium. This is despite constant buffeting by thermodynamic forces that, if unopposed, will inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information that permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow, and utilization is critical for understanding first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatics and dynamical systems to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher, Shannon, and Kullback-Leibler information measures are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analysis of the fitness contributions from specific segments of genetic information, as well as the optimization process in which fitness is constrained by the substrate cost of its storage and utilization. Recent IT applications have recognized the possible role of nontraditional information storage structures, including lipids and ion gradients, as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease. PMID:17083004
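    Shannon's measure on monomer strings, mentioned above as the earliest biological application of IT, can be illustrated in a few lines of Python (the sequences here are toy examples, not biological data):

    ```python
    import math
    from collections import Counter

    def shannon_entropy(sequence):
        """Shannon entropy (bits per symbol) of a monomer string."""
        counts = Counter(sequence)
        n = len(sequence)
        return sum(-(c / n) * math.log2(c / n) for c in counts.values())

    # A uniform 4-letter sequence carries the maximum 2 bits per symbol;
    # a single-letter sequence carries none.
    print(shannon_entropy("ACGTACGT"))  # 2.0
    print(shannon_entropy("AAAAAAAA"))  # 0.0
    ```

    The same counting approach extends directly to codons or amino acids by treating each as one symbol.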

  8. Improving breast cancer control among Latinas: evaluation of a theory-based educational program.

    PubMed

    Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A

    1998-10-01

    The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of square [SS] = 17.0, F = 6.58, p < .01), have less medically unrecognized knowledge (SS = 128.8, F = 39.24, p < .001), greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384

  9. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.
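    The queuing side of the model builds on standard results. As a minimal sketch (the plain M/M/1 case only, not the paper's interval-fuzzy extension), the steady-state quantities that such a waste-management model would draw on can be computed as:

    ```python
    def mm1_metrics(arrival_rate, service_rate):
        """Steady-state metrics of an M/M/1 queue (requires lambda < mu)."""
        if arrival_rate >= service_rate:
            raise ValueError("unstable queue: arrival rate must be < service rate")
        rho = arrival_rate / service_rate   # server utilization
        lq = rho ** 2 / (1 - rho)           # mean number waiting in queue
        wq = lq / arrival_rate              # mean waiting time, via Little's law
        return {"utilization": rho, "mean_queue_length": lq, "mean_wait": wq}

    # Hypothetical numbers: waste trucks arrive at 4/h, the facility serves 5/h.
    print(mm1_metrics(4, 5))
    ```

    In the paper's setting the arrival and service rates themselves are uncertain (intervals/fuzzy sets), so metrics like the mean wait become intervals rather than single numbers.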

  10. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  11. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  12. Is your phylogeny informative? Measuring the power of comparative methods.

    PubMed

    Boettiger, Carl; Coop, Graham; Ralph, Peter

    2012-07-01

    Phylogenetic comparative methods may fail to produce meaningful results when either the underlying model is inappropriate or the data contain insufficient information to inform the inference. The ability to measure the statistical power of these methods has become crucial to ensure that data quantity keeps pace with growing model complexity. Through simulations, we show that commonly applied model choice methods based on information criteria can have remarkably high error rates; this can be a problem because methods to estimate the uncertainty or power are not widely known or applied. Furthermore, the power of comparative methods can depend significantly on the structure of the data. We describe a Monte Carlo-based method which addresses both of these challenges, and show how this approach both quantifies and substantially reduces errors relative to information criteria. The method also produces meaningful confidence intervals for model parameters. We illustrate how the power to distinguish different models, such as varying levels of selection, varies both with number of taxa and structure of the phylogeny. We provide an open-source implementation in the pmc ("Phylogenetic Monte Carlo") package for the R programming language. We hope such power analysis becomes a routine part of model comparison in comparative methods. PMID:22759299
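    The Monte Carlo power analysis described above can be sketched in miniature. The example below uses a deliberately simple pair of nested models (zero mean vs. free mean, known unit variance) in place of phylogenetic models, since the authors' pmc package is for R; the sample size, effect size, and replicate counts are arbitrary.

    ```python
    import random
    import statistics

    def lr_stat(sample):
        """2 * log-likelihood ratio for H0: mean = 0 vs. free mean (sd known = 1)."""
        n = len(sample)
        xbar = statistics.fmean(sample)
        return n * xbar * xbar

    def monte_carlo_power(n=50, effect=0.5, reps=2000, seed=1):
        rng = random.Random(seed)
        # Null distribution of the statistic via parametric bootstrap.
        null = sorted(lr_stat([rng.gauss(0, 1) for _ in range(n)])
                      for _ in range(reps))
        cutoff = null[int(0.95 * reps)]
        # Power: how often data simulated under the alternative exceed the cutoff.
        hits = sum(lr_stat([rng.gauss(effect, 1) for _ in range(n)]) > cutoff
                   for _ in range(reps))
        return hits / reps

    print(monte_carlo_power())  # power to detect a 0.5-sd shift with n = 50
    ```

    The same two-loop structure (simulate under the simpler model to calibrate the cutoff, then under the richer model to measure power) is what the phylogenetic version applies to trait-evolution models, where power also depends on tree size and shape.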

  13. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  14. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  15. Adaptive windowed range-constrained Otsu method using local information

    NASA Astrophysics Data System (ADS)

    Zheng, Jia; Zhang, Dinghua; Huang, Kuidong; Sun, Yuanxi; Tang, Shaojie

    2016-01-01

    An adaptive windowed range-constrained Otsu method using local information is proposed for improving the performance of image segmentation. First, the reason why traditional thresholding methods do not perform well in the segmentation of complicated images is analyzed, and the influences of global and local thresholding on segmentation are compared. Second, we propose two methods that adaptively change the size of the local window according to local information, and analyze their characteristics; specifically, the number of edge pixels in the local window of the binarized variance image is used to adaptively change the local window size. Finally, the superiority of the proposed method over others, such as the range-constrained Otsu method, the active contour model, the double Otsu method, Bradley's method, and distance-regularized level set evolution, is demonstrated. Experiments validate that the proposed method preserves more detail and achieves a much more satisfactory area overlap measure than the other conventional methods.
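    For reference, the global Otsu criterion that the proposed method constrains and localizes can be implemented in a few lines. This is the classic algorithm (maximize between-class variance over all candidate thresholds), not the authors' adaptive windowed variant:

    ```python
    def otsu_threshold(pixels, levels=256):
        """Classic global Otsu: threshold maximizing between-class variance."""
        hist = [0] * levels
        for p in pixels:
            hist[p] += 1
        total = len(pixels)
        total_sum = sum(i * h for i, h in enumerate(hist))
        best_t, best_var = 0, -1.0
        w0 = sum0 = 0
        for t in range(levels):
            w0 += hist[t]                      # background pixel count
            if w0 == 0 or w0 == total:
                continue
            sum0 += t * hist[t]
            mu0 = sum0 / w0                    # background mean
            mu1 = (total_sum - sum0) / (total - w0)   # foreground mean
            var_between = w0 * (total - w0) * (mu0 - mu1) ** 2
            if var_between > best_var:
                best_var, best_t = var_between, t
        return best_t

    # Two well-separated intensity clusters: the threshold lands between them.
    img = [20] * 100 + [200] * 100
    print(otsu_threshold(img))
    ```

    The adaptive windowed variant applies a criterion like this inside a local window whose size is grown or shrunk per pixel, which is what lets it handle images where a single global threshold fails.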

  16. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and report results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the markets differs across periods, and that the similarity of the two stock markets becomes larger after these two crises. We also compute the similarity of 10 stock indices in three regions; the resulting phylogenetic trees show that the method can distinguish markets from different regions. The results demonstrate that satisfactory information can be extracted from financial markets by this method, and that the information categorization method can be applied not only to physiologic time series but also to financial time series.
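    The paper's exact categorization metric is not reproduced here, but the general idea, symbolizing each series and comparing the resulting word distributions, might be sketched as follows. The word length, up/down symbolization, and L1 distance are illustrative choices, not the authors' definitions:

    ```python
    from collections import Counter

    def symbol_distribution(series, word_len=2):
        """Distribution of up/down 'words' describing a price series."""
        moves = ["u" if b > a else "d" for a, b in zip(series, series[1:])]
        words = ["".join(moves[i:i + word_len])
                 for i in range(len(moves) - word_len + 1)]
        n = len(words)
        return {w: c / n for w, c in Counter(words).items()}

    def series_distance(s1, s2):
        """L1 distance between the two word distributions."""
        p, q = symbol_distribution(s1), symbol_distribution(s2)
        return sum(abs(p.get(w, 0) - q.get(w, 0)) for w in set(p) | set(q))

    trend = [1, 2, 3, 4, 5, 6, 7, 8]
    zigzag = [1, 2, 1, 2, 1, 2, 1, 2]
    print(series_distance(trend, trend))   # 0.0: identical dynamics
    print(series_distance(trend, zigzag))  # 2.0: no shared words
    ```

    A matrix of such pairwise distances over many indices is what feeds a hierarchical clustering or phylogenetic-tree construction like the one reported in the paper.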

  17. Reverse engineering cellular networks with information theoretic methods.

    PubMed

    Villaverde, Alejandro F; Ross, John; Banga, Julio R

    2013-01-01

    Building mathematical models of cellular networks lies at the core of systems biology. It involves, among other tasks, the reconstruction of the structure of interactions between molecular components, which is known as network inference or reverse engineering. Information theory can help in the goal of extracting as much information as possible from the available data. A large number of methods founded on these concepts have been proposed in the literature, not only in biology journals, but in a wide range of areas. Their critical comparison is difficult due to the different focuses and the adoption of different terminologies. Here we attempt to review some of the existing information theoretic methodologies for network inference, and clarify their differences. While some of these methods have achieved notable success, many challenges remain, among which we can mention dealing with incomplete measurements, noisy data, counterintuitive behaviour emerging from nonlinear relations or feedback loops, and computational burden of dealing with large data sets.
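    A minimal instance of the idea, thresholded pairwise mutual information on discretized data, can be sketched as follows; the threshold value and toy "expression" data are arbitrary, and real methods in this family (e.g. relevance networks, ARACNE) add estimators and pruning steps on top:

    ```python
    import math
    from collections import Counter

    def mutual_information(x, y):
        """Mutual information (bits) between two discrete variables,
        estimated from paired samples."""
        n = len(x)
        px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
        return sum((c / n) * math.log2(c * n / (px[a] * py[b]))
                   for (a, b), c in pxy.items())

    def infer_edges(samples, threshold=0.5):
        """Keep edges between variables whose MI exceeds a threshold."""
        names = sorted(samples)
        return [(u, v) for i, u in enumerate(names) for v in names[i + 1:]
                if mutual_information(samples[u], samples[v]) > threshold]

    # Toy data: gene B tracks gene A exactly; gene C is unrelated.
    data = {"A": [0, 1, 0, 1, 1, 0, 1, 0],
            "B": [0, 1, 0, 1, 1, 0, 1, 0],
            "C": [0, 0, 1, 1, 0, 0, 1, 1]}
    print(infer_edges(data))  # [('A', 'B')]
    ```

    Because MI captures any statistical dependency, not just linear correlation, it also flags the nonlinear relations mentioned above, though it says nothing about edge direction on its own.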

  18. Integrating Informative Priors from Experimental Research with Bayesian Methods

    PubMed Central

    Hamra, Ghassan; Richardson, David; MacLehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology. PMID:23222512
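    One simple way to encode such an order constraint (agent X believed more harmful than agent Y) is rejection sampling from otherwise unconstrained priors; the prior means and standard deviations below are hypothetical, not from the radiation example:

    ```python
    import random

    def sample_order_constrained_prior(n=10000, seed=42):
        """Draw (beta_x, beta_y) from normal priors subject to beta_x > beta_y,
        encoding the experimental finding that X is more harmful than Y."""
        rng = random.Random(seed)
        draws = []
        while len(draws) < n:
            bx = rng.gauss(1.0, 0.5)   # hypothetical prior for the X effect
            by = rng.gauss(0.5, 0.5)   # hypothetical prior for the Y effect
            if bx > by:                # discard draws violating the constraint
                draws.append((bx, by))
        return draws

    draws = sample_order_constrained_prior()
    print(min(bx - by for bx, by in draws) > 0)  # True
    ```

    The constrained joint prior can then be used directly in MCMC, or approximated by a parametric form; the key point is that only the ordering, not a precise effect ratio, is imported from the experimental evidence.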

  19. Principals' Informal Methods for Appraising Poor-Performing Teachers

    ERIC Educational Resources Information Center

    Yariv, Eliezer

    2009-01-01

    Teacher appraisal is never an easy task, especially of teachers experiencing difficulties and failures. Nevertheless it is a requirement for good management, in our schools no less than our corporations. Forty elementary school principals in Israel described the informal methods they use to appraise teachers who are performing poorly. Most…

  20. The value of value of information: best informing research design and prioritization using current methods.

    PubMed

    Eckermann, Simon; Karnon, Jon; Willan, Andrew R

    2010-01-01

    Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address. (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions 1-4 relative to their simplicity of use. Expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question 1, given that what EVPI needs to exceed varies with the cost of research design, which can vary from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate, as it can be large and further research not worthwhile or small and further research worthwhile. In contrast, each of questions 1-4 are shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize expected value of sample information (EVSI) minus expected costs across designs. In comparing complexity in use of VOI methods, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation and optimal design across jurisdictions. More complex VOI methods such as bootstrapping of the expected value of
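    The EVPI quantity discussed above is straightforward to estimate by simulation: average the per-draw best net benefit, then subtract the best average net benefit. The two-treatment example and its parameters below are invented for illustration:

    ```python
    import random

    def evpi(net_benefit_samples):
        """EVPI = E[max_d NB(d, theta)] - max_d E[NB(d, theta)].
        Each row holds one Monte Carlo draw's net benefit per decision."""
        n = len(net_benefit_samples)
        n_decisions = len(net_benefit_samples[0])
        value_perfect = sum(max(row) for row in net_benefit_samples) / n
        value_current = max(sum(row[d] for row in net_benefit_samples) / n
                            for d in range(n_decisions))
        return value_perfect - value_current

    rng = random.Random(0)
    # Two treatments with uncertain, overlapping net benefits (arbitrary units).
    samples = [(rng.gauss(100, 30), rng.gauss(105, 30)) for _ in range(20000)]
    print(evpi(samples))
    ```

    The abstract's point is visible in this construction: EVPI is always nonnegative and can be large, yet says nothing about whether any affordable study would actually capture that value; answering that requires EVSI minus expected research cost across candidate designs.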

  1. Evaluation methods for retrieving information from interferograms of biomedical objects

    NASA Astrophysics Data System (ADS)

    Podbielska, Halina; Rottenkolber, Matthias

    1996-04-01

    Interferograms in the form of fringe patterns can be produced in two-beam interferometers, holographic or speckle interferometers, in setups realizing moire techniques, or in deflectometers. Optical metrology based on the principle of interference can be applied as a testing tool in biomedical research. By analyzing the fringe pattern images, information about the shape or mechanical behavior of the object under study can be retrieved. Here, some of the techniques for creating fringe pattern images are presented along with methods of analysis. Intensity-based analysis as well as phase-measurement methods are discussed. Applications of interferometric methods, especially in the fields of experimental orthopedics, endoscopy, and ophthalmology, are pointed out.
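    As one concrete example of phase measurement from fringe patterns, the standard four-step phase-shifting formula recovers the wrapped phase from four frames shifted by pi/2 each. The intensities below are synthetic, not data from the paper:

    ```python
    import math

    def four_step_phase(i1, i2, i3, i4):
        """Wrapped phase from four interferograms shifted by pi/2:
        phi = atan2(I4 - I2, I1 - I3)."""
        return math.atan2(i4 - i2, i1 - i3)

    def intensity(phi, shift, bias=2.0, mod=1.0):
        """Synthetic fringe intensity I = a + b * cos(phi + shift)."""
        return bias + mod * math.cos(phi + shift)

    # Encode a known phase, then recover it from the four shifted frames.
    true_phi = 0.7
    frames = [intensity(true_phi, k * math.pi / 2) for k in range(4)]
    print(four_step_phase(*frames))  # ~0.7
    ```

    Applied pixel by pixel, this yields a wrapped phase map; a separate unwrapping step then converts it to the continuous shape or deformation field of the object.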

  2. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can flexibly accommodate changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. An experiment has shown that the method reduces development effort by more than 30%.

  3. Theory-informed design of values clarification methods: a cognitive psychological perspective on patient health-related decision making.

    PubMed

    Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb

    2013-01-01

    Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids and values clarification methods (VCM) in particular have been developed. VCM intend to help patients to determine the aspects of the choices that are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCM. Process theories of decision making specify components of decision processes, thus, identify particular processes that VCM could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) provided a coherent description of the whole process of decision making; and (c) empirical evidence supports at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCM should: help optimize mental representations; encourage considering all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner. PMID:23219164

  5. Determination of nuclear level densities from experimental information

    SciTech Connect

    Cole, B.J.; Davidson, N.J.; Miller, H.G.

    1994-10-01

    A novel information theory based method for determining the density of states from prior information is presented. The energy dependence of the density of states is determined from the observed number of states per energy interval, and model calculations suggest that the method is sufficiently reliable to calculate the thermal properties of nuclei over a reasonable temperature range.
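    The paper's information-theoretic construction is not reproduced here, but the constant-temperature level-density form it could constrain, rho(E) proportional to exp(E/T), can be fitted to observed counts per energy interval by a simple log-linear regression. The energies, counts, and temperature below are synthetic:

    ```python
    import math

    def fit_constant_temperature(energies, counts):
        """Least-squares fit of log(counts) = a + E/T (the constant-temperature
        level-density form); returns the nuclear temperature T."""
        xs, ys = energies, [math.log(c) for c in counts]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return 1.0 / slope

    # Synthetic counts drawn from rho(E) ~ exp(E / 0.8 MeV).
    energies = [1.0, 2.0, 3.0, 4.0, 5.0]
    counts = [math.exp(e / 0.8) for e in energies]
    print(fit_constant_temperature(energies, counts))  # ~0.8 MeV
    ```

    In practice the observed counts are incomplete above some excitation energy, which is exactly the kind of limited prior information the information-theoretic method is designed to handle.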

  6. Networks of informal caring: a mixed-methods approach.

    PubMed

    Rutherford, Alasdair; Bowes, Alison

    2014-12-01

    Care for older people is a complex phenomenon and an area of pressing policy concern. Bringing together literature on care from social gerontology and economics, we report the findings of a mixed-methods project exploring networks of informal caring. Using quantitative data from the British Household Panel Survey (official survey of British households), together with qualitative interviews with older people and informal carers, we describe differences in informal care networks, and the factors and decision-making processes that have contributed to the formation of those networks. A network approach to care permits both quantitative and qualitative study, and can be used to explore many important questions.

  7. Formative research to develop theory-based messages for a Western Australian child drowning prevention television campaign: study protocol

    PubMed Central

    Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine

    2016-01-01

    Introduction Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621

  8. Application of information theory methods to food web reconstruction

    USGS Publications Warehouse

    Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.

    2007-01-01

    In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, and if this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other, and the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be of some use in other than model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points. This approaches ecological time series lengths from real systems.

  9. Towards the exploitation of formal methods for information fusion

    NASA Astrophysics Data System (ADS)

    Clemens, Joachim; Wille, Robert; Schill, Kerstin

    2016-05-01

    When an autonomous system has to act in or interact with an environment, a suitable representation of it is required. In the past decades, many different representation forms - especially spatial ones - have been proposed, and even more information fusion techniques have been developed to build these representations from multiple information sources. However, most of these algorithms do not exploit the full potential of the available information. This is because they cannot handle the full complexity of all possible solutions compatible with the information and rely on restrictive assumptions (e.g., independence assumptions) to make the computation feasible. In this work, a new methodology is envisioned that utilizes formal methods, in particular solvers for Pseudo-Boolean Optimization, to drop some of these assumptions. To illustrate the ideas, information fusion based on belief functions and occupancy grid maps is considered. It is shown that this approach allows for considering dependencies among multiple cells and thus significantly reduces the uncertainty in the resulting representation.

  10. An inventory-theory-based interval-parameter two-stage stochastic programming model for water resources management

    NASA Astrophysics Data System (ADS)

    Suo, M. Q.; Li, Y. P.; Huang, G. H.

    2011-09-01

    In this study, an inventory-theory-based interval-parameter two-stage stochastic programming (IB-ITSP) model is proposed by integrating inventory theory into an interval-parameter two-stage stochastic optimization framework. This method can not only address system uncertainties with complex presentation but also reflect the transfer batch (the quantity transferred at once) and the transfer period (the corresponding cycle time) in decision-making problems. A case of water allocation in water resources management planning is studied to demonstrate the applicability of this method. Under different flow levels, different transfer measures are generated by this method when the promised water cannot be delivered. Moreover, interval solutions associated with different transfer costs are also provided. They can be used for generating decision alternatives and thus help water resources managers identify desired policies. Compared with the ITSP method, the IB-ITSP model provides a positive measure for solving water shortage problems and affords useful information for decision makers under uncertainty.
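The "transfer batch" and "period" the abstract couples to the optimization are the classical decision variables of inventory theory. As a hedged illustration only (the IB-ITSP model itself is far richer), the textbook economic order quantity gives both in closed form; the demand, setup, and holding figures below are made-up numbers, not values from the study.

```python
import math

def eoq_batch_and_period(demand_rate, setup_cost, holding_cost):
    """Classical economic order quantity: optimal batch Q* and cycle time T*.

    demand_rate  D: demand per unit time
    setup_cost   K: fixed cost incurred per transfer
    holding_cost h: holding (storage) cost per unit per unit time
    """
    q_star = math.sqrt(2.0 * demand_rate * setup_cost / holding_cost)  # batch
    t_star = q_star / demand_rate                                      # period
    return q_star, t_star

# Illustrative (assumed) values: D=100 units/period, K=50, h=4
q, t = eoq_batch_and_period(demand_rate=100.0, setup_cost=50.0, holding_cost=4.0)
print(round(q, 2), round(t, 4))  # 50.0 0.5
```

The square-root trade-off (larger batches amortize setup cost but raise holding cost) is what lets the IB-ITSP framework generate different transfer measures under different flow levels.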

  11. An Organizational Model to Distinguish between and Integrate Research and Evaluation Activities in a Theory Based Evaluation

    ERIC Educational Resources Information Center

    Sample McMeeking, Laura B.; Basile, Carole; Cobb, R. Brian

    2012-01-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its…

  12. A Rapid Usability Evaluation (RUE) Method for Health Information Technology.

    PubMed

    Russ, Alissa L; Baker, Darrell A; Fahner, W Jeffrey; Milligan, Bryce S; Cox, Leeann; Hagg, Heather K; Saleem, Jason J

    2010-11-13

    Usability testing can help generate design ideas to enhance the quality and safety of health information technology. Despite these potential benefits, few healthcare organizations conduct systematic usability testing prior to software implementation. We used a Rapid Usability Evaluation (RUE) method to apply usability testing to software development at a major VA Medical Center. We describe the development of the RUE method, provide two examples of how it was successfully applied, and discuss key insights gained from this work. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE methods are relatively simple, do not require advanced training or usability software, and should be easy to adopt. Other healthcare organizations may be able to implement RUE to improve software effectiveness, efficiency, and safety.

  13. Emotion identification method using RGB information of human face

    NASA Astrophysics Data System (ADS)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single-person households has increased drastically due to the growth of the aging society and the diversification of lifestyles; accordingly, building spaces must evolve. The Biofied Building we propose can help address this situation: it supports interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior. One important piece of information is thermal comfort, which we assume can be estimated from the human face. There is much research on face color analysis, but little of it has been conducted in real situations; in other words, the existing methods were not tested under disturbances such as room lamps. In this study, a Kinect was used with face tracking, and room lamps and task lamps were used to verify that our method is applicable to real situations. Two rooms at 22 and 28 degrees C were prepared. We showed that the transition in thermal comfort caused by changing temperature can be observed from the human face; thus, distinguishing the 22 and 28 degrees C conditions from face color was proved to be possible.

  14. A method to stabilize linear systems using eigenvalue gradient information

    NASA Technical Reports Server (NTRS)

    Wieseman, C. D.

    1985-01-01

    Formal optimization methods and eigenvalue gradient information are used to develop a stabilizing control law for a closed-loop linear system that is initially unstable. The method was originally formulated using direct, constrained optimization with the real parts of the eigenvalues as constraints. However, because of difficulties in achieving stabilizing control laws, the problem was reformulated. The method described uses the Davidon-Fletcher-Powell minimization technique to solve an indirect, constrained minimization problem in which the performance index is the Kreisselmeier-Steinhauser function of the real parts of all the eigenvalues. The method is applied successfully to two different problems: the determination of a fourth-order control law that stabilizes a single-input single-output active flutter suppression system, and the determination of a second-order control law for a multi-input multi-output lateral-directional flight control system. Various sets of design variables and initial starting points were chosen to show the robustness of the method.
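The Kreisselmeier-Steinhauser function named in the abstract is a smooth, conservative envelope of a set of constraint values, KS(g) = g_max + (1/rho) ln(sum exp(rho (g_i - g_max))), so minimizing it drives all eigenvalue real parts negative at once. A minimal sketch follows; the example system matrix and rho value are assumptions for illustration, not from the paper.

```python
import numpy as np

def ks_function(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth upper bound on max(values).

    The max is factored out for numerical stability before exponentiating."""
    g_max = np.max(values)
    return g_max + np.log(np.sum(np.exp(rho * (values - g_max)))) / rho

# Example (assumed) closed-loop matrix; KS aggregates the eigenvalue real parts
# into one differentiable performance index for a gradient-based optimizer.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
re_parts = np.linalg.eigvals(A).real
print(ks_function(re_parts) >= re_parts.max())  # True: KS upper-bounds the max
```

The bound is tight to within ln(n)/rho for n constraints, so a larger rho tracks the true maximum real part more closely at the cost of a stiffer optimization landscape.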

  15. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  16. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  17. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  18. Information bias in health research: definition, pitfalls, and adjustment methods

    PubMed Central

    Althubaiti, Alaa

    2016-01-01

    As with other fields, medical sciences are subject to different sources of bias. While understanding sources of bias is a key element for drawing valid conclusions, bias in health research continues to be a very sensitive issue that can affect the focus and outcome of investigations. Information bias, otherwise known as misclassification, is one of the most common sources of bias that affects the validity of health research. It originates from the approach that is utilized to obtain or confirm study measurements. This paper seeks to raise awareness of information bias in observational and experimental research study designs as well as to enrich discussions concerning bias problems. Specifying the types of bias can be essential to limiting its effects, and the use of adjustment methods may serve to improve clinical evaluation and health care practice. PMID:27217764

  19. A Task-Oriented Disaster Information Correlation Method

    NASA Astrophysics Data System (ADS)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historical data, case data, simulated data, and disaster products. However, current data management and service systems have become increasingly inefficient due to task variety and heterogeneous data. For emergency task-oriented applications, data searches rely primarily on human experience over simple metadata indices; their high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service, with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.

  20. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.

  1. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  2. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  3. Urban drainage control applying rational method and geographic information technologies

    NASA Astrophysics Data System (ADS)

    Aldalur, Beatriz; Campo, Alicia; Fernández, Sandra

    2013-09-01

    The objective of this study is to develop a method of controlling urban drainages in the town of Ingeniero White motivated by the problems arising as a result of floods, water logging and the combination of southeasterly and high tides. A Rational Method was applied to control urban watersheds and used tools of Geographic Information Technology (GIT). A Geographic Information System was developed on the basis of 28 panchromatic aerial photographs of 2005. They were georeferenced with control points measured with Global Positioning Systems (basin: 6 km2). Flow rates of basins and sub-basins were calculated and it was verified that the existing open channels have a low slope with the presence of permanent water and generate stagnation of water favored by the presence of trash. It is proposed for the output of storm drains, the use of an existing channel to evacuate the flow. The solution proposed in this work is complemented by the placement of three pumping stations: one on a channel to drain rain water which will allow the drain of the excess water from the lower area where is located the Ingeniero White city and the two others that will drain the excess liquid from the port area.
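The rational method applied in the abstract is the classical peak-runoff formula Q = C·i·A (runoff coefficient times rainfall intensity times catchment area). The sketch below only converts units and applies the formula; the coefficient and intensity values are assumptions for illustration, not figures from the Ingeniero White study (only the roughly 6 km2 basin size comes from the abstract).

```python
def rational_method_peak_flow(c, i_mm_per_hr, area_km2):
    """Rational method peak discharge Q = C*i*A, returned in m^3/s.

    c            runoff coefficient (dimensionless, 0-1)
    i_mm_per_hr  design rainfall intensity
    area_km2     catchment area
    """
    i_m_per_s = i_mm_per_hr / 1000.0 / 3600.0   # mm/h -> m/s
    area_m2 = area_km2 * 1.0e6                  # km^2 -> m^2
    return c * i_m_per_s * area_m2

# Illustrative (assumed) inputs: C=0.7, i=30 mm/h, A=6 km^2
print(round(rational_method_peak_flow(0.7, 30.0, 6.0), 1))  # 35.0 m^3/s
```

Flow rates like this, computed per sub-basin in the GIS, are what get compared against the capacity of the existing low-slope channels and the proposed pumping stations.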

  4. A danger-theory-based immune network optimization algorithm.

    PubMed

    Zhang, Ruirui; Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms exhibit a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated by changes in the environment guide different levels of immune response, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through their own danger signals and then triggers immune responses of self-regulation, so population diversity can be maintained. Experimental results show that the algorithm has advantages in solution quality and population diversity. Compared with the influential optimization algorithms CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions that meet the required accuracies within the specified number of function evaluations.

  5. A diffusive information preservation method for small Knudsen number flows

    NASA Astrophysics Data System (ADS)

    Fei, Fei; Fan, Jing

    2013-06-01

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint, and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3-10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they make it possible to use a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  6. A diffusive information preservation method for small Knudsen number flows

    SciTech Connect

    Fei, Fei; Fan, Jing

    2013-06-15

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker–Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint, and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ∼ 10^−3–10^−4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they make it possible to use a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  7. Acoustic emission source location and damage detection in a metallic structure using a graph-theory-based geodesic approach

    NASA Astrophysics Data System (ADS)

    Gangadharan, R.; Prasanna, G.; Bhat, M. R.; Murthy, C. R. L.; Gopalakrishnan, S.

    2009-11-01

    A geodesic-based approach using Lamb waves is proposed to locate the acoustic emission (AE) source and damage in an isotropic metallic structure. In the case of the AE (passive) technique, the elastic waves take the shortest path from the source to the sensor array distributed in the structure. The geodesics are computed on the meshed surface of the structure using graph theory based on Dijkstra's algorithm. By propagating the waves in reverse virtually from these sensors along the geodesic path and by locating the first intersection point of these waves, one can get the AE source location. The same approach is extended for detection of damage in a structure. The wave response matrix of the given sensor configuration for the healthy and the damaged structure is obtained experimentally. The healthy and damage response matrix is compared and their difference gives the information about the reflection of waves from the damage. These waves are backpropagated from the sensors and the above method is used to locate the damage by finding the point where intersection of geodesics occurs. In this work, the geodesic approach is shown to be suitable to obtain a practicable source location solution in a more general set-up on any arbitrary surface containing finite discontinuities. Experiments were conducted on aluminum specimens of simple and complex geometry to validate this new method.
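The geodesic computation the abstract attributes to Dijkstra's algorithm amounts to shortest paths on the meshed surface, with edge weights equal to edge lengths. A minimal sketch follows; the tiny four-node "mesh" and node names are invented for illustration (a real plate mesh has thousands of nodes).

```python
import heapq

def dijkstra_geodesic(adj, source):
    """Geodesic distance from `source` to every node of a meshed surface,
    given as an adjacency dict {node: [(neighbor, edge_length), ...]}."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Assumed toy mesh: AE source at node 'A', sensor at node 'S'.
mesh = {
    "A": [("B", 1.0), ("C", 2.5)],
    "B": [("A", 1.0), ("C", 1.0), ("S", 3.0)],
    "C": [("A", 2.5), ("B", 1.0), ("S", 1.0)],
    "S": [("B", 3.0), ("C", 1.0)],
}
print(dijkstra_geodesic(mesh, "A")["S"])  # 3.0, via A-B-C-S
```

Running this from each sensor and intersecting the resulting distance fields (scaled by wave speed to arrival-time differences) is the backpropagation step the abstract describes for localizing the source.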

  8. Dissemination of a theory-based online bone health program: Two intervention approaches.

    PubMed

    Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa

    2015-06-01

    With the increasing nationwide emphasis on eHealth, there has been rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding methodologies for developing and implementing effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during those processes. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer-term (12-month) Bone Power Plus program, comprising the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). The study used a small-group approach (32 intervention groups), and to manage those groups effectively, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully, with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study showed promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668

  9. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  10. Dissolved oxygen prediction using a possibility theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, Usman T.; Valeo, Caterina

    2016-06-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model performance is compared with a fuzzy neural network with crisp inputs, as well as with a traditional neural network. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.
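The abstract's "two-step method to construct fuzzy numbers using observations" is not spelled out here, so the following is only an illustrative stand-in: a triangular fuzzy number built from the sample range and median, with a standard triangular membership function. The DO readings are invented values, not Bow River data.

```python
import statistics

def triangular_fuzzy_number(observations):
    """Build a triangular fuzzy number (a, m, b): support from the sample
    min/max, peak at the median. Illustrative only; the paper's own
    construction is more elaborate."""
    a, b = min(observations), max(observations)
    m = statistics.median(observations)
    return (a, m, b)

def membership(x, tfn):
    """Membership degree of x in the triangular fuzzy number (a, m, b)."""
    a, m, b = tfn
    if x <= a or x >= b:
        return 1.0 if x == m else 0.0
    return (x - a) / (m - a) if x <= m else (b - x) / (b - m)

do_obs = [6.1, 6.8, 7.2, 7.5, 8.3]        # assumed DO readings (mg/L)
tfn = triangular_fuzzy_number(do_obs)
print(tfn, membership(7.2, tfn))           # (6.1, 7.2, 8.3) 1.0
```

Feeding such fuzzy inputs through the network yields a fuzzy output whose membership below a regulatory DO threshold can be read directly as a possibility-style risk of a low-DO event.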

  11. Dissolved oxygen prediction using a possibility-theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, U. T.; Valeo, C.

    2015-11-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility-theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.

  12. Methods of Information Geometry to model complex shapes

    NASA Astrophysics Data System (ADS)

    De Sanctis, A.; Gattone, S. A.

    2016-09-01

    In this paper, a new statistical method to model patterns emerging in complex systems is proposed. A framework for shape analysis of 2-dimensional landmark data is introduced, in which each landmark is represented by a bivariate Gaussian distribution. From Information Geometry we know that the Fisher-Rao metric endows the statistical manifold of parameters of a family of probability distributions with a Riemannian metric. This approach thus allows one to reconstruct the intermediate steps in the evolution between observed shapes by computing the geodesic, with respect to the Fisher-Rao metric, between the corresponding distributions. Furthermore, the geodesic path can be used for shape prediction. As an application, we study the evolution of the rat skull shape. A future application in Ophthalmology is introduced.
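The paper works with bivariate Gaussians, but the flavor of Fisher-Rao geodesic distance is easiest to see in the univariate case, where the metric ds^2 = (dmu^2 + 2 dsigma^2)/sigma^2 maps (after rescaling mu by sqrt(2)) onto the hyperbolic half-plane and a closed form exists. This sketch is that univariate illustration only, not the paper's bivariate machinery.

```python
import math

def fisher_rao_distance(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    In (mu/sqrt(2), sigma) coordinates the Fisher metric is a scaled
    hyperbolic half-plane metric, giving this arccosh closed form."""
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * sigma1 * sigma2))

print(fisher_rao_distance(0.0, 1.0, 0.0, 1.0))   # 0.0: identical distributions
d12 = fisher_rao_distance(0.0, 1.0, 1.0, 2.0)
d21 = fisher_rao_distance(1.0, 2.0, 0.0, 1.0)
print(abs(d12 - d21) < 1e-12)                    # True: the distance is symmetric
```

Note how the denominator 2*sigma1*sigma2 makes the same shift in mu count for more between narrow distributions than between wide ones, which is exactly the behavior that makes Fisher-Rao geodesics preferable to straight-line parameter interpolation for shape evolution.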

  13. System and Method for RFID-Enabled Information Collection

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Kennedy, Timothy F. (Inventor); Ngo, Phong H. (Inventor); Byerly, Diane (Inventor)

    2016-01-01

    Methods, apparatuses and systems for radio frequency identification (RFID)-enabled information collection are disclosed, including an enclosure, a collector coupled to the enclosure, an interrogator, a processor, and one or more RFID field sensors, each having an individual identification, disposed within the enclosure. In operation, the interrogator transmits an incident signal to the collector, causing the collector to generate an electromagnetic field within the enclosure. The electromagnetic field is affected by one or more influences. RFID sensors respond to the electromagnetic field by transmitting reflected signals containing the individual identifications of the responding RFID sensors to the interrogator. The interrogator receives the reflected signals, measures one or more returned signal strength indications ("RSSI") of the reflected signals and sends the RSSI measurements and identification of the responding RFID sensors to the processor to determine one or more facts about the influences. Other embodiments are also described.
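Downstream of the hardware the patent describes, the processor's job reduces to comparing the returned signal strength indications (RSSI) of each identified sensor against a baseline to infer which influences perturbed the field. A hedged sketch of that comparison step follows; the sensor IDs, dBm values, and threshold are all invented for illustration, not from the patent.

```python
def detect_influences(baseline_rssi, current_rssi, threshold_db=3.0):
    """Flag RFID sensors whose RSSI shifted by more than `threshold_db`
    relative to a reference baseline, suggesting some influence (e.g., a
    changed fluid level) perturbed the electromagnetic field near them."""
    flagged = {}
    for sensor_id, rssi in current_rssi.items():
        delta = rssi - baseline_rssi.get(sensor_id, rssi)
        if abs(delta) > threshold_db:
            flagged[sensor_id] = round(delta, 1)
    return flagged

# Assumed example readings in dBm for three sensors in the enclosure.
baseline = {"tag-01": -48.0, "tag-02": -51.0, "tag-03": -55.0}
current  = {"tag-01": -48.5, "tag-02": -58.2, "tag-03": -54.6}
print(detect_influences(baseline, current))  # {'tag-02': -7.2}
```

Because each sensor carries its own identification, the flagged set localizes the influence within the enclosure rather than merely detecting that something changed.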

  14. Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

    PubMed Central

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.

    2014-01-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159

  15. A communication-theory based view on telemedical communication.

    PubMed

    Schall, Thomas; Roeckelein, Wolfgang; Mohr, Markus; Kampshoff, Joerg; Lange, Tim; Nerlich, Michael

    2003-01-01

    Communication theory based analysis sheds new light on the use of health telematics. This analysis of structures in electronic medical communication shows communicative structures with special features. Current and evolving telemedical applications are analyzed. The methodology of communicational theory (focusing on linguistic pragmatics) is used to compare it with its conventional counterpart. The semiotic model, the roles of partners, the respective message and their relation are discussed. Channels, sender, addressee, and other structural roles are analyzed for different types of electronic medical communication. The communicative processes are shown as mutual, rational action towards a common goal. The types of communication/texts are analyzed in general. Furthermore the basic communicative structures of medical education via internet are presented with their special features. The analysis shows that electronic medical communication has special features compared to everyday communication: A third participant role often is involved: the patient. Messages often are addressed to an unspecified partner or to an unspecified partner within a group. Addressing in this case is (at least partially) role-based. Communication and message often directly (rather than indirectly) influence actions of the participants. Communication often is heavily regulated including legal implications like liability, and more. The conclusion from the analysis is that the development of telemedical applications so far did not sufficiently take communicative structures into consideration. Based on these results recommendations for future developments of telemedical applications/services are given.

  16. A RE-AIM evaluation of theory-based physical activity interventions.

    PubMed

    Antikainen, Iina; Ellis, Rebecca

    2011-04-01

    Although physical activity interventions have been shown to effectively modify behavior, little research has examined the potential of these interventions for adoption in real-world settings. The purpose of this literature review was to evaluate the external validity of 57 theory-based physical activity interventions using the RE-AIM framework. The physical activity interventions included were more likely to report on issues of internal, rather than external validity and on individual, rather than organizational components of the RE-AIM framework, making the translation of many interventions into practice difficult. Furthermore, most studies included motivated, healthy participants, thus reducing the generalizability of the interventions to real-world settings that provide services to more diverse populations. To determine if a given intervention is feasible and effective in translational research, more information should be reported about the factors that affect external validity.

  17. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, the paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard that consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on postal codes is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code can enhance the extensibility and flexibility of address geocoding.
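The three-part code structure described in the abstract can be sketched as follows. Field widths, separators, and the example values are assumptions made for illustration; the paper's actual encoding standard is not reproduced here:

```python
def build_address_code(postal_code, direction, distance_m, extension=""):
    """Compose a rural address code from the three parts described in the
    paper: an absolute position code (postal code), a relative position code
    (direction and distance from a reference point), and an optional
    extended code. Field layout is illustrative only.
    """
    relative = f"{direction}{distance_m:04d}"  # e.g. NE0350 = 350 m northeast
    return f"{postal_code}-{relative}" + (f"-{extension}" if extension else "")

code = build_address_code("102200", "NE", 350, "A1")
print(code)  # 102200-NE0350-A1
```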

  19. Agent-based method for distributed clustering of textual information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Reed, Joel W [Knoxville, TN; Elmore, Mark T [Oak Ridge, TN; Treadwell, Jim N [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term and identifying the documents found in the search, and displays the documents in a clustering display (80) of similarity so as to indicate similarity of the documents to each other.
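The routing decision made by the cluster agents can be sketched with cosine similarity over document vectors. The similarity measure, threshold, and vectors below are assumptions for illustration; the patent does not commit to a specific metric:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length document vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def route_document(doc_vec, cluster_centroids, threshold=0.5):
    """Return the index of the most similar cluster, or None to signal that
    a new cluster agent should be created (threshold is hypothetical)."""
    best_idx, best_sim = None, threshold
    for idx, centroid in enumerate(cluster_centroids):
        sim = cosine(doc_vec, centroid)
        if sim > best_sim:
            best_idx, best_sim = idx, sim
    return best_idx

clusters = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
print(route_document([0.9, 0.1, 0.0], clusters))  # 0 (close to first cluster)
print(route_document([0.0, 0.0, 1.0], clusters))  # None (spawn a new cluster)
```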

  20. Informative Parameters of Dynamic Geo-electricity Methods

    NASA Astrophysics Data System (ADS)

    Tursunmetov, R.

    With the growing complexity of geological tasks and the need to reveal anomalous zones connected with ore, oil, gas and water availability, methods of dynamic geo-electricity have come into use. In these methods the geological environment is considered as an interphase, irregular one. The main dynamic element of this environment is the double electric layer, which develops on the boundary between the solid and liquid phases. In ore- or water-saturated environments, double electric layers become electrochemically or electrokinetically active elements of the geo-electric environment, which, in turn, form a natural electric field. This field influences the distribution of artificially created fields, and their interaction has a complicated superposition or non-linear character. Therefore, the geological environment is considered an active one, able to accumulate and transform artificially superposed fields. Its main dynamic property is the non-linear behavior of specific electric resistance and soil polarization depending on current density and measurement frequency, which serve as informative parameters for dynamic geo-electricity methods. The study of the electric properties of disperse soils in an impulse-frequency regime, together with the temporal and frequency characteristics of the electric field, is of main interest for defining geo-electric anomalies. The study of the volt-ampere characteristics of the electromagnetic field is of great practical significance; these characteristics are determined by electrochemically active ore- and water-saturated fields. The mentioned parameters depend on the polarity of the initiated field, in particular on the character, composition and mineralization of the ore-saturated zone and on the availability of a natural electric field under cathode and anode mineralization. The non-linear behavior of the environment's dynamic properties affects the structure of the initiated field, which allows anomalous zone locations to be defined. Finally, the study of the dynamic properties of soil anisotropy in space will allow filtration flows to be identified.

  1. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1986-12-02

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  2. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1989-01-24

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  3. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1989-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  4. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  5. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  6. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads the electronic information, analyzes the electronic information according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry of the hash table contains a concept corresponding to a word or phrase which the system has previously encountered. The system creates an object model based on the user-defined logical associations, which is used to review each concept contained in the electronic information in order to determine whether the electronic information is classified. The data analysis and classification system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system cannot find the electronic information token in the hash table, that token is added to a missing terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.
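A toy sketch of the hash-table lookup and rule check described above. The concepts and rules are invented for illustration, and the patent's object-model propagation is reduced here to a simple set-membership test:

```python
def classify(text, concept_table, rules):
    """Tokenize text, look each token up in the concept table, and check
    whether any rule (a set of required concepts) is satisfied. Tokens not
    found in the table go onto a missing-terms list, as in the patent.
    """
    found, missing = set(), []
    for token in text.lower().split():
        concept = concept_table.get(token)
        if concept is None:
            missing.append(token)
        else:
            found.add(concept)
    classified = any(rule <= found for rule in rules)  # rule is a subset?
    return classified, missing

# Hypothetical concept table and single rule.
concepts = {"reactor": "NUCLEAR", "coolant": "NUCLEAR", "schedule": "ADMIN"}
rules = [{"NUCLEAR"}]
print(classify("reactor coolant schedule zzz", concepts, rules))  # (True, ['zzz'])
```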

  7. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  8. Static analysis of rectangular nanoplates using trigonometric shear deformation theory based on nonlocal elasticity theory.

    PubMed

    Nami, Mohammad Rahim; Janghorban, Maziar

    2013-12-30

    In this article, a new higher order shear deformation theory based on trigonometric shear deformation theory is developed. In order to consider the size effects, the nonlocal elasticity theory is used. An analytical method is adopted to solve the governing equations for static analysis of simply supported nanoplates. In the present theory, the transverse shear stresses satisfy the traction free boundary conditions of the rectangular plates and these stresses can be calculated from the constitutive equations. The effects of different parameters such as nonlocal parameter and aspect ratio are investigated on both nondimensional deflections and deflection ratios. It may be important to mention that the present formulations are general and can be used for isotropic, orthotropic and anisotropic nanoplates.

  9. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

    PubMed

    Stejskal, Taryn M

    2012-01-01

    Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family member caregivers' well-being and survivors' outcomes. The idea of a continuum of care or following survivors from inpatient care to community reintegration has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, present research literature indicates that family intervention may be a key component to successful rehabilitation after brain injury. Yet, clinicians interacting with family members and survivors often feel confounded about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies will be uncovered. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT) will be examined. Descriptions of common challenges families and couples face will be presented along with case examples to illustrate how these theoretical frameworks might be applied to these special concerns postinjury. Finally, the article concludes with an overview of the ideas presented in this manuscript to assist practitioners and systems of care in community-based settings to more effectively intervene with the family system as a whole.

  10. 48 CFR 5.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information. (a) As required by the Small Business Act (15 U.S.C. 637(e)) and the Office of Federal... journals, magazines, or other mass communication media for publication without cost to the Government....

  11. Linear and nonlinear information flow based on time delayed mutual information method and its application to corticomuscular interaction

    PubMed Central

    Jin, Seung-Hyun; Lin, Peter; Hallett, Mark

    2010-01-01

    Objective To propose a model-free method to show linear and nonlinear information flow based on time delayed mutual information (TDMI) by employing uni- and bi-variate surrogate tests and to investigate whether there are contributions of the nonlinear information flow in corticomuscular (CM) interaction. Methods Using simulated data, we tested whether our method would successfully detect the direction of information flow and identify a relationship between two simulated time series. As an experimental data application, we applied this method to investigate CM interaction during a right wrist extension task. Results Results of simulation tests show that we can correctly detect the direction of information flow and the relationship between two time series without prior knowledge of the dynamics of their generating systems. As experimental results, we found both linear and nonlinear information flow from contralateral sensorimotor cortex to muscle. Conclusions Our method is a viable model-free measure of temporally varying causal interactions that is capable of distinguishing linear and nonlinear information flow. With respect to experimental application, there are both linear and nonlinear information flows in CM interaction from contralateral sensorimotor cortex to muscle, which may reflect the motor command from brain to muscle. Significance This is the first study to show separate linear and nonlinear information flow in CM interaction. PMID:20044309
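The core TDMI quantity can be sketched for discrete (already binned) series as follows; the surrogate-based significance tests used in the paper are omitted, and the example series are synthetic:

```python
from collections import Counter
from math import log2

def tdmi(x, y, delay):
    """Time-delayed mutual information I(x(t); y(t+delay)) for two discrete
    sequences. Comparing TDMI across delays suggests a direction of
    information flow; a minimal sketch without significance testing.
    """
    pairs = list(zip(x[:len(x) - delay] if delay else x,
                     y[delay:] if delay else y))
    n = len(pairs)
    pxy = Counter(pairs)                     # joint distribution
    px = Counter(a for a, _ in pairs)        # marginal of x
    py = Counter(b for _, b in pairs)        # marginal of y
    return sum(c / n * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

# y is x shifted forward one step, so TDMI peaks at delay 1.
x = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
y = [0] + x[:-1]
print(tdmi(x, y, 1) > tdmi(x, y, 0))  # True
```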

  12. Item Characteristic Curve Estimation of Signal Detection Theory-Based Personality Data: A Two-Stage Approach to Item Response Modeling.

    ERIC Educational Resources Information Center

    Williams, Kevin M.; Zumbo, Bruno D.

    2003-01-01

    Developed an item characteristic curve estimation of signal detection theory based personality data. Results for 266 college students taking the Overclaiming Questionnaire (D. Paulhus and N. Bruce, 1990) suggest that this method is a reasonable approach to describing item functioning and that there are advantages to this method over traditional…

  13. Informativeness Improvement of Hardness Test Methods for Metal Product Assessment

    NASA Astrophysics Data System (ADS)

    Osipov, S.; Podshivalov, I.; Osipov, O.; Zhantybaev, A.

    2016-06-01

    The paper presents a combination of theoretical suggestions, results, and observations that improve the informativeness of the hardness testing process in solving problems of metal product assessment while in operation. The hardness value of a metal surface obtained by a single measurement is considered to be random. Various measures of location and scattering of this random variable were experimentally estimated for a number of test samples using correlation analysis, and their close interaction was studied. It was found that in metal assessment, the main informative characteristics of the hardness testing process are its average value and mean-square deviation, as measures of location and scattering, respectively.
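The two informative characteristics named above reduce to a few lines of code. The hardness readings below are hypothetical values invented for illustration:

```python
from statistics import mean, stdev

def hardness_summary(readings):
    """Reduce repeated hardness measurements on one specimen to the two
    informative characteristics named in the paper: the average value
    (location) and the mean-square deviation (scattering).
    """
    return mean(readings), stdev(readings)

hb = [201.0, 199.0, 203.0, 197.0, 200.0]  # hypothetical hardness readings
avg, spread = hardness_summary(hb)
print(round(avg, 1), round(spread, 2))  # 200.0 2.24
```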

  14. Information Theory: A Method for Human Communication Research.

    ERIC Educational Resources Information Center

    Black, John W.

    This paper describes seven experiments related to human communication research. The first two experiments discuss studies treating the aural responses of listeners. The third experiment was undertaken to estimate the information of sounds and diagrams which might lead to an estimate of the redundancy ascribed to the phonetic structure of words. A…

  15. Paper Trail: One Method of Information Literacy Assessment

    ERIC Educational Resources Information Center

    Nutefall, Jennifer

    2004-01-01

    Assessing students' information literacy skills can be difficult depending on the involvement of the librarian in a course. To overcome this, librarians created an assignment called the Paper Trail, where students wrote a short essay about their research process and reflected on what they would do differently. Through reviewing and grading these…

  16. Statistical methods of combining information: Applications to sensor data fusion

    SciTech Connect

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.

  17. Personality and Psychopathology: a Theory-Based Revision of Eysenck's PEN Model.

    PubMed

    van Kampen, Dirk

    2009-12-08

    The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck's original PEN model by repairing the various shortcomings that can be noted in Eysenck's personality theory, particularly in relation to P or Psychoticism. Addressing three approaches that have been followed to answer the question 'which personality factors are basic?', arguments are listed to show that particularly the theory-informed approach, originally defended by Eysenck, may lead to scientific progress. However, also noting the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck's theory-informed methodology, but criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), which, together with the dimensions E (Extraversion) and N (Neuroticism) retained from Eysenck's PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions.

  18. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  19. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  20. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  1. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  2. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  3. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  5. Methods of information theory and algorithmic complexity for network biology.

    PubMed

    Zenil, Hector; Kiani, Narsis A; Tegnér, Jesper

    2016-03-01

    We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdös-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
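One of the simplest descriptors the survey touches on, the Shannon entropy of a graph's degree distribution, can be sketched as follows. The graph size, edge probability, and Erdős-Rényi sampler are arbitrary choices for illustration:

```python
import random
from collections import Counter
from math import log2

def degree_entropy(edges, n):
    """Shannon entropy of the degree distribution of an undirected graph on
    n nodes: a basic information-theoretic graph descriptor. Regular graphs
    (all degrees equal) score 0; heterogeneous graphs score higher.
    """
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    counts = Counter(deg.get(i, 0) for i in range(n))
    return -sum(c / n * log2(c / n) for c in counts.values())

# Erdos-Renyi G(n, p) sample: keep each possible edge with probability p.
random.seed(0)
n, p = 30, 0.2
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < p]
print(round(degree_entropy(edges, n), 3))
```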

  6. An efficient steganography method for hiding patient confidential information.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed; Nguyen, Hung

    2014-01-01

    This paper deals with the important issue of security and confidentiality of patient information when exchanging or storing medical images. Steganography has recently been viewed as an alternative or complement to cryptography, as existing cryptographic systems are not perfect due to their vulnerability to certain types of attack. We propose in this paper a new steganography algorithm for hiding patient confidential information. It utilizes Pixel Value Differencing (PVD) to identify contrast regions in the image and a Hamming code that embeds 3 secret message bits into 4 bits of the cover image. In order to preserve the content of the region of interest (ROI), the embedding is only performed using the Region of Non-Interest (RONI). PMID:25569937
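The paper's exact 3-bits-into-4 embedding is not reproduced here; the sketch below shows the same syndrome-coding idea using the classic Hamming(7,4) parity-check matrix, which hides 3 message bits in 7 cover bits while flipping at most one of them:

```python
# Parity-check matrix of Hamming(7,4): column j is the binary form of j+1.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(bits):
    """3-bit syndrome of a 7-bit vector under H (mod-2 arithmetic)."""
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def embed(cover, message):
    """Flip at most one of the 7 cover bits so their syndrome equals the
    3-bit message (matrix embedding)."""
    s = [a ^ b for a, b in zip(syndrome(cover), message)]
    stego = list(cover)
    if any(s):
        pos = int("".join(map(str, s)), 2) - 1  # index of the column to flip
        stego[pos] ^= 1
    return stego

def extract(stego):
    """The receiver recovers the message as the stego bits' syndrome."""
    return syndrome(stego)

cover = [1, 0, 1, 1, 0, 0, 1]   # e.g. LSBs of 7 cover pixels
msg = [0, 1, 1]
stego = embed(cover, msg)
assert extract(stego) == msg
print(sum(c != s for c, s in zip(cover, stego)))  # 1 (one bit flipped)
```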

  7. Health information system access control redesign - rationale and method.

    PubMed

    Moselle, Kenneth A

    2011-01-01

    This paper addresses the question of why a health service system might find it necessary to re-engineer the access control model that mediates the interaction of clinicians with health information systems. Factors that lead to increasing complexity of access control models are delineated, and consequences of that complexity are identified. Strategies are presented to address these factors, and a stepwise procedure is suggested to structure the access control model re-engineering process.

  8. Imaging systems and methods for obtaining and using biometric information

    DOEpatents

    McMakin, Douglas L [Richland, WA; Kennedy, Mike O [Richland, WA

    2010-11-30

    Disclosed herein are exemplary embodiments of imaging systems and methods of using such systems. In one exemplary embodiment, one or more direct images of the body of a clothed subject are received, and a motion signature is determined from the one or more images. In this embodiment, the one or more images show movement of the body of the subject over time, and the motion signature is associated with the movement of the subject's body. In certain implementations, the subject can be identified based at least in part on the motion signature. Imaging systems for performing any of the disclosed methods are also disclosed herein. Furthermore, the disclosed imaging, rendering, and analysis methods can be implemented, at least in part, as one or more computer-readable media comprising computer-executable instructions for causing a computer to perform the respective methods.

  9. Using Mixed Methods in Health Information Technology Evaluation.

    PubMed

    Sockolow, Paulina; Dowding, Dawn; Randell, Rebecca; Favela, Jesus

    2016-01-01

    With the increasing adoption of interactive systems in healthcare, there is a need to ensure that the benefits of such systems are formally evaluated. Traditionally, quantitative research approaches have been used to gather evidence on measurable outcomes of health technology. Qualitative approaches have also been used to analyze how or why particular interventions did or did not work in specific healthcare contexts. Mixed methods research provides a framework for carrying out both quantitative and qualitative approaches within a single research study. In this paper an international group of four informatics scholars illustrate some of the benefits and challenges of using mixed methods in evaluation. The diversity of the research experience provides a broad overview of approaches in combining robust analysis of outcome data with qualitative methods that provide an understanding of the processes through which, and the contexts in which, those outcomes are achieved. This paper discusses the benefits that mixed methods brought to each study. PMID:27332167

  10. Information storage medium and method of recording and retrieving information thereon

    DOEpatents

    Marchant, D. D.; Begej, Stefan

    1986-01-01

    Information storage medium comprising a semiconductor doped with first and second impurities or dopants. Preferably, one of the impurities is introduced by ion implantation. Conductive electrodes are photolithographically formed on the surface of the medium. Information is recorded on the medium by selectively applying a focused laser beam to discrete regions of the medium surface so as to anneal discrete regions of the medium containing lattice defects introduced by the ion-implanted impurity. Information is retrieved from the storage medium by applying a focused laser beam to annealed and non-annealed regions so as to produce a photovoltaic signal at each region.

  11. A High Accuracy Method for Semi-supervised Information Extraction

    SciTech Connect

    Tratz, Stephen C.; Sanfilippo, Antonio P.

    2007-04-22

    Customization to specific domains of discourse and/or user requirements is one of the greatest challenges for today’s Information Extraction (IE) systems. While demonstrably effective, both rule-based and supervised machine learning approaches to IE customization pose too high a burden on the user. Semi-supervised learning approaches may in principle offer a more resource effective solution but are still insufficiently accurate to grant realistic application. We demonstrate that this limitation can be overcome by integrating fully-supervised learning techniques within a semi-supervised IE approach, without increasing resource requirements.
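
    Semi-supervised IE systems of this kind typically bootstrap from a small labeled seed. A minimal, generic self-training loop is sketched below (an illustration of the paradigm, not the authors' system; the 1-nearest-neighbour learner and the confidence threshold are invented for the sketch):

```python
def self_train(labeled, unlabeled, rounds=3, threshold=1.0):
    """Minimal self-training sketch: a 1-nearest-neighbour learner labels
    its most confident unlabeled points (those within `threshold` of a
    labeled point) and absorbs them into the training set each round."""
    labeled = list(labeled)
    unlabeled = list(unlabeled)
    for _ in range(rounds):
        confident = []
        for x in unlabeled:
            dist, label = min((abs(x - lx), ly) for lx, ly in labeled)
            if dist <= threshold:
                confident.append((x, label))
        if not confident:
            break  # nothing confident left; stop early
        labeled.extend(confident)
        taken = {c[0] for c in confident}
        unlabeled = [x for x in unlabeled if x not in taken]
    return labeled

# Two labeled seeds and four unlabeled points; labels propagate outward.
seed = [(0.0, "A"), (10.0, "B")]
pool = [0.8, 1.6, 9.2, 8.5]
model = self_train(seed, pool)
```

    The point of the sketch is the failure mode the abstract alludes to: each round trusts its own predictions, so early mistakes compound unless a more accurate (e.g. fully supervised) component is folded in.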

  12. Redox potentials and pKa for benzoquinone from density functional theory based molecular dynamics.

    PubMed

    Cheng, Jun; Sulpizi, Marialore; Sprik, Michiel

    2009-10-21

    The density functional theory based molecular dynamics (DFTMD) method for the computation of redox free energies presented in previous publications and the more recent modification for computation of acidity constants are reviewed. The method uses a half reaction scheme based on reversible insertion/removal of electrons and protons. The proton insertion is assisted by restraining potentials acting as chaperones. The procedure for relating the calculated deprotonation free energies to Brønsted acidities (pK(a)) and the oxidation free energies to electrode potentials with respect to the normal hydrogen electrode is discussed in some detail. The method is validated in an application to the reduction of aqueous 1,4-benzoquinone. The conversion of hydroquinone to quinone can take place via a number of alternative pathways consisting of combinations of acid dissociations, oxidations, or dehydrogenations. The free energy changes of all elementary steps (ten in total) are computed. The accuracy of the calculations is assessed by comparing the energies of different pathways for the same reaction (Hess's law) and by comparison to experiment. This two-sided test enables us to separate the errors related with the restrictions on length and time scales accessible to DFTMD from the errors introduced by the DFT approximation. It is found that the DFT approximation is the main source of error for oxidation free energies. PMID:20568869
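
    The link between a computed deprotonation free energy and a Brønsted pKa used in such half-reaction schemes is, up to the reference corrections the paper discusses, pKa = ΔA_dp / (RT ln 10). A sketch of the conversion (the numeric free energy below is hypothetical, not a result from the paper):

```python
from math import log

R = 8.314462618e-3  # gas constant, kJ/(mol*K)
T = 298.15          # temperature, K

def pka_from_dG(dG_kJ_per_mol):
    """Convert a deprotonation free energy (kJ/mol) to a pKa via
    pKa = dG / (RT ln 10)."""
    return dG_kJ_per_mol / (R * T * log(10))

# Hypothetical example: a deprotonation free energy of 57 kJ/mol
# corresponds to a pKa of about 10.
example = pka_from_dG(57.0)
```

    At room temperature RT ln 10 is about 5.7 kJ/mol, i.e. each pKa unit costs roughly 5.7 kJ/mol of deprotonation free energy.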

  13. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during Fiscal Year 2014 within the Risk Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment (PRA) analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results comparing RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.

  14. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game theory-based approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the problems of cold start and whitewashing are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved as part of the mapping. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics, such as execution time, accuracy, error identification, and undecidability of the resources, were considered. PMID:26380365

  15. Game Theory Based Trust Model for Cloud Environment.

    PubMed

    Gokulnath, K; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game theory-based approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the problems of cold start and whitewashing are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved as part of the mapping. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics, such as execution time, accuracy, error identification, and undecidability of the resources, were considered.
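
    Best-response dynamics, which underlie the Nash-equilibrium machinery both records describe, can be sketched for a two-player bimatrix game (a generic coordination game with hypothetical payoffs, not the paper's provider-user SLA model):

```python
def best_response_dynamics(payoff_a, payoff_b, start=(0, 0), max_iter=50):
    """Alternate best replies in a 2-player bimatrix game until a fixed
    point (a pure-strategy Nash equilibrium) is reached, or give up."""
    a, b = start
    for _ in range(max_iter):
        # Row player best-responds to b, then column player to that reply.
        best_a = max(range(len(payoff_a)), key=lambda i: payoff_a[i][b])
        best_b = max(range(len(payoff_b[0])), key=lambda j: payoff_b[best_a][j])
        if (best_a, best_b) == (a, b):
            return a, b  # neither player wants to deviate
        a, b = best_a, best_b
    return None  # the dynamic cycled without finding a pure NE

# Hypothetical coordination game: both players prefer matching actions.
A = [[2, 0], [0, 1]]  # row player's payoffs
B = [[2, 0], [0, 1]]  # column player's payoffs
eq = best_response_dynamics(A, B)
```

    At the returned profile each player's action is a best response to the other's, which is exactly the NE condition the trust model exploits.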

  16. A Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    NASA Astrophysics Data System (ADS)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from TM images in which true information is lost because of blocking clouds and missing data stripes. Water is continuously distributed under natural conditions; thus, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different disturbing information from clouds and missing data stripes is simulated. Water information is extracted using global histogram matching, local histogram matching, and the probability-based statistical method in the simulated images. Experiments show that smaller Areal Error and higher Boundary Recall can be obtained using this method compared with the conventional methods.
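
    Global histogram matching, one of the baselines compared in the experiments, maps each gray level of the damaged image to the level of a reference image with the closest cumulative frequency. A minimal grayscale sketch (generic, not the paper's implementation):

```python
from collections import Counter

def histogram_match(source, reference, levels=256):
    """Map each source gray level to the reference level whose
    cumulative frequency is closest (monotone CDF matching)."""
    def cdf(pixels):
        hist = Counter(pixels)
        total, run, out = len(pixels), 0, []
        for g in range(levels):
            run += hist.get(g, 0)
            out.append(run / total)
        return out
    cs, cr = cdf(source), cdf(reference)
    # Look-up table: source level -> reference level with nearest CDF value.
    lut = [min(range(levels), key=lambda g: abs(cr[g] - c)) for c in cs]
    return [lut[p] for p in source]
```

    Matching against an undamaged reference scene restores the overall brightness statistics, after which a threshold-based water classification becomes more reliable.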

  17. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

    PubMed

    Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

    2012-09-01

    Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change. PMID:24073123

  18. Game theory-based mode cooperative selection mechanism for device-to-device visible light communication

    NASA Astrophysics Data System (ADS)

    Liu, Yuxin; Huang, Zhitong; Li, Wei; Ji, Yuefeng

    2016-03-01

    Various patterns of device-to-device (D2D) communication, from Bluetooth to Wi-Fi Direct, are emerging due to the increasing requirements of information sharing between mobile terminals. This paper presents an innovative pattern named device-to-device visible light communication (D2D-VLC) to alleviate the growing traffic problem. However, the occlusion problem is a difficulty in D2D-VLC. This paper proposes a game theory-based solution in which the best-response dynamics and best-response strategies are used to realize a mode-cooperative selection mechanism. This mechanism uses system capacity as the utility function to optimize system performance and selects the optimal communication mode for each active user from three candidate modes. Moreover, the simulation and experimental results show that the mechanism can attain a significant improvement in terms of effectiveness and energy saving compared with the cases where the users communicate via only the fixed transceivers (light-emitting diode and photo diode) or via only D2D.

  19. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408

  20. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions.

  1. Successful Aging with Sickle Cell Disease: Using Qualitative Methods to Inform Theory

    PubMed Central

    Jenerette, Coretta M.; Lauderdale, Gloria

    2009-01-01

    Little is known about the lives of adults with sickle cell disease (SCD). This article reports findings from a qualitative pilot study, which used life review as a method to explore influences on health outcomes among middle-aged and older adults with SCD. Six females with SCD, recruited from two urban sickle cell clinics in the U.S., engaged in semi-structured, in-depth life review interviews. MaxQDA2 software was used for qualitative data coding and analysis. Three major themes were identified: vulnerability factors, self-care management resources, and health outcomes. These themes are consistent with the Theory of Self-Care Management for Sickle Cell Disease. Identifying vulnerability factors, self-care management resources, and health outcomes in adults with SCD may aid in developing theory-based interventions to meet health care needs of younger individuals with SCD. The life review process is a useful means to gain insight into successful aging with SCD and other chronic illnesses. PMID:19838320

  2. A Review of Web Information Seeking Research: Considerations of Method and Foci of Interest

    ERIC Educational Resources Information Center

    Martzoukou, Konstantina

    2005-01-01

    Introduction: This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background: Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of…

  3. Explanation of Second-Order Asymptotic Theory Via Information Spectrum Method

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito

    We explain second-order asymptotic theory via the information spectrum method. From a unified viewpoint based on the generality of the information spectrum method, we consider second-order asymptotic theory for use in fixed-length data compression, uniform random number generation, and channel coding. Additionally, we discuss its application to quantum cryptography, folklore in source coding, and security analysis.
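
    For channel coding, the second-order expansion that the information spectrum method delivers takes the standard form (the notation is added here for context; the abstract itself states no formula):

```latex
\log M^*(n, \varepsilon) = nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + O(\log n)
```

    where \(M^*(n,\varepsilon)\) is the maximum code size at blocklength \(n\) and error probability \(\varepsilon\), \(C\) is the channel capacity, \(V\) the channel dispersion, and \(Q^{-1}\) the inverse Gaussian tail function. Fixed-length data compression and uniform random number generation admit analogous expansions with the entropy and varentropy of the source in place of \(C\) and \(V\).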

  4. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

    PubMed Central

    Voth, Elizabeth C; Oelke, Nelly D

    2016-01-01

    Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. 
Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps
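
    The reported week-8 effect size can be reproduced from the abstract's own means and standard deviations with the pooled-SD form of Cohen's d (n = 28 per arm):

```python
from math import sqrt

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d with the pooled standard deviation of two groups."""
    pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Means/SDs from the abstract: experimental 7.24 (3.40), control 4.74 (3.70).
d = cohens_d(7.24, 3.40, 28, 4.74, 3.70, 28)  # ~0.70, matching the report
```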

  5. Quantum mechanical embedding theory based on a unique embedding potential

    SciTech Connect

    Chen Huang; Pavone, Michele; Carter, Emily A.

    2011-04-21

    We remove the nonuniqueness of the embedding potential that exists in most previous quantum mechanical embedding schemes by letting the environment and embedded region share a common embedding (interaction) potential. To efficiently solve for the embedding potential, an optimized effective potential method is derived. This embedding potential, which eschews use of approximate kinetic energy density functionals, is then used to describe the environment while a correlated wavefunction (CW) treatment of the embedded region is employed. We first demonstrate the accuracy of this new embedded CW (ECW) method by calculating the van der Waals binding energy curve between a hydrogen molecule and a hydrogen chain. We then examine the prototypical adsorption of CO on a metal surface, here the Cu(111) surface. In addition to obtaining proper site ordering (top site most stable) and binding energies within this theory, the ECW exhibits dramatic changes in the p-character of the CO 4σ and 5σ orbitals upon adsorption that agree very well with x-ray emission spectra, providing further validation of the theory. Finally, we generalize our embedding theory to spin-polarized quantum systems and discuss the connection between our theory and partition density functional theory.

  6. A new method for high-capacity information hiding in video robust against temporal desynchronization

    NASA Astrophysics Data System (ADS)

    Mitekin, Vitaly; Fedoseev, Victor A.

    2015-02-01

    This paper presents a new method for high-capacity information hiding in digital video, together with algorithms for embedding and extracting hidden information based on this method. These algorithms do not require temporal synchronization to provide robustness against both malicious and non-malicious frame dropping (temporal desynchronization). At the same time, due to the randomized distribution of hidden information bits across the video frames, the proposed method allows the hiding capacity to be increased in proportion to the number of frames used for information embedding. The proposed method is also robust against the "watermark estimation" attack aimed at estimating hidden information without knowing the embedding key or the non-watermarked video. The presented experimental results demonstrate the declared features of this method.

  7. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  8. Parallel implementation of multireference coupled-cluster theories based on the reference-level parallelism

    SciTech Connect

    Brabec, Jiri; Pittner, Jiri; van Dam, Hubertus JJ; Apra, Edoardo; Kowalski, Karol

    2012-02-01

    A novel algorithm for implementing a general type of multireference coupled-cluster (MRCC) theory based on the Jeziorski-Monkhorst exponential Ansatz [B. Jeziorski, H.J. Monkhorst, Phys. Rev. A 24, 1668 (1981)] is introduced. The proposed algorithm utilizes processor groups to calculate the equations for the MRCC amplitudes. In the basic formulation each processor group constructs the equations related to a specific subset of references. By flexible choice of processor groups and the subset of reference-specific sufficiency conditions designated to a given group, one can assure optimum utilization of available computing resources. The performance of this algorithm is illustrated on the examples of the Brillouin-Wigner and Mukherjee MRCC methods with singles and doubles (BW-MRCCSD and Mk-MRCCSD). A significant improvement in scalability and in reduction of time to solution is reported with respect to a recently reported parallel implementation of the BW-MRCCSD formalism [J. Brabec, H.J.J. van Dam, K. Kowalski, J. Pittner, Chem. Phys. Lett. 514, 347 (2011)].

  9. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

    PubMed

    Cowin, Stephen C; Cardoso, Luis

    2012-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues.

  10. Reducing sedentary behavior in minority girls via a theory-based, tailored classroom media intervention

    PubMed Central

    SPRUIJT-METZ, DONNA; NGUYEN-MICHEL, SELENA T.; GORAN, MICHAEL I.; CHOU, CHIH-PING; HUANG, TERRY T-K.

    2010-01-01

    Objective To develop, implement and test an innovative, theory-based classroom media intervention known as Get Moving! to increase physical activity and decrease sedentary behaviors in predominantly Latina middle school girls. Research methods and procedures School-based intervention on five to seven consecutive school days in seven schools (four intervention and three control) with high Latino populations (above 60%). Intervention schools were matched to control schools by ethnic makeup and socioeconomic status (SES). Measures conducted 3 months before and 3 months after intervention included height, weight, percentage body fat (bioimpedance analysis), physical activity and psychosocial aspects of activity by questionnaire. Subjects were middle school girls, mean age 12.5 years old, 73% Latina (N=459 girls). Results Get Moving! significantly reduced time spent on sedentary behavior (β± standard error, SE=−0.27±0.14, p<0.05) and significantly increased intrinsic motivation (β±SE=0.11±0.05, p<0.05). There was a trend for mediation effects of intrinsic motivation, but this did not reach significance. Discussion Get Moving! is a promising school-based approach that specifically targets physical activity and sedentary behavior in Latina girls, a population at high risk for obesity and related diseases. PMID:19023773

  11. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

    PubMed Central

    Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

    Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated a population attributable fraction of 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief, “I could quit smoking if my husband or significant other recommended it” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
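
    For intuition about the population attributable fractions the marginal structural model estimates, the classical Levin form can be sketched (the prevalence and risk ratio below are hypothetical illustrations, not the study's estimates):

```python
def paf_levin(prevalence, risk_ratio):
    """Levin's population attributable fraction:
    PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (risk_ratio - 1)
    return excess / (1 + excess)

# Hypothetical: 40% of the population holds a belief that doubles
# the chance of intending to quit.
example = paf_levin(0.4, 2.0)  # about 0.286
```

    The PAF answers the campaign planner's question directly: what fraction of the outcome in the whole population is attributable to the targeted belief, and hence the ceiling on what messaging against it could achieve.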

  12. An Information Theory-Based Approach to Assessing the Sustainability and Stability of an Island System

    EPA Science Inventory

    It is well-documented that a sustainable system is based on environmental stewardship, economic viability and social equity. What is often overlooked is the need for continuity such that desirable system behavior is maintained with mechanisms in place that facilitate the ability ...

  13. A research on scenic information prediction method based on RBF neural network

    NASA Astrophysics Data System (ADS)

    Li, Jingwen; Yin, Shouqiang; Wang, Ke

    2015-12-01

    With the rapid development of smart tourism, it is consistent with this trend to predict scenic-area information through scientific methods. Using the strong nonlinear fitting ability of RBF neural networks [1-2], the article builds a prediction and inference method for the comprehensive information of a scenic area's complex geographic time, space, and attribute data, based on a hyper-surface organization of the scenic geographic entity information [3]. The Guilin scenic area is used as an example to illustrate the process of forecasting the complete information.
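As a minimal sketch of the kind of RBF fitting the abstract describes (this is a generic Gaussian-RBF interpolant, not the authors' model; the visitor-count data and the `gamma` value are invented for illustration):

```python
import numpy as np

def rbf_fit(X, y, gamma=1.0, reg=1e-8):
    """Solve Phi @ w = y, where Phi_ij = exp(-gamma * ||x_i - x_j||^2)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-gamma * d2)
    # Small ridge term keeps the solve well-conditioned.
    return np.linalg.solve(Phi + reg * np.eye(len(X)), y)

def rbf_predict(X_train, w, X_new, gamma=1.0):
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2) @ w

# Toy data: hour of day -> visitor count at a hypothetical scenic spot.
X = np.array([[8.0], [10.0], [12.0], [14.0], [16.0]])
y = np.array([120.0, 450.0, 800.0, 700.0, 300.0])
w = rbf_fit(X, y)
print(rbf_predict(X, w, np.array([[11.0]])))  # interpolated count at 11:00
```

The interpolant passes (up to the ridge term) through the training points, which is the basic property that makes RBF networks useful for fitting smooth spatio-temporal response surfaces.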

  14. Improving Diabetes care through Examining, Advising, and prescribing (IDEA): protocol for a theory-based cluster randomised controlled trial of a multiple behaviour change intervention aimed at primary healthcare professionals

    PubMed Central

    2014-01-01

    Background New clinical research findings may require clinicians to change their behaviour to provide high-quality care to people with type 2 diabetes, likely requiring them to change multiple different clinical behaviours. The present study builds on findings from a UK-wide study of theory-based behavioural and organisational factors associated with prescribing, advising, and examining consistent with high-quality diabetes care. Aim To develop and evaluate the effectiveness and cost of an intervention to improve multiple behaviours in clinicians involved in delivering high-quality care for type 2 diabetes. Design/methods We will conduct a two-armed cluster randomised controlled trial in 44 general practices in the North East of England to evaluate a theory-based behaviour change intervention. We will target improvement in six underperformed clinical behaviours highlighted in quality standards for type 2 diabetes: prescribing for hypertension; prescribing for glycaemic control; providing physical activity advice; providing nutrition advice; providing on-going education; and ensuring that feet have been examined. The primary outcome will be the proportion of patients appropriately prescribed and examined (using anonymised computer records), and advised (using anonymous patient surveys) at 12 months. We will use behaviour change techniques targeting motivational, volitional, and impulsive factors that we have previously demonstrated to be predictive of multiple health professional behaviours involved in high-quality type 2 diabetes care. We will also investigate whether the intervention was delivered as designed (fidelity) by coding audiotaped workshops and interventionist delivery reports, and operated as hypothesised (process evaluation) by analysing responses to theory-based postal questionnaires. In addition, we will conduct post-trial qualitative interviews with practice teams to further inform the process evaluation, and a post-trial economic analysis to

  15. Inter-instrumental method transfer of chiral capillary electrophoretic methods using robustness test information.

    PubMed

    De Cock, Bart; Borsuk, Agnieszka; Dejaegher, Bieke; Stiens, Johan; Mangelings, Debby; Vander Heyden, Yvan

    2014-08-01

    Capillary electrophoresis (CE) is an electrodriven separation technique that is often used for the separation of chiral molecules. Advantages of CE are its flexibility, low cost, and efficiency. On the other hand, the precision and transfer of CE methods are well-known problems of the technique. Reasons for the more complicated method transfer are the more diverse instrumental differences, such as total capillary lengths and capillary cooling systems, and the higher response variability of CE methods compared to other techniques, such as liquid chromatography (HPLC). Therefore, a larger systematic change in peak resolutions, migration times, and peak areas, with a loss of separation and efficiency, may be seen when a CE method is transferred to another laboratory or another type of instrument. A swift and successful method transfer is required because development and routine use of analytical methods are usually not performed in the same laboratory and/or on the same type of equipment. The aim of our study was to develop transfer rules to facilitate CE method transfers between different laboratories and instruments. In our case study, three β-blockers were chirally separated and inter-instrumental transfers were performed. The first step of our study was to optimise the precision of the chiral CE method. Next, a robustness test was performed to identify the instrumental and experimental parameters that most influenced the considered responses. The precision and robustness study results were used to adapt instrumental and/or method settings to improve the transfer between different instruments. Finally, the comparison of adapted and non-adapted transfers allowed us to derive some rules to facilitate CE method transfers.

  16. 49 CFR 1135.2 - Revenue Shortfall Allocation Method: Annual State tax information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Revenue Shortfall Allocation Method: Annual State... RECOVERY PROCEDURES § 1135.2 Revenue Shortfall Allocation Method: Annual State tax information. (a) To enable the Board to calculate the revenue shortfall allocation method (RSAM), which is one of the...

  17. A method for fast selecting feature wavelengths from the spectral information of crop nitrogen

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Research on a method for rapidly selecting feature wavelengths from nitrogen spectral information, which can determine the nitrogen content of crops, is necessary. Based on the uniformity of uniform design, this paper proposed an improved particle swarm optimization (PSO) method. The method can ch...

  18. Information/Knowledge Acquisition Methods for Decision Support Systems and Expert Systems.

    ERIC Educational Resources Information Center

    Yang, Heng-Li

    1995-01-01

    Compares information requirement-elicitation (IRE) methods for decision support systems (DSS) with knowledge acquisition (KA) methods for expert systems (ES) development. The definition and architectures of ES and DSS are compared and the systems' development cycles and IRE/KA methods are discussed. Differences are noted between ES and DSS…

  19. 76 FR 40448 - 2010 Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-08

    ... Surface Transportation Board 2010 Tax Information for Use in the Revenue Shortfall Allocation Method... Allocation Method (RSAM). DATES: Comments are due by August 8, 2011. If any comment opposing AAR's... Shortfall Allocation Method, EP 646 (Sub-No. 2) (STB served Nov. 21, 2008). RSAM is intended to measure...

  20. 78 FR 34427 - 2012 Tax Information for Use In The Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... Surface Transportation Board 2012 Tax Information for Use In The Revenue Shortfall Allocation Method... Shortfall Allocation Method (RSAM). DATES: Comments are due by July 9, 2013. If any comment opposing AAR's... Revenue Shortfall Allocation Method, EP 646 (Sub-No. 2) (STB served Nov. 21, 2008). RSAM is intended...

  1. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  2. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, C.E.

    1990-07-31

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field. 8 figs.

  3. Revisited: The South Dakota Board of Nursing theory-based regulatory decisioning model.

    PubMed

    Damgaard, Gloria; Bunkers, Sandra Schmidt

    2012-07-01

    The authors of this column describe the South Dakota Board of Nursing's 11-year journey utilizing a humanbecoming theory-based regulatory decisioning model. The column revisits the model with an emphasis on the cocreation of a strategic plan guiding the work of the South Dakota Board of Nursing through 2014. The strategic plan was influenced by the latest refinements of the humanbecoming postulates and the humanbecoming community change concepts. A graphic picture of the decisioning model is presented along with future plans for the theory-based model.

  4. “Please Don’t Send Us Spam!” A Participative, Theory-Based Methodology for Developing an mHealth Intervention

    PubMed Central

    2016-01-01

    Background Mobile health solutions have the potential of reducing burdens on health systems and empowering patients with important information. However, there is a lack of theory-based mHealth interventions. Objective The purpose of our study was to develop a participative, theory-based, mobile phone, audio messaging intervention attractive to recently circumcised men at voluntary medical male circumcision (VMMC) clinics in the Cape Town area in South Africa. We aimed to shift some of the tasks related to postoperative counselling on wound management and goal setting on safe sex. We place an emphasis on describing the full method of message generation to allow for replication. Methods We developed an mHealth intervention using a staggered qualitative methodology: (1) focus group discussions with 52 recently circumcised men and their partners to develop initial voice messages they felt were relevant and appropriate, (2) thematic analysis and expert consultation to select the final messages for pilot testing, and (3) cognitive interviews with 12 recent VMMC patients to judge message comprehension and rank the messages. Message content and phasing were guided by the theory of planned behavior and the health action process approach. Results Patients and their partners came up with 245 messages they thought would help men during the wound-healing period. Thematic analysis revealed 42 different themes. Expert review and cognitive interviews with more patients resulted in 42 messages with a clear division in terms of needs and expectations between the initial wound-healing recovery phase (weeks 1–3) and the adjustment phase (weeks 4–6). Discussions with patients also revealed potential barriers to voice messaging, such as lack of technical knowledge of mobile phones and concerns about the invasive nature of the intervention. Patients’ own suggested messages confirmed Ajzen’s theory of planned behavior that if a health promotion intervention can build trust and be

  5. Theories and Methods for Research on Informal Learning and Work: Towards Cross-Fertilization

    ERIC Educational Resources Information Center

    Sawchuk, Peter H.

    2008-01-01

    The topic of informal learning and work has quickly become a staple in contemporary work and adult learning research internationally. The narrow conceptualization of work is briefly challenged before the article turns to a review of the historical origins as well as contemporary theories and methods involved in researching informal learning and…

  6. Mathematical, Logical, and Formal Methods in Information Retrieval: An Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost

    2003-01-01

    Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)

  7. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-03-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing.

  8. Exploring racial/ethnic differences in substance use: a preliminary theory-based investigation with juvenile justice-involved youth

    PubMed Central

    2011-01-01

    Background Racial/ethnic differences in representation, substance use, and its correlates may be linked to differential long-term health outcomes for justice-involved youth. Determining the nature of these differences is critical to informing more efficacious health prevention and intervention efforts. In this study, we employed a theory-based approach to evaluate the nature of these potential differences. Specifically, we hypothesized that (1) racial/ethnic minority youth would be comparatively overrepresented in the juvenile justice system, (2) the rates of substance use would be different across racial/ethnic groups, and (3) individual-level risk factors would be better predictors of substance use for Caucasian youth than for youth of other racial/ethnic groups. Methods To evaluate these hypotheses, we recruited a large, diverse sample of justice-involved youth in the southwest (N = 651; M age = 15.7, SD = 1.05, range = 14-18 years; 66% male; 41% Hispanic, 24% African American, 15% Caucasian, 11% American Indian/Alaska Native). All youth were queried about their substance use behavior (alcohol, marijuana, tobacco, illicit hard drug use) and individual-level risk factors (school involvement, employment, self-esteem, level of externalizing behaviors). Results As predicted, racial/ethnic minority youth were significantly overrepresented in the juvenile justice system. Additionally, Caucasian youth reported the greatest rates of substance use and substance-related individual-level risk factors. In contrast, African American youth showed the lowest rates for substance use and individual risk factors. Contrary to predictions, a racial/ethnic group by risk factor interaction emerged for only one risk factor and one substance use category. 
Conclusions This research highlights the importance of more closely examining racial/ethnic differences in justice populations, as there are likely to be differing health needs, and subsequent treatment approaches, by racial/ethnic group

  9. Preventing Postpartum Smoking Relapse Among Inner City Women: Development of a Theory-Based and Evidence-Guided Text Messaging Intervention

    PubMed Central

    Wen, Kuang-Yi; Kilby, Linda; Fleisher, Linda; Belton, Tanisha D; Roy, Gem; Hernandez, Enrique

    2014-01-01

    Background Underserved women are at high risk for smoking relapse after childbirth due to their unique socioeconomic and postpartum stressors and barriers. Mobile text messaging technology allows delivery of relapse prevention programs targeted to their personal needs over time. Objective To describe the development of a social-cognitive theory-based and evidence-guided text messaging intervention for preventing postpartum smoking relapse among inner city women. Methods Guided by the cognitive-social health information processing framework, user-centered design, and health communication best practices, the intervention was developed through a systematic process that included needs assessment, followed by an iterative cycling through message drafting, health literacy evaluation and rewriting, review by target community members and a scientific advisory panel, and message revision, concluding with usability testing. Results All message content was theory-grounded, derived by needs assessment analysis and evidence-based materials, reviewed and revised by the target population, health literacy experts, and scientific advisors. The final program, “Txt2Commit,” was developed as a fully automated system, designed to deliver 3 proactive messages per day for a 1-month postpartum smoking relapse intervention, with crave and lapse user-initiated message functions available when needed. Conclusions The developmental process suggests that the application of theory and best practices in the design of text messaging smoking cessation interventions is not only feasible but necessary for ensuring that the interventions are evidence based and user-centered. PMID:24698804

  10. Evaluation of semantic-based information retrieval methods in the autism phenotype domain.

    PubMed

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2011-01-01

    Biomedical ontologies are increasingly being used to improve information retrieval methods. In this paper, we present a novel information retrieval approach that exploits knowledge specified by the Semantic Web ontology and rule languages OWL and SWRL. We evaluate our approach using an autism ontology that has 156 SWRL rules defining 145 autism phenotypes. Our approach uses a vector space model to correlate how well these phenotypes relate to the publications used to define them. We compare a vector space phenotype representation using class hierarchies with one that extends this method to incorporate additional semantics encoded in SWRL rules. From a PubMed-extracted corpus of 75 articles, we show that the average rank of a related paper using the class-hierarchy method is 4.6, whereas the average rank using the extended rule-based method is 3.3. Our results indicate that incorporating rule-based definitions in information retrieval methods can improve search for relevant publications.
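A toy version of the vector space ranking described above might look like this (plain bag-of-words cosine similarity; the phenotype terms and document snippets are hypothetical and not drawn from the autism ontology):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)  # missing keys in b count as 0
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical phenotype definition and candidate abstracts.
phenotype = Counter("delayed speech language impairment onset".split())
docs = {
    "paper_a": Counter("language impairment and delayed speech in toddlers".split()),
    "paper_b": Counter("motor coordination deficits in infants".split()),
}
ranked = sorted(docs, key=lambda d: cosine(phenotype, docs[d]), reverse=True)
print(ranked)  # paper_a shares four terms with the phenotype and ranks first
```

The paper's contribution is to enrich the phenotype vector with terms implied by SWRL rule bodies rather than only the class hierarchy; the ranking machinery itself stays the same.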

  11. An adaptive altitude information fusion method for autonomous landing processes of small unmanned aerial rotorcraft.

    PubMed

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of the altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high frequency noises in the sensor output. Furthermore, to improve altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993
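A heavily simplified sketch of the adaptive idea (a 1-D filter that re-estimates the measurement-noise variance from the innovation sequence; the paper's actual method is an extended Kalman filter with a maximum a posteriori estimator, and every constant below is an assumption for illustration):

```python
import random

def adaptive_kalman(measurements, q=0.01, r0=1.0, alpha=0.3):
    """1-D Kalman filter whose measurement-noise variance R is adapted
    online from the innovations, a simplified stand-in for MAP estimation."""
    x, p, r = measurements[0], 1.0, r0
    estimates = []
    for z in measurements:
        p += q                        # predict: process noise inflates variance
        innov = z - x                 # innovation (measurement residual)
        # E[innov^2] = p + r, so innov^2 - p is an unbiased sample of r.
        r = (1 - alpha) * r + alpha * max(innov * innov - p, 1e-6)
        k = p / (p + r)               # Kalman gain
        x += k * innov                # update state estimate
        p *= (1 - k)                  # update estimate variance
        estimates.append(x)
    return estimates

random.seed(0)
true_altitude = 10.0
zs = [true_altitude + random.gauss(0, 0.5) for _ in range(200)]
est = adaptive_kalman(zs)
print(round(est[-1], 2))  # converges toward the true altitude
```

The paper additionally pre-filters the raw sensor output with a wavelet filter before fusion; the sketch omits that stage.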

  12. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of the altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high frequency noises in the sensor output. Furthermore, to improve altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993

  13. Increasing smoke alarm operability through theory-based health education: a randomised trial

    PubMed Central

    Miller, Ted R; Bergen, Gwen; Ballesteros, Michael F; Bhattacharya, Soma; Gielen, Andrea Carlson; Sheppard, Monique S

    2015-01-01

    Background Although working smoke alarms halve deaths in residential fires, many households do not keep alarms operational. We tested whether theory-based education increases alarm operability. Methods Randomised multiarm trial, with a single arm randomly selected for use each day, in low-income neighbourhoods in Maryland, USA. Intervention arms: (1) Full Education combining a health belief module with a social-cognitive theory module that provided hands-on practice installing alarm batteries and using the alarm’s hush button; (2) Hands-on Practice social-cognitive module supplemented by typical fire department education; (3) Current Norm receiving typical fire department education only. Four hundred and thirty-six homes recruited through churches or by knocking on doors in 2005–2008. Follow-up visits checked alarm operability in 370 homes (85%) 1–3.5 years after installation. Main outcome measures: number of homes with working alarms defined as alarms with working batteries or hard-wired and number of working alarms per home. Regressions controlled for alarm status preintervention, demographics, and beliefs about fire risks and alarm effectiveness. Results Homes in the Full Education and Practice arms were more likely to have a functioning smoke alarm at follow-up (OR=2.77, 95% CI 1.09 to 7.03) and had an average of 0.32 more working alarms per home (95% CI 0.09 to 0.56). Working alarms per home rose 16%. Full Education and Practice had similar effectiveness (p=0.97 on both outcome measures). Conclusions Without exceeding typical fire department installation time, installers can achieve greater smoke alarm operability. Hands-on practice is key. Two years after installation, for every three homes that received hands-on practice, one had an additional working alarm. Trial registration number http://www.clinicaltrials.gov number NCT00139126. PMID:25165090

  14. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in an lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  15. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, C.E.; McKinney, I.D.

    1988-05-31

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk. 10 figs.

  16. Closing the digital divide in HIV/AIDS care: development of a theory-based intervention to increase Internet access.

    PubMed

    Kalichman, S C; Weinhardt, L; Benotsch, E; Cherry, C

    2002-08-01

    Advances in information technology are revolutionizing medical patient education and the Internet is becoming a major source of information for people with chronic medical conditions, including HIV/AIDS. However, many AIDS patients do not have equal access to the Internet and are therefore at an information disadvantage, particularly minorities, persons of low-income levels and individuals with limited education. This paper describes the development and pilot testing of a workshop-style intervention designed to close the digital divide in AIDS care. Grounded in the Information-Motivation-Behavioral Skills (IMB) model of health behaviour change, we developed an intervention for persons with no prior history of using the Internet. The intervention included instruction in using hardware and search engines, motivational enhancement to increase interest and perceived relevance of the Internet, and skills for critically evaluating and using health information accessed via the Internet. Participants were also introduced to communication and support functions of the Internet including e-mail, newsgroups and chat groups. Pilot testing demonstrated feasibility, acceptability and promise for closing the digital divide in HIV/AIDS care using a relatively brief and intensive theory-based intervention that could be implemented in community settings. PMID:12204154

  17. Novel copyright information hiding method based on random phase matrix of Fresnel diffraction transforms

    NASA Astrophysics Data System (ADS)

    Cao, Chao; Chen, Ru-jun

    2009-10-01

    In this paper, we present a new copyright information hiding method for digital images based on Moiré fringes. The copyright information is embedded into the protected image and the detecting image using a Fresnel phase matrix. First, the random phase matrix of the copyright information is generated using the Fresnel diffraction transform. Then, according to the Moiré fringe principle, the protected image and the detecting image are each modulated with the random phase matrix, embedding the copyright information into them. When the protected image and the detecting image are overlapped, the copyright information reappears. Experimental results show that our method has good concealment performance and offers a new way to protect copyright.
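The random-phase-matrix step can be sketched with a standard Fresnel transfer-function propagation (this is a generic paraxial angular-spectrum implementation, not the authors' code; the wavelength, propagation distance, and pixel pitch are assumed values):

```python
import numpy as np

def fresnel_phase_matrix(shape, wavelength=632.8e-9, distance=0.1,
                         pitch=10e-6, seed=42):
    """Propagate a random phase screen by the paraxial Fresnel transfer
    function H(fx, fy) = exp(-j*pi*lambda*z*(fx^2 + fy^2)) and return the
    resulting phase matrix."""
    rng = np.random.default_rng(seed)
    field = np.exp(1j * 2 * np.pi * rng.random(shape))  # unit-amplitude random phase
    fy = np.fft.fftfreq(shape[0], d=pitch)
    fx = np.fft.fftfreq(shape[1], d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    out = np.fft.ifft2(np.fft.fft2(field) * H)
    return np.angle(out)  # phase matrix used to modulate the images

phase = fresnel_phase_matrix((64, 64))
print(phase.shape)  # (64, 64), values in (-pi, pi]
```

In the paper this phase matrix modulates both the protected and the detecting image so that superposing them reproduces the hidden copyright mark.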

  18. A UMLS-based method for integrating information databases into an Intranet.

    PubMed

    Volot, F; Joubert, M; Fieschi, M; Fieschi, D

    1997-01-01

    The Internet and the World Wide Web today provide end-users with universal access to information in various, heterogeneous databases. The biomedical domain benefits from this new technology, especially for information retrieval by searching and browsing various sites. Nevertheless, end-users may be disoriented by the specific ways information is accessed on different servers. In the framework of an Intranet design and development project, we present a method for integrating information databases based on the knowledge sources of the UMLS. The method provides designers of a Web site with facilities to implement easy and homogeneous access to information. Pages are built dynamically and displayed according to a style sheet, with their content stored in a database during the design phase. The database also describes the links between pages. Moreover, this organization provides administrators with powerful capabilities to manage Web sites.

  19. Development and Content Validation of the Information Assessment Method for Patients and Consumers

    PubMed Central

    Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan LM; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-01-01

    Background Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. Objective We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Methods Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. Results The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. 
To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded

  20. Assessing Bayesian model averaging uncertainty of groundwater modeling based on information entropy method

    NASA Astrophysics Data System (ADS)

    Zeng, Xiankui; Wu, Jichun; Wang, Dong; Zhu, Xiaobin; Long, Yuqiao

    2016-07-01

    Because of groundwater conceptualization uncertainty, multi-model methods are usually used, and the corresponding uncertainties are estimated by integrating Markov chain Monte Carlo (MCMC) and Bayesian model averaging (BMA) methods. Generally, the variance method is used to measure the uncertainties of BMA prediction. The total variance of the ensemble prediction is decomposed into within-model and between-model variances, which represent the uncertainties derived from the parameters and the conceptual model, respectively. However, the uncertainty of a probability distribution cannot be comprehensively quantified by variance alone. A new measuring method based on information entropy theory is proposed in this study. Because the actual BMA process can hardly meet the ideal mutually exclusive, collectively exhaustive condition, BMA predictive uncertainty can be decomposed into parameter, conceptual model, and overlapped uncertainties. Overlapped uncertainty is induced by the combination of predictions from correlated model structures. In this paper, five simple analytical functions are first used to illustrate the feasibility of the variance and information entropy methods. A discrete distribution example shows that information entropy can be more appropriate than variance for describing between-model uncertainty. Two continuous distribution examples show that the two methods are consistent in measuring a normal distribution, and that information entropy is more appropriate than variance for describing a bimodal distribution. The two examples of BMA uncertainty decomposition demonstrate that the two methods are relatively consistent in assessing the uncertainty of a unimodal BMA prediction, while information entropy is more informative in describing the uncertainty decomposition of a bimodal BMA prediction. Then, based on a synthetic groundwater model, the variance and information entropy methods are used to assess the BMA uncertainty of groundwater modeling. 
The uncertainty assessments of
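    The variance-versus-entropy contrast described in this abstract can be illustrated with a small self-contained sketch. The two-model discrete example below is hypothetical (it is not the paper's data): the law of total variance splits the BMA mixture's variance into within-model and between-model terms, while Shannon entropy summarizes the spread of the mixture distribution directly.

    ```python
    import math

    def shannon_entropy(p):
        """Shannon entropy (nats) of a discrete distribution."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    def mean(xs, p):
        return sum(x * pi for x, pi in zip(xs, p))

    def variance(xs, p):
        m = mean(xs, p)
        return sum(pi * (x - m) ** 2 for x, pi in zip(xs, p))

    # Two candidate models' predictive distributions over the same support
    xs = [0, 1, 2, 3]
    p1 = [0.7, 0.2, 0.1, 0.0]   # model 1: mass near low values
    p2 = [0.0, 0.1, 0.2, 0.7]   # model 2: mass near high values
    w  = [0.5, 0.5]             # BMA posterior model weights

    # BMA mixture (bimodal) distribution
    pm = [w[0] * a + w[1] * b for a, b in zip(p1, p2)]

    # Classical variance decomposition: total = within-model + between-model
    within  = w[0] * variance(xs, p1) + w[1] * variance(xs, p2)
    between = (w[0] * (mean(xs, p1) - mean(xs, pm)) ** 2
               + w[1] * (mean(xs, p2) - mean(xs, pm)) ** 2)
    total   = variance(xs, pm)

    # Entropy view of the same mixture
    h_mix = shannon_entropy(pm)

    print(within, between, total, h_mix)
    ```

    The decomposition `total = within + between` holds exactly for any mixture; the entropy `h_mix` complements it by measuring the uncertainty of the full (here bimodal) predictive distribution.
    
    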

  1. A Theory-based Faculty Development Program for Clinician-Educators.

    ERIC Educational Resources Information Center

    Hewson, Mariana G.

    2000-01-01

    Describes development, implementation, and evaluation of a theory-based faculty development program for physician-educators in medicine and pediatrics at the Cleveland Clinic (Ohio). The program includes a 12-hour course focused on precepting skills, bedside teaching, and effective feedback; on-site coaching; and innovative projects in clinical…

  2. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  3. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  4. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  5. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  6. 77 FR 24684 - Proposed Information Collection; Comment Request; 2013-2015 American Community Survey Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    .... Census Bureau Proposed Information Collection; Comment Request; 2013-2015 American Community Survey... issues and needs. During the 2013-2015 period, the Methods Panel may include testing methods for... place to propose several tests: A 2013 Questionnaire Design Test, a 2015 ACS Content Test, and a...

  7. Evaluation of Television as a Method of Disseminating Solar Energy Information.

    ERIC Educational Resources Information Center

    Edington, Everett D.; And Others

    This project included three separate studies undertaken to determine the effectiveness of television instruction as a method of effectively delivering information about solar energy systems to present and future workers in related industries, and as a method of delivery for adult continuing education instruction. All three studies used a series of…

  8. A Qualitative Study about Performance Based Assessment Methods Used in Information Technologies Lesson

    ERIC Educational Resources Information Center

    Daghan, Gökhan; Akkoyunlu, Buket

    2014-01-01

    In this study, Information Technologies teachers' views on and use of performance-based assessment methods (PBAMs) are examined. It is aimed to find out which of the PBAMs are used frequently or not used, the reasons these methods are preferred, and opinions about their applicability. The study is designed with the phenomenological design…

  9. A Method for the Analysis of Information Use in Source-Based Writing

    ERIC Educational Resources Information Center

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  10. Mixed Methods Research of Adult Family Care Home Residents and Informal Caregivers

    ERIC Educational Resources Information Center

    Jeanty, Guy C.; Hibel, James

    2011-01-01

    This article describes a mixed methods approach used to explore the experiences of adult family care home (AFCH) residents and informal caregivers (IC). A rationale is presented for using a mixed methods approach employing the sequential exploratory design with this poorly researched population. The unique challenges attendant to the sampling…

  11. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  12. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    ERIC Educational Resources Information Center

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  13. 77 FR 23674 - Proposed Information Collection Requests: Measures and Methods for the National Reporting System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-20

    ... Proposed Information Collection Requests: Measures and Methods for the National Reporting System for Adult Education SUMMARY: The Office of Vocational and Adult Education (OVAE) requests a revision to its data... records. Title of Collection: Measures and Methods for the National Reporting System for Adult...

  14. Interconnected but underprotected? Parents' methods and motivations for information seeking on digital safety issues.

    PubMed

    Davis, Vauna

    2012-12-01

    Parents need information and skills to meet the demands of mediating connected technology in their homes. Parents' methods and motivations for learning to protect children from digital risks were reported through a survey. This study explores relationships between information seeking, parents' concerns, risks children have experienced, and access to connected devices, in addition to the use and satisfaction of various digital safety resources. Three types of information-seeking behavior were identified: (a) protective information seeking, to protect children from being confronted with harmful content; (b) problem-solving information seeking, to help children who have been negatively affected by connected technology; and (c) attentive learning, by attending to media resources passively encountered on this topic. Friends and family are the dominant source of digital safety information, followed by presentations and the Internet. Parents' top concerns for their children using connected technology were accidental exposure to pornography, and sexual content in Internet-based entertainment. Higher numbers of risks experienced by children were positively associated with parents' problem-solving information seeking and level of attentive learning. Parents who were more concerned exhibited more problem-solving information seeking; but despite the high level of concern for children's safety online, 65 percent of parents seek information on this subject less than twice per year. Children have access to a mean of five connected devices at home; a higher number of devices was correlated with increased risks experienced by children, but was not associated with increased concern or information seeking from parents.

  15. Data preprocessing method for fluorescence molecular tomography using a priori information provided by CT.

    PubMed

    Fu, Jianwei; Yang, Xiaoquan; Meng, Yuanzheng; Luo, Qingming; Gong, Hui

    2012-01-01

    The combined system of micro-CT and fluorescence molecular tomography (FMT) offers a new tool to provide anatomical and functional information of small animals in a single study. To take advantage of the combined system, a data preprocessing method is proposed to extract the valid data for FMT reconstruction algorithms using a priori information provided by CT. The boundary information of the animal and animal holder is extracted from the reconstructed CT volume data. A ray tracing method is used to trace the path of the excitation beam, calculate the locations and directions of the optional sources, and determine whether the optional sources are valid. To accurately calculate the projections of the detectors on the optical images and judge their validity, a combination of perspective projection and inverse ray tracing methods is adopted to offer optimal performance. The imaging performance of the combined system with the presented method is validated through experimental rat imaging.

  16. The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction

    PubMed Central

    Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.

    2015-01-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448
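    The Poisson log-likelihood at the heart of this MID/LNP equivalence is straightforward to write down. The sketch below is illustrative, not the authors' implementation: the filter, the exponential nonlinearity, and the crude Poisson sampler are all hypothetical choices used only to show that the generating filter scores a higher LNP log-likelihood than an unrelated one.

    ```python
    import math
    import random

    def lnp_loglik(stimuli, spikes, w, f, dt=1.0):
        """Poisson log-likelihood of spike counts under a linear-nonlinear-
        Poisson (LNP) model: rate_t = f(w . x_t); counts ~ Poisson(rate_t*dt)."""
        ll = 0.0
        for x, n in zip(stimuli, spikes):
            rate = f(sum(wi * xi for wi, xi in zip(w, x))) * dt
            # log Poisson pmf: n*log(rate) - rate - log(n!)
            ll += n * math.log(rate) - rate - math.lgamma(n + 1)
        return ll

    random.seed(0)
    D, T = 3, 200
    w_true = [0.8, -0.5, 0.3]
    f = lambda u: math.exp(u)  # exponential nonlinearity keeps rates positive

    stimuli = [[random.gauss(0, 1) for _ in range(D)] for _ in range(T)]
    spikes = []
    for x in stimuli:
        lam = f(sum(wi * xi for wi, xi in zip(w_true, x)))
        # simple Knuth-style Poisson sampler (adequate for modest rates)
        n, p, limit = 0, 1.0, math.exp(-lam)
        while True:
            p *= random.random()
            if p <= limit:
                break
            n += 1
        spikes.append(n)

    # The generating filter should outscore an uninformative one
    ll_true = lnp_loglik(stimuli, spikes, w_true, f)
    ll_flat = lnp_loglik(stimuli, spikes, [0.0] * D, f)
    print(ll_true, ll_flat)
    ```

    Maximizing this log-likelihood over `w` is exactly the sense in which MID is a maximum-likelihood estimator for the LNP filter.
    
    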

  17. Method of airborne SAR image match integrating multi-information for block adjustment

    NASA Astrophysics Data System (ADS)

    Yang, S. C.; Huang, G. M.; Zhao, Z.; Lu, L. J.

    2015-06-01

    To automate SAR image block adjustment, this paper proposes a SAR image matching method that integrates multiple sources of information. It takes full advantage of SAR image geometric information, feature information, gray-level correlation information, and external auxiliary terrain information, so that image tie points (ITPs) for block adjustment can be obtained automatically. The main steps of automatic ITP extraction are as follows. First, SAR images are rectified geometrically before matching, based on the geometric information and external auxiliary terrain information (an existing DEM). Second, ground grid points at a certain interval are generated in the block area, and approximate ITPs are derived from the external auxiliary terrain information. Then, matching reference points are extracted for homologous image blocks with the Harris feature detection operator, and ITPs are obtained by pyramid matching based on the gray-level correlation information. Finally, the ITPs are transferred from the rectified images back to the original SAR images and used in the block adjustment. In the experiment, X-band airborne SAR images acquired by the Chinese airborne SAR system (CASMSAR) were used to make up the block. The results showed that the method is effective for block adjustment of SAR data.

  18. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    PubMed Central

    2016-01-01

    OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected on the basis of its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable a maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR for evaluating the citrus fruit intake and risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more-consistent ES and SES, and narrower confidence intervals than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies. PMID:26797219
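    The fixed-effect model used in this comparison is standard inverse-variance pooling. A minimal sketch (with hypothetical study-level effects, not the citrus-fruit data) shows how log relative risks are combined into a summary effect size with a 95% confidence interval:

    ```python
    import math

    def fixed_effect_pool(log_rrs, ses):
        """Inverse-variance fixed-effect pooling of log relative risks.
        Returns the summary effect, its standard error, and a 95% CI."""
        weights = [1.0 / se ** 2 for se in ses]       # w_i = 1 / SE_i^2
        sw = sum(weights)
        pooled = sum(w * y for w, y in zip(weights, log_rrs)) / sw
        se_pooled = math.sqrt(1.0 / sw)
        ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
        return pooled, se_pooled, ci

    # Hypothetical study-level effects (log relative risks, standard errors)
    log_rrs = [math.log(0.8), math.log(0.9), math.log(0.7)]
    ses = [0.15, 0.10, 0.20]

    pooled, se, (lo, hi) = fixed_effect_pool(log_rrs, ses)
    # Exponentiate to report the summary relative risk and its CI
    print(math.exp(pooled), math.exp(lo), math.exp(hi))
    ```

    Narrower per-study standard errors (as the ICM's broader information base provides) increase the weights `w_i`, which is why the pooled confidence interval tightens.
    
    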

  19. Comparison of high and low intensity contact between secondary and primary care to detect people at ultra-high risk for psychosis: study protocol for a theory-based, cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background The early detection and referral to specialized services of young people at ultra-high risk (UHR) for psychosis may reduce the duration of untreated psychosis and, therefore, improve prognosis. General practitioners (GPs) are usually the healthcare professionals contacted first on the help-seeking pathway of these individuals. Methods/Design This is a cluster randomized controlled trial (cRCT) of primary care practices in Cambridgeshire and Peterborough, UK. Practices are randomly allocated into two groups in order to establish which is the most effective and cost-effective way to identify people at UHR for psychosis. One group will receive postal information about the local early intervention in psychosis service, including how to identify young people who may be in the early stages of a psychotic illness. The second group will receive the same information plus an additional, ongoing theory-based educational intervention with dedicated liaison practitioners to train clinical staff at each site. The primary outcome of this trial is count data over a 2-year period: the yield - number of UHR for psychosis referrals to a specialist early intervention in psychosis service - per primary care practice. Discussion There is little guidance on the essential components of effective and cost-effective educational interventions in primary mental health care. Furthermore, no study has demonstrated an effect of a theory-based intervention to help GPs identify young people at UHR for psychosis. This study protocol is underpinned by a robust scientific rationale that intends to address these limitations. Trial registration Current Controlled Trials ISRCTN70185866 PMID:23866815

  20. A method of building information extraction based on mathematical morphology and multiscale

    NASA Astrophysics Data System (ADS)

    Li, Jing-wen; Wang, Ke; Zhang, Zi-ping; Xue, Long-li; Yin, Shou-qiang; Zhou, Song

    2015-12-01

    To monitor changes in buildings on the Earth's surface, this paper analyzes the distribution characteristics of buildings in remote sensing imagery and, combining multi-scale image segmentation with the advantages of mathematical morphology, proposes a multi-scale, mathematical-morphology-based segmentation method for high-resolution remote sensing images. It then uses a multiple fuzzy classification method and a shadow-assisted method to extract building information. Compared with k-means classification and the traditional maximum likelihood classification method, the experimental results show that the proposed segmentation and extraction method can accurately extract building structure information and produce clearer classification data, providing a basis and theoretical support for intelligent monitoring of Earth-observation data.
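    Morphological opening, erosion followed by dilation, is the basic operation behind this kind of building extraction: it removes speckle smaller than the structuring element while preserving larger building footprints. The binary sketch below is a toy illustration (small image, square structuring element), not the paper's pipeline:

    ```python
    def erode(img, r=1):
        """Binary erosion with a (2r+1) x (2r+1) square structuring element."""
        h, w = len(img), len(img[0])
        return [[int(all(0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                         for di in range(-r, r + 1) for dj in range(-r, r + 1)))
                 for j in range(w)] for i in range(h)]

    def dilate(img, r=1):
        """Binary dilation with the same square structuring element."""
        h, w = len(img), len(img[0])
        return [[int(any(0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                         for di in range(-r, r + 1) for dj in range(-r, r + 1)))
                 for j in range(w)] for i in range(h)]

    def opening(img, r=1):
        """Morphological opening: erosion followed by dilation."""
        return dilate(erode(img, r), r)

    # Toy binary image: a 4x4 "building" footprint plus one speckle-noise pixel
    img = [[0] * 8 for _ in range(8)]
    for i in range(1, 5):
        for j in range(1, 5):
            img[i][j] = 1
    img[6][6] = 1  # isolated noise

    opened = opening(img)
    print(sum(map(sum, opened)))  # the 16-pixel building survives, noise is gone
    ```

    Running the opening at several structuring-element sizes is one simple way to realize the multi-scale idea: each scale suppresses clutter below a different footprint size.
    
    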

  1. [Thinking on TCM literature evaluation methods and techniques based on mass information].

    PubMed

    Xie, Qi; Cui, Meng; Pan, Yan-li

    2007-08-01

    The necessity and feasibility of TCM literature evaluation based on mass TCM literature information are discussed in this paper. Beginning with a description of the current state of research on mass TCM literature information, the authors offer a tentative plan for evaluating scientific and technological TCM literature, together with its methods and techniques, and systematically analyze the key issues, such as subject selection, document screening and sorting, literature analysis, and the development of an analysis software platform. The methodology and technology for constituting an evaluation system based on mass TCM literature information are then systematically clarified.

  2. Measuring information flow in cellular networks by the systems biology method through microarray data.

    PubMed

    Chen, Bor-Sen; Li, Cheng-Wei

    2015-01-01

    In general, it is very difficult to measure the information flow in a cellular network directly. In this study, based on an information flow model and microarray data, we measured the information flow in cellular networks indirectly by using a systems biology method. First, we used a recursive least square parameter estimation algorithm to identify the system parameters of coupling signal transduction pathways and the cellular gene regulatory network (GRN). Then, based on the identified parameters and systems theory, we estimated the signal transductivities of the coupling signal transduction pathways from the extracellular signals to each downstream protein and the information transductivities of the GRN between transcription factors in response to environmental events. According to the proposed method, the information flow, which is characterized by signal transductivity in coupling signaling pathways and information transductivity in the GRN, can be estimated by microarray temporal data or microarray sample data. It can also be estimated by other high-throughput data such as next-generation sequencing or proteomic data. Finally, the information flows of the signal transduction pathways and the GRN in leukemia cancer cells and non-leukemia normal cells were also measured to analyze the systematic dysfunction in this cancer from microarray sample data. The results show that the signal transductivities of signal transduction pathways change substantially from normal cells to leukemia cancer cells.
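    The recursive least squares (RLS) step used for parameter identification can be sketched generically. The toy linear system below is hypothetical (it is not the signaling-pathway model itself); it shows the defining property of RLS: each new observation updates the parameter estimate and the inverse correlation matrix without re-solving the full regression.

    ```python
    import random

    def rls_identify(xs, ys, dim, lam=1.0, delta=100.0):
        """Recursive least squares estimate of w in y = w . x + noise.
        lam is the forgetting factor; P starts as delta * I."""
        w = [0.0] * dim
        P = [[delta if i == j else 0.0 for j in range(dim)] for i in range(dim)]
        for x, y in zip(xs, ys):
            Px = [sum(P[i][j] * x[j] for j in range(dim)) for i in range(dim)]
            denom = lam + sum(x[i] * Px[i] for i in range(dim))
            k = [pi / denom for pi in Px]                   # gain vector
            err = y - sum(w[i] * x[i] for i in range(dim))  # a priori error
            w = [w[i] + k[i] * err for i in range(dim)]
            P = [[(P[i][j] - k[i] * Px[j]) / lam for j in range(dim)]
                 for i in range(dim)]
        return w

    # Recover known coefficients from noiseless observations
    random.seed(1)
    w_true = [2.0, -1.0, 0.5]
    xs = [[random.gauss(0, 1) for _ in range(3)] for _ in range(50)]
    ys = [sum(wt * xi for wt, xi in zip(w_true, x)) for x in xs]
    w_est = rls_identify(xs, ys, 3)
    print(w_est)  # close to [2.0, -1.0, 0.5]
    ```

    With a forgetting factor `lam < 1`, the same recursion tracks slowly drifting parameters, which is what makes it suitable for temporal microarray data.
    
    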

  3. Mechanisms of behavioural maintenance: Long-term effects of theory-based interventions to promote safe water consumption.

    PubMed

    Inauen, Jennifer; Mosler, Hans-Joachim

    2016-01-01

    Theory-based interventions can enhance people's safe water consumption, but the sustainability of these interventions and the mechanisms of maintenance remain unclear. We investigated these questions based on an extended theory of planned behaviour. Seven hundred and ten (445 analysed) randomly selected households participated in two cluster-randomised controlled trials in Bangladesh. Study 1 promoted switching to neighbours' arsenic-safe wells, and Study 2 promoted switching to arsenic-safe deep wells. Both studies included two intervention phases. Structured interviews were conducted at baseline (T1), and at 1-month (T2), 2-month (T3) and 9-month (T4) follow-ups. In intervention phase 1 (between T1 and T2), commitment-based behaviour change techniques--reminders, implementation intentions and public commitment--were combined with information and compared to an information-only control group. In phase 2 (between T2 and T3), half of each phase 1 intervention group was randomly assigned to receive either commitment-based techniques once more or coping planning with reminders and information. Initial well-switching rates of up to 60% significantly declined by T4: 38.3% of T2 safe water users stopped consuming arsenic-safe water. The decline depended on the intervention. Perceived behavioural control, intentions, commitment strength and coping planning were associated with maintenance. In line with previous studies, the results indicate that commitment and reminders engender long-term behavioural change. PMID:26304476

  4. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

    PubMed

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  5. A theory-based logic model for innovation policy and evaluation.

    SciTech Connect

    Jordan, Gretchen B.

    2010-04-01

    Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.

  8. Theory-based evaluation of a comprehensive Latino education initiative: an interactive evaluation approach.

    PubMed

    Nesman, Teresa M; Batsche, Catherine; Hernandez, Mario

    2007-08-01

    Latino student access to higher education has received significant national attention in recent years. This article describes a theory-based evaluation approach used with ENLACE of Hillsborough, a 5-year project funded by the W.K. Kellogg Foundation for the purpose of increasing Latino student graduation from high school and college. Theory-based evaluation guided planning, implementation as well as evaluation through the process of developing consensus on the Latino population of focus, adoption of culturally appropriate principles and values to guide the project, and identification of strategies to reach, engage, and impact outcomes for Latino students and their families. The approach included interactive development of logic models that focused the scope of interventions and guided evaluation designs for addressing three stages of the initiative. Challenges and opportunities created by the approach are discussed, as well as ways in which the initiative impacted Latino students and collaborating educational institutions.

  9. An information preserving method for producing full coverage CoRoT light curves

    NASA Astrophysics Data System (ADS)

    Pascual-Granado, J.; Garrido, R.; Suárez, J. C.

    2015-09-01

    Invalid flux measurements, caused mainly by the South Atlantic Anomaly crossings of the CoRoT satellite, introduce aliases in the periodogram and wrong amplitudes. It has been demonstrated that replacing such invalid data with a linear interpolation is not harmless. On the other hand, using power spectrum estimators for unevenly sampled time series is not only less computationally efficient but also leads to difficulties in the interpretation of the results. Therefore, even when the gaps are rather small and the duty cycle is high enough, the use of gap-filling methods is a gain in frequency analysis. However, the method must preserve the information contained in the time series. In this work we give a short description of an information preserving method (MIARMA) and show some results when applying it to CoRoT seismo light curves. The method is implemented as the second step of a pipeline for CoRoT data analysis.

  10. Theory-based Low-Sodium Diet Education for Heart Failure Patients

    PubMed Central

    Welsh, Darlene; Marcinek, Regina; Abshire, Demetrius; Lennie, Terry; Biddle, Martha; Bentley, Brooke; Moser, Debra

    2010-01-01

    Theory-based teaching strategies for promoting adherence to a low-sodium diet among patients with heart failure are presented in this manuscript. The strategies, which are based on the theory of planned behavior, address patient attitude, subjective norm, and perceived control as they learn how to follow a low-sodium diet. Home health clinicians can select a variety of the instructional techniques presented to meet individual patient learning needs. PMID:20592543

  11. The Effect of Health Information Technology on Health Care Provider Communication: A Mixed-Method Protocol

    PubMed Central

    Adler-Milstein, Julia; Harrod, Molly; Sales, Anne; Hofer, Timothy P; Saint, Sanjay; Krein, Sarah L

    2015-01-01

    Background Communication failures between physicians and nurses are one of the most common causes of adverse events for hospitalized patients, as well as a major root cause of all sentinel events. Communication technology (ie, the electronic medical record, computerized provider order entry, email, and pagers), which is a component of health information technology (HIT), may help reduce some communication failures but increase others because of an inadequate understanding of how communication technology is used. Increasing use of health information and communication technologies is likely to affect communication between nurses and physicians. Objective The purpose of this study is to describe, in detail, how health information and communication technologies facilitate or hinder communication between nurses and physicians with the ultimate goal of identifying how we can optimize the use of these technologies to support effective communication. Effective communication is the process of developing shared understanding between communicators by establishing, testing, and maintaining relationships. Our theoretical model, based in communication and sociology theories, describes how health information and communication technologies affect communication through communication practices (ie, use of rich media; the location and availability of computers) and work relationships (ie, hierarchies and team stability). Therefore we seek to (1) identify the range of health information and communication technologies used in a national sample of medical-surgical acute care units, (2) describe communication practices and work relationships that may be influenced by health information and communication technologies in these same settings, and (3) explore how differences in health information and communication technologies, communication practices, and work relationships between physicians and nurses influence communication. Methods This 4-year study uses a sequential mixed-methods

  12. An overview of methods and applications to value informal care in economic evaluations of healthcare.

    PubMed

    Koopmanschap, Marc A; van Exel, Job N A; van den Berg, Bernard; Brouwer, Werner B F

    2008-01-01

    This paper compares several applied valuation methods for including informal care in economic evaluations of healthcare programmes: the proxy good method; the opportunity cost method; the contingent valuation method (CVM); conjoint measurement (CM); and valuation of health effects in terms of health-related quality of life (HR-QOL) and well-being. The comparison focuses on three questions: what outcome measures are available for including informal care in economic evaluations of healthcare programmes; whether these measures are compatible with the common types of economic evaluation; and, when applying these measures, whether all relevant aspects of informal care are incorporated. All types of economic evaluation can incorporate a monetary value of informal care (using the opportunity cost method, the proxy good method, CVM and CM) on the cost side of an analysis, but only when the relevant aspects of time costs have been valued. On the effect side of a cost-effectiveness or cost-utility analysis, the health effects (for the patient and/or caregiver) measured in natural units or QALYs can be combined with cost estimates based on the opportunity cost method or the proxy good method. One should be careful when incorporating CVM and CM in cost-minimization, cost-effectiveness and cost-utility analyses, as the health effects of patients receiving informal care and the carers themselves may also have been valued separately. One should determine whether the caregiver valuation exercise allows combination with other valuation techniques. In cost-benefit analyses, CVM and CM appear to be the best tools for the valuation of informal care. When researchers decide to use the well-being method, we recommend applying it in a cost-benefit analysis framework. This method values overall QOL (happiness); hence it is broader than just HR-QOL, which complicates inclusion in traditional health economic evaluations that normally define outcomes more narrowly. Using broader, non

  14. Application of information-retrieval methods to the classification of physical data

    NASA Technical Reports Server (NTRS)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  15. Method and Apparatus Providing Deception and/or Altered Operation in an Information System Operating System

    DOEpatents

    Cohen, Fred; Rogers, Deanna T.; Neagoe, Vicentiu

    2008-10-14

    A method and/or system and/or apparatus providing deception and/or execution alteration in an information system. In specific embodiments, deceptions and/or protections are provided by intercepting and/or modifying operation of one or more system calls of an operating system.

  16. The Implementation and Effectiveness of Geographic Information Systems Technology and Methods in Secondary Education

    ERIC Educational Resources Information Center

    Kerski, Joseph J.

    2003-01-01

    Geographic information systems (GIS) technology and methods have transformed decision-making in society by bringing geographic analysis to the desktop computer. Although some educators consider GIS to be a promising means for implementing reform, it has been adopted by less than 2 percent of American high schools. The reasons behind the interest…

  17. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  18. An Inquiry-Based Approach to Teaching Research Methods in Information Studies

    ERIC Educational Resources Information Center

    Albright, Kendra; Petrulis, Robert; Vasconcelos, Ana; Wood, Jamie

    2012-01-01

    This paper presents the results of a project that aimed at restructuring the delivery of research methods training at the Information School at the University of Sheffield, UK, based on an Inquiry-Based Learning (IBL) approach. The purpose of this research was to implement inquiry-based learning that would allow customization of research methods…

  19. Parenting Practices of Anxious and Nonanxious Mothers: A Multi-Method, Multi-Informant Approach

    ERIC Educational Resources Information Center

    Drake, Kelly L.; Ginsburg, Golda S.

    2011-01-01

    Anxious and nonanxious mothers were compared on theoretically derived parenting and family environment variables (i.e., overcontrol, warmth, criticism, anxious modeling) using multiple informants and methods. Mother-child dyads completed questionnaires about parenting and were observed during an interactional task. Findings reveal that, after…

  20. Genetically Informative Research on Adolescent Substance Use: Methods, Findings, and Challenges

    ERIC Educational Resources Information Center

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

    Objective: To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method: A selective review of genetically informative research strategies, their limitations, and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents is presented.…

  1. 78 FR 68076 - Request for Information on Alternative Skin Sensitization Test Methods and Testing Strategies and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... HUMAN SERVICES National Institutes of Health Request for Information on Alternative Skin Sensitization... for the evaluation of alternative skin sensitization test methods and testing strategies. The National... niceatm@niehs.nih.gov are preferred. NICEATM, National Institute of Environmental Health Sciences,...

  2. Methods study for the relocation of visual information in central scotoma cases

    NASA Astrophysics Data System (ADS)

    Scherlen, Anne-Catherine; Gautier, Vincent

    2005-03-01

    In this study we test the benefit to reading performance of different ways of relocating the visual information hidden by a scotoma. Relocation (or unmasking) compensates for the loss of information and keeps the patient from developing compensatory strategies that are not suited to reading. Eight healthy subjects were tested on a reading task while a central scotoma of various sizes was simulated for each of them. We then evaluated reading speed (words/min) under three relocation methods, in which the masked information was moved: - to both sides of the scotoma, - to the right of the scotoma, - and, for only the letters essential to word recognition, to the right of the scotoma. We compared these reading speeds with the pathological condition, i.e. without relocating visual information. Our results show that the unmasking strategy improves reading speed when all the visual information is unmasked to the right of the scotoma, but only for large scotomas. Taking word morphology into account, perceiving only certain letters outside the scotoma can be sufficient to improve reading speed. A deeper understanding of reading processes in the presence of a scotoma will open new perspectives for visual information unmasking. The multidisciplinary competences of engineers, ophthalmologists, linguists, and clinicians would make it possible to optimize the reading benefit brought by unmasking.

  3. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers.

    PubMed

    Branscum, Paul; Lora, Karina R

    2016-01-01

    Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study's purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of child's consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine nutrition-related behaviors that mothers found as most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetables, and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis), and internal consistency reliability (Cronbach's alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers' consumption of fruits and vegetables, and SSB. PMID:27271643

  5. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers

    PubMed Central

    Branscum, Paul; Lora, Karina R.

    2016-01-01

    Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study’s purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of child’s consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine nutrition-related behaviors that mothers found as most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetables, and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis), and internal consistency reliability (Cronbach’s alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers’ consumption of fruits and vegetables, and SSB. PMID:27271643

  6. An information processing method for acoustic emission signal inspired from musical staff

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Wu, Chunxian

    2016-01-01

    This study proposes a musical-staff-inspired signal processing method that provides standard descriptive expressions for discrete signals and describes the integrated characteristics of acoustic emission (AE) signals. The method maps various AE signals from complex environments into a normalized musical space. Four new indexes are proposed to comprehensively describe the signal. Several key features, such as contour, amplitude, and signal changing rate, are quantitatively expressed in the normalized musical space. The processed information requires only a small storage space to maintain high fidelity. The method is illustrated using experiments on sandstones and computed tomography (CT) scanning to determine its validity for AE signal processing.

  7. Improvements in recall and food choices using a graphical method to deliver information of select nutrients.

    PubMed

    Pratt, Nathan S; Ellison, Brenna D; Benjamin, Aaron S; Nakamura, Manabu T

    2016-01-01

    Consumers have difficulty using nutrition information. We hypothesized that graphically delivering information of select nutrients relative to a target would allow individuals to process information in time-constrained settings more effectively than numerical information. Objectives of the study were to determine the efficacy of the graphical method in (1) improving memory of nutrient information and (2) improving consumer purchasing behavior in a restaurant. Values of fiber and protein per calorie were 2-dimensionally plotted alongside a target box. First, a randomized cued recall experiment was conducted (n=63). Recall accuracy of nutrition information improved by up to 43% when shown graphically instead of numerically. Second, the impact of graphical nutrition signposting on diner choices was tested in a cafeteria. Saturated fat and sodium information was also presented using color coding. Nutrient content of meals (n=362) was compared between 3 signposting phases: graphical, nutrition facts panels (NFP), or no nutrition label. Graphical signposting improved nutrient content of purchases in the intended direction, whereas NFP had no effect compared with the baseline. Calories ordered from total meals, entrées, and sides were significantly less during graphical signposting than no-label and NFP periods. For total meal and entrées, protein per calorie purchased was significantly higher and saturated fat significantly lower during graphical signposting than the other phases. Graphical signposting remained a predictor of calories and protein per calorie purchased in regression modeling. These findings demonstrate that graphically presenting nutrition information makes that information more available for decision making and influences behavior change in a realistic setting. PMID:26773780
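The core of the graphical signposting is plotting per-calorie nutrient density against a target region. The following sketch shows that computation; the nutrient values and target-box bounds are hypothetical, since the abstract does not publish the study's thresholds.

```python
# Hedged sketch of the signposting computation: fiber and protein per
# calorie are plotted against a 2D target box. The ranges below are
# hypothetical illustrations, not the study's actual thresholds.
def per_calorie(grams, calories):
    return grams / calories

def in_target_box(fiber_g, protein_g, calories,
                  fiber_range=(0.005, 0.05), protein_range=(0.02, 0.15)):
    """Return True if both per-calorie densities fall inside the target box."""
    f = per_calorie(fiber_g, calories)
    p = per_calorie(protein_g, calories)
    return (fiber_range[0] <= f <= fiber_range[1]
            and protein_range[0] <= p <= protein_range[1])

# A hypothetical 400-kcal entree with 6 g fiber and 24 g protein:
choice_ok = in_target_box(fiber_g=6, protein_g=24, calories=400)
```

Plotting each menu item as a point in this space lets a diner judge it against the box at a glance, which is the "time-constrained" advantage claimed over numeric panels.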

  8. Method for Bandwidth Compression and Transmission of Environmental Information in Bilateral Teleoperation

    NASA Astrophysics Data System (ADS)

    Kubo, Ryogo; Ohnishi, Kouhei

    In this paper, a novel method for bandwidth compression and transmission of environmental information is proposed for bilateral teleoperation systems with multiple degrees of freedom (MDOF). In this method, environmental information, i.e., the position of end-effectors and the reaction force exerted on them, is converted into environmental modes by using discrete Fourier transform (DFT) matrices. The environmental modes to be transmitted are then selected on the basis of the communication bandwidth between master and slave robots. Bilateral control is achieved in low-frequency modal spaces, and local position control is achieved in high-frequency modal spaces. The validity of the proposed method is confirmed by performing an experiment.
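The modal conversion described above can be sketched briefly. The abstract gives no controller details, so the dimensions and positions below are hypothetical; the sketch only shows how a unitary DFT matrix converts end-effector coordinates into modes, of which only the first few low-frequency ones would be sent over a limited channel.

```python
import numpy as np

# Sketch of the modal-transform idea (the actual bilateral controller is
# not specified in the abstract): a DFT matrix maps coordinates of an
# n-DOF system into "environmental modes"; only the first `keep`
# low-frequency modes need the master-slave communication channel.
def dft_matrix(n):
    k = np.arange(n)
    return np.exp(-2j * np.pi * np.outer(k, k) / n) / np.sqrt(n)  # unitary DFT

def compress(positions, keep):
    """Transform positions into modes and zero the high-frequency modes."""
    W = dft_matrix(len(positions))
    modes = W @ positions
    modes[keep:] = 0            # transmit only the first `keep` modes
    return modes

def reconstruct(modes):
    W = dft_matrix(len(modes))
    return np.real(W.conj().T @ modes)   # inverse of the unitary DFT

x = np.array([0.10, 0.12, 0.11, 0.09])   # hypothetical 4-DOF positions (m)
x_hat = reconstruct(compress(x, keep=4)) # all modes kept: exact recovery
```

Zeroing the top rows of a complex spectrum is a crude low-pass for sketch purposes; a real implementation would respect the conjugate symmetry of real signals so that reconstructions stay exactly real, and would run bilateral control in the transmitted modal spaces and local control in the discarded ones.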

  9. Preferred Methods for Delivery of Technological Information by the North Carolina Agricultural Extension Service: Opinions of Agricultural Producers Who Use Extension Information.

    ERIC Educational Resources Information Center

    Richardson, John G.; Mustian, R. David

    The findings of a questionnaire survey of 702 North Carolina agricultural producers indicated that communication methods historically used by the North Carolina Agricultural Extension Service for information dissemination are accepted by state farmers and continue to be popular. Information delivery methods most frequently preferred are…

  10. APhoRISM FP7 project: the A Priori information for Earthquake damage mapping method

    NASA Astrophysics Data System (ADS)

    Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno

    2014-05-01

    The APhoRISM - Advanced PRocedure for volcanIc and Seismic Monitoring - project is an FP7-funded project that aims at developing and testing two new methods for combining Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new, improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes and concerns the generation of maps for detecting and estimating the damage caused by an earthquake. The method is named APE - A Priori information for Earthquake damage mapping. The use of satellite data to investigate earthquake damage is not in itself new: a wide literature and many projects have addressed this issue, but the proposed approaches are usually based only on change detection techniques and/or classification algorithms. The novelty of APhoRISM-APE relies on the exploitation of a priori information derived from: - InSAR time series to measure surface movements; - shakemaps obtained from seismological data; - vulnerability information. This a priori information is then integrated with a change detection map from Earth observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and limit false alarms.

  11. Spectral-spatial classification combined with diffusion theory based inverse modeling of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paluchowski, Lukasz A.; Bjorgan, Asgeir; Nordgaard, Håvard B.; Randeberg, Lise L.

    2016-02-01

    Hyperspectral imagery opens a new perspective for biomedical diagnostics and tissue characterization. High spectral resolution can give insight into the optical properties of skin tissue. At the same time, however, the amount of collected data represents a challenge when it comes to decomposition into clusters and extraction of useful diagnostic information. In this study, spectral-spatial classification and inverse diffusion modeling were applied to hyperspectral images obtained from a porcine burn model using a hyperspectral push-broom camera. The implemented method takes advantage of spatial and spectral information simultaneously, and provides information about the average optical properties within each cluster. The implemented algorithm allows mapping of the spectral and spatial heterogeneity of the burn injury, as well as of dynamic changes of spectral properties within the burn area. The combination of statistical and physics-informed tools allowed an initial separation of the different burn wounds and a further detailed characterization of the injuries at short post-injury times.

  12. Developing and testing theory-based and evidence-based interventions to promote switching to arsenic-safe wells in Bangladesh.

    PubMed

    Inauen, Jennifer; Mosler, Hans-Joachim

    2014-12-01

    Millions of people in Bangladesh drink arsenic-contaminated water despite increased awareness of consequences to health. Theory-based and evidence-based interventions are likely to have greater impact on people switching to existing arsenic-safe wells than providing information alone. To test this assumption, we first developed interventions based on an empirical test of the Risk, Attitudes, Norms, Abilities and Self-regulation (RANAS) model of behaviour change. In the second part of this study, a cluster-randomised controlled trial revealed that in accordance with our hypotheses, information alone showed smaller increases in switching to arsenic-safe wells than information with reminders or information with reminders and implementation intentions.

  14. Development and Usability of REACH: A Tailored Theory-Based Text Messaging Intervention for Disadvantaged Adults With Type 2 Diabetes

    PubMed Central

    Nelson, Lyndsay A; Mayberry, Lindsay S; Wallston, Kenneth; Kripalani, Sunil; Bergner, Erin M

    2016-01-01

    Background Among adults with type 2 diabetes mellitus (T2DM), adherence to recommended self-care activities is suboptimal, especially among racial and ethnic minorities with low income. Self-care nonadherence is associated with having worse glycemic control and diabetes complications. Text messaging interventions are improving the self-care of adults with T2DM, but few have been tested with disadvantaged populations. Objective To develop Rapid Education/Encouragement And Communications for Health (REACH), a tailored, text messaging intervention to support the self-care adherence of disadvantaged patients with T2DM, based on the Information-Motivation-Behavioral skills model. We then tested REACH’s usability to make improvements before evaluating its effects. Methods We developed REACH’s content and functionality using an empirical and theory-based approach, findings from a previously pilot-tested intervention, and the expertise of our interdisciplinary research team. We recruited 36 adults with T2DM from Federally Qualified Health Centers to participate in 1 of 3 rounds of usability testing. For 2 weeks, participants received daily text messages assessing and promoting self-care, including tailored messages addressing users’ unique barriers to adherence, and weekly text messages with adherence feedback. We analyzed quantitative and qualitative user feedback and system-collected data to improve REACH. Results Participants were, on average, 52.4 (SD 9.5) years old, 56% (20/36) female, 63% (22/35) were a racial or ethnic minority, and 67% (22/33) had an income less than US $35,000. About half were taking insulin, and average hemoglobin A1c level was 8.2% (SD 2.2%). We identified issues (eg, user concerns with message phrasing, technical restrictions with responding to assessment messages) and made improvements between testing rounds. Overall, participants favorably rated the ease of understanding (mean 9.6, SD 0.7) and helpfulness (mean 9.3, SD 1.4) of self

  15. Extracting important information from Chinese Operation Notes with natural language processing methods.

    PubMed

    Wang, Hui; Zhang, Weide; Zeng, Qiang; Li, Zuofeng; Feng, Kaiyan; Liu, Lei

    2014-04-01

    Extracting information from unstructured clinical narratives is valuable for many clinical applications. Although natural language processing (NLP) methods have been studied in depth for electronic medical records (EMR), few studies have explored NLP for extracting information from Chinese clinical narratives. In this study, we report the development and evaluation of a system for extracting tumor-related information from operation notes of hepatic carcinomas written in Chinese. Using 86 operation notes manually annotated by physicians as the training set, we explored both rule-based and supervised machine-learning approaches. Evaluated on 29 unseen operation notes, our best approach yielded 69.6% precision, 58.3% recall, and an F-score of 63.5%.
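As a quick sanity check, the reported F-score is the harmonic mean of the reported precision and recall:

```python
# Check that the reported F-score follows from the reported precision
# and recall (values taken from the abstract).
precision, recall = 0.696, 0.583
f_score = 2 * precision * recall / (precision + recall)
# f_score ≈ 0.635, matching the 63.5% reported
```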

  16. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs
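For intuition about the filter family being compared, a minimal 1D non-local means filter is sketched below. This is plain NLM, not the paper's CT-guided variants (NLM CT-B/M/S/H); side information such as a registered CT would enter by modifying the patch-similarity weights. The patch size, search radius, and smoothing parameter h are illustrative choices.

```python
import numpy as np

# Minimal 1D non-local means sketch: each sample is replaced by a weighted
# average of samples whose surrounding patches look similar. High-quality
# side information (e.g. CT) would reshape these weights.
def nlm_1d(signal, patch=3, search=10, h=0.1):
    n = len(signal)
    padded = np.pad(signal, patch, mode="reflect")
    out = np.empty(n)
    for i in range(n):
        p_i = padded[i:i + 2 * patch + 1]          # patch centred at i
        lo, hi = max(0, i - search), min(n, i + search + 1)
        weights, values = [], []
        for j in range(lo, hi):
            p_j = padded[j:j + 2 * patch + 1]      # candidate patch at j
            d2 = np.mean((p_i - p_j) ** 2)         # patch dissimilarity
            weights.append(np.exp(-d2 / h ** 2))
            values.append(signal[j])
        w = np.array(weights)
        out[i] = np.dot(w, values) / w.sum()
    return out

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.0], 50)             # piecewise-constant signal
noisy = clean + 0.1 * rng.standard_normal(150)
denoised = nlm_1d(noisy)
```

Because patches straddling the step edge are very dissimilar to patches on either flat region, their weights vanish and the edge survives the averaging, which is why NLM can suppress noise at high OSEM iteration counts without the blurring of a Gaussian filter.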

  17. [Method of multi-resolution 3D image registration by mutual information].

    PubMed

    Ren, Haiping; Wu, Wenkai; Yang, Hu; Chen, Shengzu

    2002-12-01

    Maximization of mutual information is a powerful criterion for 3D medical image registration, allowing robust and fully automated rigid registration of multi-modal images in various applications. In this paper, a method based on normalized mutual information for 3D image registration is presented for CT, MR and PET images. Powell's direction set method and Brent's one-dimensional optimization algorithm were used as the optimization strategy. A multi-resolution approach is applied to speed up the matching process. For PET images, segmentation was performed as a pre-processing step to reduce background artefacts. According to the evaluation by Vanderbilt University, sub-voxel accuracy in multi-modality registration has been achieved with this algorithm. PMID:12561358
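
    The registration criterion named in this record, normalized mutual information, can be computed from a joint intensity histogram as in the sketch below; Powell's method would then search the rigid-transform parameters that maximize this value. The bin count is an illustrative choice.

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """Normalized mutual information (H(A) + H(B)) / H(A, B)
    estimated from a joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0
    h_xy = -np.sum(pxy[nz] * np.log(pxy[nz]))   # joint entropy
    h_x = -np.sum(px[px > 0] * np.log(px[px > 0]))
    h_y = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return (h_x + h_y) / h_xy
```

Perfectly aligned identical images give the maximum value of 2; misalignment disperses the joint histogram and lowers the score.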

  18. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA and describe. PMID:25676999

  19. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is given in the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  1. Immunohistochemistry relative to gravity: a simple method to retain information about gravity for immunolocalization and histochemistry.

    PubMed

    Harrison, Benjamin R; Masson, Patrick H

    2015-01-01

    We describe a simple method to preserve information about a plant organ's orientation relative to the direction of the gravity vector during sample processing for immunolocalization or histochemical analysis of cell biological processes. This approach has been used in gravity-stimulated roots of Arabidopsis thaliana and Zea mays to study PIN3 relocalization and the asymmetric remodeling of the actin network and the cortical microtubule array, and to reveal the asymmetric expression of the auxin signaling reporter DR5::GUS. This method enables the rapid analysis of a large number of samples from a variety of genotypes, as well as from tissue that may be too thick for microscopy in live plants.

  2. [Land salinization information extraction method based on HSI hyperspectral and TM imagery].

    PubMed

    Li, Jin; Zhao, Geng-Xing; Chang, Chun-Yan; Liu, Hai-Teng

    2014-02-01

    This paper chose the typical salinization area in Kenli County of the Yellow River Delta as the study area, selected an HJ-1A satellite HSI image from March 15, 2011 and a TM image from March 22, 2011 as information sources, and pre-processed these data by image cropping, geometric correction and atmospheric correction. Spectral characteristics of the main land use types, including lands with different degrees of salinization, water and shoals, were analyzed to find distinct bands for information extraction. A land use information extraction model was built by adopting quantitative and qualitative rules combining the spectral characteristics and the soil salinity content. Land salinization information was extracted via image classification using a decision tree method. The remote sensing image interpretation accuracy was verified against the land salinization degree determined through chemical analysis of soil samples. In addition, the classification accuracies of the hyperspectral and multi-spectral images were analyzed and compared. The results showed that the overall image classification accuracy of HSI was 96.43% with a Kappa coefficient of 95.59%, while the overall image classification accuracy of TM was 89.17% with a Kappa coefficient of 86.74%. Therefore, compared to multi-spectral TM data, hyperspectral imagery is more accurate and efficient for land salinization information extraction. The classification map also showed that the soil salinity distinction degree of the hyperspectral image was higher than that of the multi-spectral image. This study explored land salinization information extraction techniques from hyperspectral imagery, extracted the spatial distribution and area ratio of land with different degrees of salinization, and provided a decision-making basis for the scientific utilization and management of coastal salinized land resources in the Yellow River Delta.
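
    The accuracy figures reported in this record (overall accuracy and Kappa coefficient) are standard confusion-matrix statistics; a minimal sketch of how they are computed is below. The example matrix is illustrative, not the paper's data.

```python
import numpy as np

def overall_accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                  # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2  # chance agreement
    return po, (po - pe) / (1 - pe)
```

Note the paper reports kappa as a percentage (e.g., 95.59%), i.e., the value above multiplied by 100.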

  4. ROI-preserving 3D video compression method utilizing depth information

    NASA Astrophysics Data System (ADS)

    Ti, Chunli; Xu, Guodong; Guan, Yudong; Teng, Yidan

    2015-09-01

    Efficiently transmitting the extra information of three dimensional (3D) video is becoming a key issue in the development of 3DTV. The 2D-plus-depth format not only occupies less bandwidth and is compatible with transmission over existing channels, but can also provide technical support for advanced 3D video compression to some extent. This paper proposes an ROI-preserving compression scheme to further improve visual quality at a limited bit rate. Based on the connection between the focus of the human visual system (HVS) and depth information, regions of interest (ROI) can be selected automatically via depth map processing. The main improvement over common methods is that a mean-shift based segmentation is applied to the depth map before foreground ROI selection to keep the integrity of the scene. Besides, the sensitive areas along edges are also protected. Spatio-temporal filtering adapted to H.264 is applied to the non-ROI regions of both the 2D video and the depth map before compression. Experiments indicate that the ROI extracted by this method is more intact and accords better with subjective perception, and that the proposed method keeps the key high-frequency information more effectively while the bit rate is reduced.

  5. A Method to Quantify Visual Information Processing in Children Using Eye Tracking

    PubMed Central

    Kooiker, Marlou J.G.; Pel, Johan J.M.; van der Steen-Kant, Sanny P.; van der Steen, Johannes

    2016-01-01

    Visual problems that occur early in life can have a major impact on a child's development. Without verbal communication and based only on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four-choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii) construct an individual visual profile for each child. PMID:27500922
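
    One of the output parameters named in this record, the reaction time to a stimulus quadrant, could be derived from gaze samples roughly as sketched below. The sample interval and the consecutive-sample fixation criterion are assumptions for illustration, not the paper's definitions.

```python
def reaction_time_ms(gaze_quadrants, target, sample_ms=16.7, min_samples=6):
    """First time the gaze settles on the target quadrant for at least
    `min_samples` consecutive samples; None if it never does.

    gaze_quadrants : per-sample quadrant labels from the eye tracker
    target         : quadrant in which the stimulus was shown
    """
    run = 0
    for i, q in enumerate(gaze_quadrants):
        run = run + 1 if q == target else 0
        if run == min_samples:
            # time of the first sample in the qualifying run
            return (i - min_samples + 1) * sample_ms
    return None
```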

  7. A kinetic theory based numerical study of core collapse supernova dynamics

    NASA Astrophysics Data System (ADS)

    Strother, Terrance T.

    The explosion mechanism of core collapse supernovae remains an unsolved problem in astrophysics after many decades of theoretical and numerical study. The complex nature of this problem forces its consideration to rely heavily upon numerical simulations. Current state-of-the-art core collapse supernova simulations typically make use of hydrodynamic codes for the modeling of baryon dynamics coupled to a Boltzmann transport simulation for the neutrinos and other leptons. The results generated by such numerical simulations have given rise to the widely accepted notion that neutrino heating and convection are crucial for the explosion mechanism. However, the precise roles that factors such as neutrino production and propagation, rotation, three-dimensional effects, the equation of state for asymmetric nuclear matter, general relativity, instabilities, magnetic fields, and others play in the explosion mechanism remain to be fully determined. In this work, we review some of the current methods used to simulate core collapse supernovae, and the various scenarios that have been developed by numerical studies are discussed. Unlike most numerical simulations of core collapse supernovae, we employ a kinetic theory based approach that allows us to explicitly model the propagation of neutrinos and a full ensemble of nuclei. Both of these are significant advantages. The ability to explicitly model the propagation of neutrinos puts their treatment on equal footing with the modeling of baryon dynamics. No simplifying assumptions about the nature of neutrino-matter interactions need to be made, and consequently our code is capable of producing output about the flow of neutrinos that most other simulations are inherently incapable of. Furthermore, neutrino flavor oscillations are readily incorporated with our approach. The ability to model the propagation of a full ensemble of nuclei is superior to the standard tracking of free baryons, alpha particles, and a

  8. Data Delivery Method Based on Neighbor Nodes' Information in a Mobile Ad Hoc Network

    PubMed Central

    Hayashi, Takuma; Taenaka, Yuzo; Okuda, Takeshi; Yamaguchi, Suguru

    2014-01-01

    This paper proposes a data delivery method based on neighbor nodes' information to achieve reliable communication in a mobile ad hoc network (MANET). In a MANET, it is difficult to deliver data reliably due to instabilities in the network topology and wireless network conditions that result from node movement. To overcome such unstable communication, opportunistic routing and network coding schemes have lately attracted considerable attention. Although an existing method that employs such schemes, MAC-independent opportunistic routing and encoding (MORE) (Chachulski et al., 2007), improves the efficiency of data delivery in an unstable wireless mesh network, it does not address node movement. To deliver data efficiently in a MANET, the method proposed in this paper therefore employs the same opportunistic routing and network coding used in MORE, and also uses the location information and transmission probabilities of neighbor nodes to adapt to changing network topology and wireless network conditions. Simulation experiments showed that the proposed method can achieve efficient data delivery with low network load when the movement speed is relatively slow. PMID:24672371

  9. Data delivery method based on neighbor nodes' information in a mobile ad hoc network.

    PubMed

    Kashihara, Shigeru; Hayashi, Takuma; Taenaka, Yuzo; Okuda, Takeshi; Yamaguchi, Suguru

    2014-01-01

    This paper proposes a data delivery method based on neighbor nodes' information to achieve reliable communication in a mobile ad hoc network (MANET). In a MANET, it is difficult to deliver data reliably due to instabilities in the network topology and wireless network conditions that result from node movement. To overcome such unstable communication, opportunistic routing and network coding schemes have lately attracted considerable attention. Although an existing method that employs such schemes, MAC-independent opportunistic routing and encoding (MORE) (Chachulski et al., 2007), improves the efficiency of data delivery in an unstable wireless mesh network, it does not address node movement. To deliver data efficiently in a MANET, the method proposed in this paper therefore employs the same opportunistic routing and network coding used in MORE, and also uses the location information and transmission probabilities of neighbor nodes to adapt to changing network topology and wireless network conditions. Simulation experiments showed that the proposed method can achieve efficient data delivery with low network load when the movement speed is relatively slow.
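
    MORE itself relies on random linear network coding, which is not reproduced here. The opportunistic-routing idea of using per-neighbor transmission probabilities, as this record describes, can be sketched as follows; the greedy selection rule and the target probability are illustrative assumptions.

```python
def expected_delivery(probabilities):
    """Probability that at least one candidate forwarder receives a
    broadcast, given independent per-neighbor link delivery probabilities."""
    p_none = 1.0
    for p in probabilities:
        p_none *= 1.0 - p
    return 1.0 - p_none

def choose_forwarders(candidates, target=0.95):
    """Greedily add the most reliable neighbors until the probability
    that at least one of them receives the packet reaches `target`.

    candidates : dict mapping neighbor id -> link delivery probability
    """
    chosen = []
    for node, p in sorted(candidates.items(), key=lambda kv: -kv[1]):
        chosen.append(node)
        if expected_delivery([candidates[n] for n in chosen]) >= target:
            break
    return chosen
```

In the paper's setting, these probabilities would be updated continuously from neighbor location information as nodes move.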

  10. Aircraft target onboard detecting technology via Circular Information Matching method for remote sensing satellite

    NASA Astrophysics Data System (ADS)

    Xiao, Huachao; Zhou, Quan; Li, Li

    2015-10-01

    Onboard image information processing is an important technology for rapidly deriving intelligence on remote sensing satellites. As a typical target, onboard aircraft detection has been receiving more attention. In this paper, we propose an efficient method of aircraft detection for remote sensing satellite onboard processing. According to the appearance of aircraft in remote sensing images, the detection algorithm consists of two steps. First, salient object detection (SOD) is employed to reduce the amount of calculation on large remote sensing images. SOD uses Gabor filtering and a simple binary test between pixels in the filtered image. White points are connected into regions, and aircraft candidate regions are screened from the white regions by the area, length and width of each connected region. Next, a new algorithm, called the Circumferential Information Matching method, is used to detect aircraft in the candidate regions. Tests show that the circumference curve around the plane center is a stable shape feature, so the candidate regions can be accurately detected with it. For rotation invariance, we use a circular matched filter to detect the target, and the discrete fast Fourier transform (DFFT) is used to accelerate and reduce the calculation. Experiments show the detection accuracy of the proposed algorithm is 90% with less than 0.5 s of processing time. In addition, quantitative analysis shows that the computational cost of the proposed method is very small. Experimental results and theoretical analysis show that the proposed method is reasonable and highly efficient.
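
    The core trick this record describes, matching a circular signature independently of rotation via the FFT, can be sketched as below: rotating the target cyclically shifts the 1-D signature sampled around the circle, and circular cross-correlation over all shifts is computed in O(N log N) with the FFT. The normalization is an illustrative choice.

```python
import numpy as np

def circular_correlation(sig, template):
    """Circular cross-correlation of two 1-D signatures via the FFT."""
    f = np.fft.fft(sig)
    g = np.fft.fft(template)
    return np.fft.ifft(f * np.conj(g)).real

def rotation_invariant_score(sig, template):
    """Best correlation over all cyclic shifts of zero-mean, unit-variance
    signatures; 1.0 means a perfect match up to rotation."""
    sig = (sig - sig.mean()) / sig.std()
    template = (template - template.mean()) / template.std()
    return circular_correlation(sig, template).max() / len(sig)
```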

  11. Control theory based airfoil design for potential flow and a finite volume discretization

    NASA Technical Reports Server (NTRS)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.

  12. Theory-Based Interventions in Physical Activity: A Systematic Review of Literature in Iran

    PubMed Central

    Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya

    2015-01-01

    Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Focusing on models and theories of physical activity, we systematically studied the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to the educational methods, almost all studies used a combination of methods; the most widely used integrative educational method was group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of the educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. Based on the findings, we suggest needs assessment when using models, consultation on epidemiology and methodology, addressing maintenance of physical activity, using other theories and models such as social marketing and social-cognitive theory, and using other educational methods such as empirical and complementary ones. PMID:25948454

  14. Methods of quantifying and visualising outbreaks of tuberculosis using genotypic information.

    PubMed

    Tanaka, Mark M; Francis, Andrew R

    2005-01-01

    Genotypic data from pathogenic isolates are often used to measure the extent of infectious disease transmission. Methods for doing so include phylogenetic reconstruction and the evaluation of clustering indices. The first aim of this paper is to critique current methods used to analyse genotypic data from molecular epidemiological studies of tuberculosis. In particular, by not accounting for the mutation rate of markers, errors arise in making inferences about outbreaks based on genotypic information. The second aim is to suggest a new way to represent genotypic data visually, involving graphs and trees. We also discuss some interpretations and modifications of existing indices. Although our focus is tuberculosis, the methods we discuss are generally applicable to any directly transmissible clonal pathogen.
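
    For context, two clustering indices commonly used in the tuberculosis genotyping literature this record critiques are the proportion of isolates in clusters and the "n-1" index, which assumes one source case per cluster. A minimal sketch, assuming these standard definitions (they are background, not the paper's own proposal):

```python
def clustering_indices(cluster_sizes, n_unique):
    """Two common clustering indices for genotyped isolates.

    cluster_sizes : sizes of clusters containing >= 2 isolates
    n_unique      : number of isolates with a unique genotype
    """
    n_clustered = sum(cluster_sizes)
    n = n_clustered + n_unique
    proportion_clustered = n_clustered / n
    # "n-1" index: subtract one presumed source case per cluster
    rti_n_minus_1 = (n_clustered - len(cluster_sizes)) / n
    return proportion_clustered, rti_n_minus_1
```

The paper's point is that such indices are biased when the marker mutation rate is ignored, since fast-mutating markers break apart true transmission clusters.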

  15. Bilateral Teleoperation Method Using an Autonomous Control Based on Information on Contact Environment

    NASA Astrophysics Data System (ADS)

    Taguchi, Keiichi; Ohnishi, Kouhei

    In procedures that involve remote control, such as remote surgery, it is necessary to operate a robot in a remote location in a sensitive environment; the treatment of internal organs is an example of such a procedure. In this paper, we propose a method for autonomous hazard avoidance control that is based on information on the contact environment. The proposed method involves the use of bilateral control. During safe operations, systems are controlled by bilateral control. During dangerous operations, a slave system is controlled autonomously so as to avoid dangerous operations. In order to determine the degree of operation risk, fuzzy set theory is applied to the force exerted on the environment. Further, variable compliance control based on the force exerted on the environment is utilized to avoid the risk. The effectiveness of the proposed method is confirmed by experimental results.
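
    The risk-assessment step this record describes, applying fuzzy set theory to the contact force and softening the slave's compliance accordingly, might look like the following sketch. The membership shape, force thresholds, and stiffness law are illustrative assumptions, not the paper's design.

```python
def danger_membership(force, safe=5.0, danger=10.0):
    """Fuzzy membership of the 'dangerous operation' set: 0 below `safe`,
    1 above `danger`, linear in between (thresholds are illustrative)."""
    if force <= safe:
        return 0.0
    if force >= danger:
        return 1.0
    return (force - safe) / (danger - safe)

def slave_stiffness(force, base_stiffness=1.0):
    """Variable compliance: soften the slave controller as risk grows,
    so dangerous contact forces are not tracked rigidly."""
    return base_stiffness * (1.0 - danger_membership(force))
```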

  16. Support of Wheelchairs Using Pheromone Information with Two Types of Communication Methods

    NASA Astrophysics Data System (ADS)

    Yamamoto, Koji; Nitta, Katsumi

    In this paper, we propose a communication framework that combines two types of communication among wheelchairs and mobile devices. Because their range of activity is restricted, wheelchair users tend to shut themselves up in their houses. We developed a navigational wheelchair that carries a system displaying information on a map through the WWW. However, this wheelchair is expensive because it needs a solid PC, a precise GPS, a battery, and so on. We introduce mobile devices and use this framework to provide information to wheelchair users and to encourage them to go out. When a user encounters other users, they exchange the messages they hold via short-distance wireless communication. Once a message is delivered to a navigational wheelchair, the wheelchair uploads the message to the system. We use two types of pheromone information, which represent trends in users' movement and the existence of a crowd of users. First, when users gather, a ``crowd of people pheromone'' is emitted virtually. Users do not deposit these pheromones in the environment but carry them. If the density exceeds a threshold, messages expressing that ``people gathered'' are generated automatically. The other pheromone is the ``movement trend pheromone'', which is used to improve the probability of successful transmissions. From the experimental results, we concluded that our method can deliver the information that wheelchair users gathered to other wheelchairs.
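
    The carried-pheromone mechanism described in this record, deposit on encounters, evaporation over time, and a message fired when a density threshold is crossed, can be sketched as below. The evaporation rate, deposit amount, and threshold are illustrative parameters, not values from the paper.

```python
class CrowdPheromone:
    """Carried 'crowd of people' pheromone: reinforced on encounters,
    evaporating each time step; a message is generated once the density
    carried by a device crosses the threshold."""

    def __init__(self, evaporation=0.9, deposit=1.0, threshold=2.5):
        self.level = 0.0
        self.evaporation = evaporation
        self.deposit = deposit
        self.threshold = threshold

    def step(self, encounters):
        """Advance one time step with `encounters` nearby users;
        return True when a 'people gathered' message should be emitted."""
        self.level = self.level * self.evaporation + encounters * self.deposit
        return self.level >= self.threshold
```

A sustained run of encounters pushes the density over the threshold, while isolated encounters evaporate away without triggering a message.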

  17. Spatial modelling of periglacial phenomena in Deception Island (Maritime Antarctic): logistic regression and informative value method.

    NASA Astrophysics Data System (ADS)

    Melo, Raquel; Vieira, Gonçalo; Caselli, Alberto; Ramos, Miguel

    2010-05-01

    Field surveying during the austral summer of 2007/08 and the analysis of a QuickBird satellite image resulted in the production of a detailed geomorphological map of the Irizar and Crater Lake area in Deception Island (South Shetlands, Maritime Antarctic - 1:10 000) and allowed the analysis and spatial modelling of the geomorphological phenomena. The present study focuses on the analysis of the spatial distribution and characteristics of hummocky terrains, lag surfaces and nivation hollows, complemented by GIS spatial modelling intended to identify relevant controlling geographical factors. Models of the susceptibility of occurrence of these phenomena were created using two statistical methods: logistic regression, as a multivariate method, and the informative value, as a bivariate method. Success and prediction rate curves were used for model validation. The Area Under the Curve (AUC) was used to quantify the performance and predictive power of the models and to allow comparison between the two methods. Regarding the logistic regression method, the AUC showed a success rate of 71% for the lag surfaces, 81% for the hummocky terrains and 78% for the nivation hollows; the prediction rates were 72%, 68% and 71%, respectively. Concerning the informative value method, the success rates were 69% for the lag surfaces, 84% for the hummocky terrains and 78% for the nivation hollows, with corresponding prediction rates of 71%, 66% and 69%. The results were of very good quality and demonstrate the potential of the models to predict the influence of the independent variables on the occurrence of the geomorphological phenomena, as well as the reliability of the data. Key-words: present-day geomorphological dynamics, detailed geomorphological mapping, GIS, spatial modelling, Deception Island, Antarctic.
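
    Two building blocks of this record's workflow can be sketched compactly: the bivariate informative value of a predictor class (log of the class's event density relative to the study-area density, a standard formulation) and the rank-based AUC used to compare the two models. The example numbers are illustrative.

```python
import math

def informative_value(events_in_class, cells_in_class, events_total, cells_total):
    """Bivariate informative value of one class of a predictor layer."""
    return math.log((events_in_class / cells_in_class) /
                    (events_total / cells_total))

def auc(pos_scores, neg_scores):
    """Rank-based AUC: probability that a random presence cell outscores
    a random absence cell (ties count one half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

A positive informative value marks a class where the phenomenon is over-represented; summing the values of the classes a cell falls in gives its susceptibility score, whose AUC against observed occurrences yields the success/prediction rates reported above.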

  18. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Studying the results obtained in solving BPs of different levels of complexity has allowed us to reveal the strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method for solving BP in many cases. The efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters 21578) typically used for label categorization.
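    Information gain, the evaluation measure used above, is the reduction in label entropy obtained by conditioning on a discovered factor. A minimal sketch with toy data (the binary feature and labels are illustrative, not the bars-problem setup):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Reduction in label entropy from splitting on a discrete feature."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# a factor that perfectly explains the labels recovers the full 1 bit
print(information_gain([0, 0, 1, 1], ["a", "a", "b", "b"]))  # 1.0
```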

  19. Method for detecting core malware sites related to biomedical information systems.

    PubMed

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites, and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. On average, the proactive elimination of core malicious websites improves zero-day attack detection by more than 20%. PMID:25821511
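    The step of combining several centrality measures into a single quantified risk vector can be sketched as follows. The adjacency matrix, the choice of centralities (degree and eigenvector), and the equal weighting are illustrative assumptions, not the paper's exact measures:

```python
import numpy as np

# hypothetical link graph among suspect sites (symmetric adjacency matrix)
A = np.array([
    [0, 1, 1, 1, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

degree = A.sum(axis=1)

# eigenvector centrality via power iteration
x = np.ones(len(A))
for _ in range(100):
    x = A @ x
    x /= np.linalg.norm(x)

def minmax(v):
    """Scale a centrality measure to [0, 1] so measures are comparable."""
    return (v - v.min()) / (v.max() - v.min())

# fold the measures into one risk index per site (equal weights assumed)
risk = (minmax(degree) + minmax(x)) / 2
print(np.argsort(risk)[::-1])  # site indices ordered by risk, core hub first
```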

  20. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We examined the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four aspects are analyzed: the social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed to examine the possibility of co-design based on these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for carrying it out.

  1. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
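    The map-then-merge flow the patent describes — per-process local term statistics contributed to a global set, from which a major term set is drawn — can be sketched in a serial stand-in (real parallel processes replaced by a loop; the document-frequency threshold defining "major" terms is an assumption):

```python
from collections import Counter

documents = [
    "information analysis on shared systems",
    "distributed computing systems analysis",
    "term statistics for document analysis",
]

def local_term_stats(docs):
    """Each 'process' computes term counts for its distinct set of documents."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

partitions = [documents[:2], documents[2:]]
local_sets = [local_term_stats(p) for p in partitions]

# contribute the local sets to a global set of term statistics
global_stats = Counter()
for local in local_sets:
    global_stats.update(local)

# a "major term set": terms above an assumed frequency threshold
major_terms = {t for t, c in global_stats.items() if c >= 2}
print(sorted(major_terms))  # ['analysis', 'systems']
```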

  2. Method for detecting core malware sites related to biomedical information systems.

    PubMed

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites, and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. On average, the proactive elimination of core malicious websites improves zero-day attack detection by more than 20%.

  3. Method for Detecting Core Malware Sites Related to Biomedical Information Systems

    PubMed Central

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites, and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. On average, the proactive elimination of core malicious websites improves zero-day attack detection by more than 20%. PMID:25821511

  4. Comparison of Information Dissemination Methods in Inle Lake: A Lesson for Reconsidering Framework for Environmental Education Strategies

    ERIC Educational Resources Information Center

    Oo, Htun Naing; Sutheerawatthana, Pitch; Minato, Takayuki

    2010-01-01

    This article analyzes the practice of information dissemination regarding pesticide usage in floating gardening in a rural area. The analysis reveals reasons why the current information dissemination methods employed by relevant stakeholders do not work. It then puts forward a proposition that information sharing within organizations of and among…

  5. Geographic Information System Software to Remodel Population Data Using Dasymetric Mapping Methods

    USGS Publications Warehouse

    Sleeter, Rachel; Gould, Michael

    2007-01-01

    The U.S. Census Bureau provides decadal demographic data collected at the household level and aggregated to larger enumeration units for anonymity purposes. Although this system is appropriate for the dissemination of large amounts of national demographic data, the boundaries of the enumeration units often do not reflect the distribution of the underlying statistical phenomena. Conventional mapping methods, such as choropleth mapping, are primarily employed due to their ease of use. However, the analytical drawbacks of choropleth methods are well known, ranging from (1) the artificial transition of population at the boundaries of mapping units to (2) the assumption that the phenomenon is evenly distributed across the enumeration unit (when in actuality there can be significant variation). Many methods to map population distribution have been practiced in the geographic information systems (GIS) and remote sensing fields. Many cartographers prefer dasymetric mapping to map population because of its ability to more accurately distribute data over geographic space. Similar to choropleth maps, a dasymetric map utilizes standardized data (for example, census data). However, rather than using arbitrary enumeration zones to symbolize population distribution, a dasymetric approach introduces ancillary information to redistribute the standardized data into zones relative to land use and land cover (LULC), taking into consideration actual changing densities within the boundaries of the enumeration unit. Thus, new zones are created that correlate to the function of the map, capturing spatial variations in population density. The transfer of data from census enumeration units to ancillary-driven homogeneous zones is performed by a process called areal interpolation.
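    The areal interpolation step can be sketched numerically. The tract total, zone areas, and per-class density weights below are hypothetical; the point is that each land-cover zone receives a share of the tract population proportional to its area times its density weight:

```python
# hypothetical census tract of 10,000 people overlaid with land-cover zones;
# the per-class density weights are illustrative assumptions
zones = [
    {"name": "urban",    "area_km2": 2.0,  "weight": 0.70},
    {"name": "suburban", "area_km2": 5.0,  "weight": 0.28},
    {"name": "forest",   "area_km2": 13.0, "weight": 0.02},
]
tract_population = 10_000

# areal interpolation: redistribute the tract total by weighted-area share
total = sum(z["area_km2"] * z["weight"] for z in zones)
for z in zones:
    z["population"] = tract_population * z["area_km2"] * z["weight"] / total
    print(z["name"], round(z["population"]))
```

    Note that the redistribution is pycnophylactic: the zone populations still sum to the original tract total, so the standardized census count is preserved.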

  6. A Novel Method of Multi-Information Acquisition for Electromagnetic Flow Meters.

    PubMed

    Cui, Wenhua; Li, Bin; Chen, Jie; Li, Xinwei

    2015-01-01

    In this paper, a novel method is proposed for multi-information acquisition from the electromagnetic flow meter, using magnetic excitation to measure the fluid velocity and electrochemical impedance spectroscopy (EIS) for both the fluid quality and the contamination level of the transducer. The impedance spectra of the transducer are measured with an additional electrical stimulus in series with the electrode measurement loop. The series connection mode, instead of the parallel one, improves the signal-to-noise ratio (SNR) of the fluid velocity measurement and offers a wide range of impedance measurements by using a sample capacitance. In addition, a multi-frequency synchronous excitation source is synthesized based on the method of dual-base power sequences for fast EIS measurement. The conductivity measurements in the range of 1.7 μS/cm to 2 mS/cm showed relatively high accuracy, with a measurement error of 5%, and electrode adhesion detection on electrodes both with and without coating showed that the adhesion can be determined qualitatively, which validated the feasibility of the multi-information acquisition method for the electromagnetic flow meter (EMFM). PMID:26712762

  7. A Novel Method of Multi-Information Acquisition for Electromagnetic Flow Meters

    PubMed Central

    Cui, Wenhua; Li, Bin; Chen, Jie; Li, Xinwei

    2015-01-01

    In this paper, a novel method is proposed for multi-information acquisition from the electromagnetic flow meter, using magnetic excitation to measure the fluid velocity and electrochemical impedance spectroscopy (EIS) for both the fluid quality and the contamination level of the transducer. The impedance spectra of the transducer are measured with an additional electrical stimulus in series with the electrode measurement loop. The series connection mode, instead of the parallel one, improves the signal-to-noise ratio (SNR) of the fluid velocity measurement and offers a wide range of impedance measurements by using a sample capacitance. In addition, a multi-frequency synchronous excitation source is synthesized based on the method of dual-base power sequences for fast EIS measurement. The conductivity measurements in the range of 1.7 μS/cm to 2 mS/cm showed relatively high accuracy, with a measurement error of 5%, and electrode adhesion detection on electrodes both with and without coating showed that the adhesion can be determined qualitatively, which validated the feasibility of the multi-information acquisition method for the electromagnetic flow meter (EMFM). PMID:26712762

  8. The method of earthquake landslide information extraction with high-resolution remote sensing

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Chen, Peng; Liu, Yaolin; Wang, Jing

    2014-05-01

    As a kind of secondary geological disaster caused by strong earthquakes, the earthquake-induced landslide has drawn much attention in the world due to its severe hazard. High-resolution remote sensing, as a new technology for investigation and monitoring, has been widely applied in landslide susceptibility and hazard mapping. The Ms 8.0 Wenchuan earthquake, which occurred on 12 May 2008, caused many buildings to collapse and left half a million people injured. Meanwhile, damage caused by earthquake-induced landslides, collapses and debris flows became the major part of the total losses. By analyzing the properties of the Zipingpu landslide that occurred in the Wenchuan earthquake, the present study advances a quick and effective way to extract landslides based on NDVI and slope information, and the results were validated with pixel-oriented and object-oriented methods. The main advantage of the idea lies in the fact that it doesn't need much professional knowledge or data such as crustal movement, geological structure, fractured zones, etc., so researchers can provide landslide monitoring information for earthquake relief as soon as possible. In the pixel-oriented way, the NDVI-differential image as well as the slope image was analyzed and segmented to extract the landslide information. In the object-oriented method, a multi-scale segmentation algorithm was applied to build up a three-layer hierarchy. The spectral, textural, shape, location and contextual information of individual object classes, GLCM (Grey Level Co-occurrence Matrix) homogeneity, shape index, etc. were extracted and used to establish the fuzzy decision rule system of each layer for earthquake landslide extraction. Comparison of the results generated from the two methods showed that the object-oriented method could successfully avoid the phenomenon of NDVI-differential bright noise caused by the spectral diversity of high-resolution remote sensing data and achieved a better result with an overall
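    The pixel-oriented branch (NDVI differencing combined with a slope mask) can be sketched as follows. The reflectance bands, the synthetic "scar", and both thresholds (0.3 NDVI drop, 15° slope) are illustrative assumptions, not the study's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (50, 50)
# hypothetical pre-event red/NIR reflectance bands and a slope raster (degrees)
nir_pre = rng.uniform(0.4, 0.6, shape)
red_pre = rng.uniform(0.05, 0.15, shape)
slope = rng.uniform(0, 45, shape)

# post-event bands start identical, then a "landslide scar" strips vegetation
nir_post, red_post = nir_pre.copy(), red_pre.copy()
nir_post[:10, :10] = 0.1
red_post[:10, :10] = 0.3

def ndvi(nir, red):
    return (nir - red) / (nir + red)

# landslide candidates: strong NDVI drop on sufficiently steep slopes
d_ndvi = ndvi(nir_pre, red_pre) - ndvi(nir_post, red_post)
landslide = (d_ndvi > 0.3) & (slope > 15)
print(int(landslide.sum()), "candidate pixels")
```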

  9. The Theory-based Influence of Map Features on Risk Beliefs: Self-reports of What is Seen and Understood for Maps Depicting an Environmental Health Hazard

    PubMed Central

    Vatovec, Christine

    2013-01-01

    Theory-based research is needed to understand how maps of environmental health risk information influence risk beliefs and protective behavior. Using theoretical concepts from multiple fields of study including visual cognition, semiotics, health behavior, and learning and memory supports a comprehensive assessment of this influence. We report results from thirteen cognitive interviews that provide theory-based insights into how visual features influenced what participants saw and the meaning of what they saw as they viewed three formats of water test results for private wells (choropleth map, dot map, and a table). The unit of perception, color, proximity to hazards, geographic distribution, and visual salience had substantial influences on what participants saw and their resulting risk beliefs. These influences are explained by theoretical factors that shape what is seen, properties of features that shape cognition (pre-attentive, symbolic, visual salience), information processing (top-down and bottom-up), and the strength of concrete compared to abstract information. Personal relevance guided top-down attention to proximal and larger hazards that shaped stronger risk beliefs. Meaning was more local for small perceptual units and global for large units. Three aspects of color were important: pre-attentive “incremental risk” meaning of sequential shading, symbolic safety meaning of stoplight colors, and visual salience that drew attention. The lack of imagery, geographic information, and color diminished interest in table information. Numeracy and prior beliefs influenced comprehension for some participants. Results guided the creation of an integrated conceptual framework for application to future studies. Ethics should guide the selection of map features that support appropriate communication goals. PMID:22715919

  10. The theory-based influence of map features on risk beliefs: self-reports of what is seen and understood for maps depicting an environmental health hazard.

    PubMed

    Severtson, Dolores J; Vatovec, Christine

    2012-08-01

    Theory-based research is needed to understand how maps of environmental health risk information influence risk beliefs and protective behavior. Using theoretical concepts from multiple fields of study including visual cognition, semiotics, health behavior, and learning and memory supports a comprehensive assessment of this influence. The authors report results from 13 cognitive interviews that provide theory-based insights into how visual features influenced what participants saw and the meaning of what they saw as they viewed 3 formats of water test results for private wells (choropleth map, dot map, and a table). The unit of perception, color, proximity to hazards, geographic distribution, and visual salience had substantial influences on what participants saw and their resulting risk beliefs. These influences are explained by theoretical factors that shape what is seen, properties of features that shape cognition (preattentive, symbolic, visual salience), information processing (top-down and bottom-up), and the strength of concrete compared with abstract information. Personal relevance guided top-down attention to proximal and larger hazards that shaped stronger risk beliefs. Meaning was more local for small perceptual units and global for large units. Three aspects of color were important: preattentive "incremental risk" meaning of sequential shading, symbolic safety meaning of stoplight colors, and visual salience that drew attention. The lack of imagery, geographic information, and color diminished interest in table information. Numeracy and prior beliefs influenced comprehension for some participants. Results guided the creation of an integrated conceptual framework for application to future studies. Ethics should guide the selection of map features that support appropriate communication goals.

  11. Benchmarking Clinical Speech Recognition and Information Extraction: New Data, Methods, and Evaluations

    PubMed Central

    Zhou, Liyuan; Hanlen, Leif; Ferraro, Gabriela

    2015-01-01

    Background Over a tenth of preventable adverse events in health care are caused by failures in information flow. These failures are tangible in clinical handover; regardless of good verbal handover, from two-thirds to all of this information is lost after 3-5 shifts if notes are taken by hand, or not at all. Speech recognition and information extraction provide a way to fill out a handover form for clinical proofing and sign-off. Objective The objective of the study was to provide a recorded spoken handover, annotated verbatim transcriptions, and evaluations to support research in spoken and written natural language processing for filling out a clinical handover form. This dataset is based on synthetic patient profiles, thereby avoiding ethical and legal restrictions, while maintaining efficacy for research in speech-to-text conversion and information extraction, based on realistic clinical scenarios. We also introduce a Web app to demonstrate the system design and workflow. Methods We experiment with Dragon Medical 11.0 for speech recognition and CRF++ for information extraction. To compute features for information extraction, we also apply CoreNLP, MetaMap, and Ontoserver. Our evaluation uses cross-validation techniques to measure processing correctness. Results The data provided were a simulation of nursing handover, as recorded using a mobile device, built from simulated patient records and handover scripts, spoken by an Australian registered nurse. Speech recognition recognized 5276 of 7277 words in our 100 test documents correctly. We considered 50 mutually exclusive categories in information extraction and achieved the F1 (ie, the harmonic mean of Precision and Recall) of 0.86 in the category for irrelevant text and the macro-averaged F1 of 0.70 over the remaining 35 nonempty categories of the form in our 101 test documents. Conclusions The significance of this study hinges on opening our data, together with the related performance benchmarks and some
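    The macro-averaged F1 reported above is the unweighted mean of per-category F1 scores. A minimal sketch with hypothetical handover-form categories (the label names are invented for illustration):

```python
def f1_per_class(y_true, y_pred, label):
    """Harmonic mean of precision and recall for one category."""
    tp = sum(t == p == label for t, p in zip(y_true, y_pred))
    predicted = sum(p == label for p in y_pred)
    actual = sum(t == label for t in y_true)
    if tp == 0:
        return 0.0
    precision, recall = tp / predicted, tp / actual
    return 2 * precision * recall / (precision + recall)

def macro_f1(y_true, y_pred, labels):
    """Unweighted mean of per-category F1 scores."""
    return sum(f1_per_class(y_true, y_pred, l) for l in labels) / len(labels)

# hypothetical handover-form categories
y_true = ["med", "med", "allergy", "irrelevant"]
y_pred = ["med", "allergy", "allergy", "irrelevant"]
print(round(macro_f1(y_true, y_pred, ["med", "allergy", "irrelevant"]), 2))  # 0.78
```

    Macro-averaging weights every category equally, so rare form fields count as much as the dominant "irrelevant text" category — which is why the paper reports it separately from the per-category F1.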

  12. A hybrid and adaptive segmentation method using color and texture information

    NASA Astrophysics Data System (ADS)

    Meurie, C.; Ruichek, Y.; Cohen, A.; Marais, J.

    2010-01-01

    This paper presents a new image segmentation method based on the combination of texture and color information. The method first computes the morphological color and texture gradients. The color gradient is analyzed taking into account different color spaces. The texture gradient is computed from the luminance component of the HSL color space, using a morphological filter and a granulometric, local-energy analysis. To overcome the limitations of a linear/barycentric combination, the two morphological gradients are then mixed using a gradient component fusion strategy (fusing the three components of the color gradient with the single component of the texture gradient) and an adaptive technique for choosing the weighting coefficients. The segmentation is finally performed by applying the watershed technique using different types of germ (seed) images. The segmentation method is evaluated in different object classification applications using the k-means algorithm, and the results are compared with those of other known segmentation methods. The evaluation shows that the proposed method gives better results, especially under hard image acquisition conditions.

  13. Methods of Hematoxylin and Eosin Image Information Acquisition and Optimization in Confocal Microscopy

    PubMed Central

    Yoon, Woong Bae; Kim, Hyunjin; Kim, Kwang Gi; Choi, Yongdoo; Chang, Hee Jin

    2016-01-01

    Objectives We produced hematoxylin and eosin (H&E) staining-like color images by using confocal laser scanning microscopy (CLSM), which can obtain the same or more information in comparison to conventional tissue staining. Methods We improved images by using several image converting techniques, including morphological methods, color space conversion methods, and segmentation methods. Results An image obtained after image processing showed coloring very similar to that in images produced by H&E staining, and it is advantageous to conduct analysis through fluorescent dye imaging and microscopy rather than analysis based on single microscopic imaging. Conclusions The colors used in CLSM are different from those seen in H&E staining, which is the method most widely used for pathologic diagnosis and is familiar to pathologists. Computer technology can facilitate the conversion of images by CLSM to be very similar to H&E staining images. We believe that the technique used in this study has great potential for application in clinical tissue analysis. PMID:27525165

  14. Split operator method for fluorescence diffuse optical tomography using anisotropic diffusion regularisation with prior anatomical information

    PubMed Central

    Correia, Teresa; Aguirre, Juan; Sisniega, Alejandro; Chamorro-Servent, Judit; Abascal, Juan; Vaquero, Juan J.; Desco, Manuel; Kolehmainen, Ville; Arridge, Simon

    2011-01-01

    Fluorescence diffuse optical tomography (fDOT) is an imaging modality that provides images of the fluorochrome distribution within the object of study. The image reconstruction problem is ill-posed and highly underdetermined and, therefore, regularisation techniques need to be used. In this paper we use a nonlinear anisotropic diffusion regularisation term that incorporates anatomical prior information. We introduce a split operator method that reduces the nonlinear inverse problem to two simpler problems, allowing fast and efficient solution of the fDOT problem. We tested our method using simulated, phantom and ex-vivo mouse data, and found that it provides reconstructions with better spatial localisation and size of fluorochrome inclusions than using the standard Tikhonov penalty term. PMID:22091447
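    The ill-posed, underdetermined reconstruction and the standard Tikhonov penalty the authors compare against can be sketched in a linear toy problem. The forward matrix, noise level, and "inclusion" below are synthetic assumptions, and the paper's anisotropic diffusion term and split operator are not implemented here — only the baseline Tikhonov solve:

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical ill-posed linear forward model: y = A x + noise
A = rng.normal(size=(20, 50))   # underdetermined: 20 measurements, 50 unknowns
x_true = np.zeros(50)
x_true[10:15] = 1.0             # a small "fluorochrome inclusion"
y = A @ x_true + rng.normal(0, 0.01, 20)

# Tikhonov-regularised solution: argmin ||Ax - y||^2 + alpha ||x||^2
alpha = 0.1
x_hat = np.linalg.solve(A.T @ A + alpha * np.eye(50), A.T @ y)
print("reconstruction error:", round(float(np.linalg.norm(x_hat - x_true)), 2))
```

    Without the penalty, `A.T @ A` is singular (rank at most 20), so the normal equations cannot be solved at all; the regulariser is what makes the inverse problem well-posed, at the cost of blurring sharp inclusions — the motivation for the edge-preserving anisotropic term in the paper.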

  15. Obtaining structural and functional information for GPCRs using the substituted-cysteine accessibility method (SCAM).

    PubMed

    Liapakis, George

    2014-01-01

    G-protein coupled receptors (GPCRs) are proteins of the plasma membrane, characterized by seven membrane-spanning segments (TMs). GPCRs play an important role in almost all of our physiological and pathophysiological conditions by interacting with a large variety of ligands and stimulating different G-proteins and signaling cascades. By playing a key role in the function of our body and being involved in the pathophysiology of many disorders, GPCRs are very important therapeutic targets. Determination of the structure and function of GPCRs could advance the design of novel receptor-specific drugs against various diseases. A powerful method to obtain structural and functional information for GPCRs is the substituted-cysteine accessibility method (SCAM). SCAM is used to systematically map the TM residues of GPCRs and determine their functional role. SCAM can also be used to determine differences in the structures of the TMs in different functional states of GPCRs.

  16. Information Accessibility of the Charcoal Burning Suicide Method in Mainland China

    PubMed Central

    Cheng, Qijin; Chang, Shu-Sen; Guo, Yingqi; Yip, Paul S. F.

    2015-01-01

    Background There has been a marked rise in suicide by charcoal burning (CB) in some East Asian countries but little is known about its incidence in mainland China. We examined media-reported CB suicides and the availability of online information about the method in mainland China. Methods We extracted and analyzed data for i) the characteristics and trends of fatal and nonfatal CB suicides reported by mainland Chinese newspapers (1998–2014); ii) trends and geographic variations in online searches using keywords relating to CB suicide (2011–2014); and iii) the content of Internet search results. Results 109 CB suicide attempts (89 fatal and 20 nonfatal) were reported by newspapers in 13 out of the 31 provinces or provincial-level-municipalities in mainland China. There were increasing trends in the incidence of reported CB suicides and in online searches using CB-related keywords. The province-level search intensities were correlated with CB suicide rates (Spearman’s correlation coefficient = 0.43 [95% confidence interval: 0.08–0.68]). Two-thirds of the web links retrieved using the search engine contained detailed information about the CB suicide method, of which 15% showed pro-suicide attitudes, and the majority (86%) did not encourage people to seek help. Limitations The incidence of CB suicide was based on newspaper reports and likely to be underestimated. Conclusions Mental health and suicide prevention professionals in mainland China should be alert to the increased use of this highly lethal suicide method. Better surveillance and intervention strategies need to be developed and implemented. PMID:26474297
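    The province-level association reported above is Spearman's rank correlation. A self-contained sketch with hypothetical search intensities and rates (not the study's data), computing rho as the Pearson correlation of ranks:

```python
def rank(values):
    """1-based ranks of a sequence (no ties assumed)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r + 1.0
    return ranks

def spearman(x, y):
    """Spearman's rho as the Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# hypothetical per-province search intensity vs. CB suicide rate
search_intensity = [12, 45, 30, 8, 50]
cb_rate = [0.2, 0.9, 0.5, 0.1, 0.7]
print(round(spearman(search_intensity, cb_rate), 2))  # 0.9
```

    Because it works on ranks, the coefficient captures any monotone association, which suits skewed quantities like search volumes and rates better than Pearson's r.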

  17. High-speed readout method of ID information on a large amount of electronic tags

    NASA Astrophysics Data System (ADS)

    Nagate, Wataru; Sasabe, Masahiro; Nakano, Hirotaka

    2006-10-01

    An electronic tag such as an RFID tag is expected to create new services that cannot be achieved by the traditional bar code. Specifically, in a distribution system, a simultaneous readout method for a large number of electronic tags embedded in products is required to reduce costs and time. In this paper, we propose a novel method, called Response Probability Control (RPC), to accomplish this requirement. In RPC, a reader first sends an ID request to the electronic tags in its access area. It succeeds in reading information on a tag only if no other tags respond. To improve the readout efficiency, the reader appropriately controls the response probability in accordance with the number of tags. However, this approach cannot entirely avoid a collision of multiple responses, and when a collision occurs, ID information is lost. To reduce the amount of lost data, we divide the ID registration process into two steps. The reader first gathers the former part of the original ID, called the temporal ID, according to the above method. After obtaining the temporal ID, it sequentially collects the latter part of the ID, called the remaining ID, based on the temporal ID. Note that we determine the number of bits of a temporal ID in accordance with the number of tags in the access area so that each tag can be distinguished. Through simulation experiments, we evaluate RPC in terms of readout efficiency. Simulation results show that RPC can achieve a readout efficiency 1.17 times higher than the traditional method when there are a thousand electronic tags whose IDs are 128 bits.
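    The core idea — adapting the response probability to the number of remaining tags so that single-responder rounds stay frequent — can be simulated. The collision model and parameters below are simplifying assumptions, and the two-step temporal/remaining-ID scheme is omitted:

```python
import random

def read_all_tags(n_tags, fixed_p=None, max_rounds=20000, seed=0):
    """Count rounds until every tag is read; a round succeeds only when
    exactly one unread tag responds (otherwise: collision or silence)."""
    rng = random.Random(seed)
    unread, rounds = n_tags, 0
    while unread and rounds < max_rounds:
        rounds += 1
        # RPC-style control: respond with probability 1/(remaining tags)
        p = fixed_p if fixed_p is not None else 1 / unread
        if sum(rng.random() < p for _ in range(unread)) == 1:
            unread -= 1
    return rounds

# adapting p to the remaining tag count reads 100 tags in far fewer rounds
# than a poorly chosen fixed probability
print(read_all_tags(100) < read_all_tags(100, fixed_p=0.2))  # True
```

    With p = 1/n, the chance that exactly one of n tags responds is about 1/e per round, independent of n; a fixed p collapses once n·p moves away from 1.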

  18. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-01

    The feasibility of using the cucurbituril host molecule as a probable actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed and, based on the binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structure, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in hexavalent and pentavalent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of actinyls, whereas fluorination decreases the binding affinities as compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation state of the three actinyls. PMID:22471316

  20. Theory-based approaches to understanding public emergency preparedness: implications for effective health and risk communication.

    PubMed

    Paek, Hye-Jin; Hilyard, Karen; Freimuth, Vicki; Barge, J Kevin; Mindlin, Michele

    2010-06-01

    Recent natural and human-caused disasters have awakened public health officials to the importance of emergency preparedness. Guided by health behavior and media effects theories, the analysis of a statewide survey in Georgia reveals that self-efficacy, subjective norm, and emergency news exposure are positively associated with the respondents' possession of emergency items and their stages of emergency preparedness. Practical implications include less reliance on demographics as the sole predictor of emergency preparedness and more comprehensive measures of preparedness, including both a person's cognitive stage of preparedness and checklists of emergency items on hand. We highlight the utility of theory-based approaches for understanding and predicting public emergency preparedness as a way to enable more effective health and risk communication. PMID:20574880

  1. Prognostic value of graph theory-based tissue architecture analysis in carcinomas of the tongue.

    PubMed

    Sudbø, J; Bankfalvi, A; Bryne, M; Marcelpoil, R; Boysen, M; Piffko, J; Hemmer, J; Kraft, K; Reith, A

    2000-12-01

    Several studies on oral squamous cell carcinomas (OSCC) suggest that the clinical value of traditional histologic grading is limited both by poor reproducibility and by low prognostic impact. However, the prognostic potential of a strictly quantitative and highly reproducible assessment of the tissue architecture in OSCC has not been evaluated. Using image analysis, we retrospectively investigated, in 193 cases of T1-2 (Stage I-II) OSCC, the prognostic impact of two graph theory-derived structural features: the average Delaunay Edge Length (DEL_av) and the average homogeneity of the Ulam Tree (ELH_av). Both structural features were derived from subgraphs of the Voronoi Diagram. The geometric centers of the cell nuclei were computed, generating a two-dimensional swarm of point-like seeds from which graphs could be constructed. The impact on survival of the computed values of ELH_av and DEL_av was estimated by the method of Kaplan and Meier, with relapse-free survival and overall survival as end-points. The prognostic values of DEL_av and ELH_av as computed for the invasive front, the superficial part of the carcinoma, the total carcinoma, and the normal-appearing oral mucosa were compared. For DEL_av, significant prognostic information was found in the invasive front (p < 0.001). No significant prognostic information was found in the superficial part of the carcinoma (p = 0.34), in the carcinoma as a whole (p = 0.35), or in the normal-appearing mucosa (p = 0.27). For ELH_av, significant prognostic information was found in the invasive front (p = 0.01) and, surprisingly, in putatively normal mucosa (p = 0.03). No significant prognostic information was found in the superficial parts of the carcinoma (p = 0.34) or in the total carcinoma (p = 0.11). In conclusion, strictly quantitative assessment of tissue architecture in the invasive front of OSCC yields strong prognostic information. PMID:11140700
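The graph-theoretic features above can be reproduced from nuclear center coordinates alone. Below is a minimal sketch of the DEL_av computation using SciPy's Delaunay triangulation; the function name and the toy point set are illustrative, not from the paper.

```python
import numpy as np
from scipy.spatial import Delaunay

def average_delaunay_edge_length(points):
    """DEL_av: mean length of the unique edges in the Delaunay
    triangulation of a 2-D swarm of points (e.g. nuclear centers)."""
    tri = Delaunay(points)
    edges = set()
    for simplex in tri.simplices:
        # each triangle contributes three edges; store sorted index
        # pairs so edges shared by two triangles are counted once
        for i in range(3):
            a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
            edges.add((a, b))
    lengths = [np.linalg.norm(points[a] - points[b]) for a, b in edges]
    return float(np.mean(lengths))

# toy swarm: unit-square corners plus the center point
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]])
print(round(average_delaunay_edge_length(pts), 3))  # 0.854
```

For the toy swarm the triangulation is a fan of four triangles around the center, giving four edges of length 1 and four of length √0.5, hence the mean of 0.854.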

  3. An inversion method for retrieving soil moisture information from satellite altimetry observations

    NASA Astrophysics Data System (ADS)

    Uebbing, Bernd; Forootan, Ehsan; Kusche, Jürgen; Braakmann-Folgmann, Anne

    2016-04-01

    Soil moisture represents an important component of the terrestrial water cycle that controls evapotranspiration and vegetation growth. Consequently, knowledge of soil moisture variability is essential to understand the interactions between land and atmosphere. Yet, terrestrial measurements are sparse and their information content is limited due to the large spatial variability of soil moisture. Therefore, over the last two decades, several active and passive microwave satellite missions such as ERS/SCAT, AMSR, SMOS or SMAP have been providing backscatter information that can be used to estimate surface conditions, including soil moisture, which is proportional to the dielectric constant of the upper (few cm) soil layers. Another source of soil moisture information is satellite radar altimetry, originally designed to measure sea surface height over the oceans. Measurements of Jason-1/2 (Ku- and C-band) or Envisat (Ku- and S-band) nadir radar backscatter provide high-resolution along-track information (~300 m along-track resolution) on backscatter every ~10 days (Jason-1/2) or ~35 days (Envisat). Recent studies found good correlation between backscatter and soil moisture in upper layers, especially in arid and semi-arid regions, indicating the potential of satellite altimetry both to reconstruct and to monitor soil moisture variability. However, measuring soil moisture using altimetry has some drawbacks, which include: (1) the noisy behavior of the altimetry-derived backscatter (due to, e.g., the existence of surface water in the radar footprint), (2) the strong assumptions needed to convert altimetry backscatter to soil moisture storage changes, and (3) the need for interpolating between the tracks. In this study, we suggest a new inversion framework that allows soil moisture information to be retrieved from along-track Jason-2 and Envisat satellite altimetry data, and we test this scheme over the Australian arid and semi-arid regions. Our method consists of: (i

  4. Adaptive broadcasting method using neighbor type information in wireless sensor networks.

    PubMed

    Jeong, Hyocheol; Kim, Jeonghyun; Yoo, Younghwan

    2011-01-01

    Flooding is the simplest and most effective way to disseminate a packet to all nodes in a wireless sensor network (WSN). However, basic flooding makes all nodes transmit the packet at least once, resulting in the broadcast storm problem in the worst case, and in turn network resources are severely wasted. Power is a particularly valuable resource in WSNs, as nodes are battery powered, so the energy wasted by basic flooding shortens the network lifetime. To mitigate the broadcast storm problem, this paper proposes a dynamic probabilistic flooding scheme that utilizes neighbor information, namely the numbers of child and sibling nodes. In general, the more sibling nodes there are, the higher the probability that a broadcast packet will be sent by one of the sibling nodes, so the node itself need not retransmit it. Meanwhile, if a node has many child nodes, its retransmission probability should be high to achieve a high packet delivery ratio. Therefore, these two terms, the numbers of child and sibling nodes, are adopted in the proposed method to attain more reliable flooding. The proposed method also adopts a back-off delay scheme to avoid collisions between close neighbors. Simulation results show that the proposed method outperforms previous flooding methods with respect to the number of duplicate packets and the packet delivery ratio.
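The heuristic in the abstract can be sketched as a small probability function. This is an illustrative model only, assuming a simple ratio form; the paper's exact formula is not reproduced here, and all names are hypothetical.

```python
import random

def rebroadcast_probability(num_children, num_siblings,
                            p_min=0.1, p_max=1.0):
    """Illustrative dynamic flooding probability: grows with the number
    of child nodes (more coverage responsibility) and shrinks with the
    number of sibling nodes (more redundant coverage by neighbors)."""
    p = (1.0 + num_children) / (1.0 + num_children + num_siblings)
    return max(p_min, min(p_max, p))

def decide_rebroadcast(num_children, num_siblings, rng=random.random):
    """Draw once against the computed probability; in the paper a
    back-off delay precedes this decision to avoid collisions
    between close neighbors."""
    return rng() < rebroadcast_probability(num_children, num_siblings)
```

Under this model a leaf node with many siblings rarely retransmits (the probability is floored at p_min), while a node with many children almost always does.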

  5. Informative Bayesian Model Selection: a method for identifying interactions in genome-wide data.

    PubMed

    Aflakparast, Mehran; Masoudi-Nejad, Ali; Bozorgmehr, Joseph H; Visweswaran, Shyam

    2014-10-01

    In high-dimensional genome-wide association (GWA) data, a key challenge is to detect genomic variants that interact in a nonlinear fashion in their association with disease. Identifying such genomic interactions is important for elucidating the inheritance of complex phenotypes and diseases. In this paper, we introduce a new computational method called Informative Bayesian Model Selection (IBMS) that leverages the correlation among variants in GWA data due to linkage disequilibrium to identify interactions accurately and in a computationally efficient manner. IBMS combines several statistical techniques, including canonical correlation analysis, logistic regression analysis, and a Bayesian statistical measure for evaluating interactions. Compared to BOOST and BEAM, two widely used methods for detecting genomic interactions, IBMS had significantly higher power when evaluated on synthetic data. Furthermore, when applied to Alzheimer's disease GWA data, IBMS identified previously reported interactions. IBMS is a useful method for identifying interacting variants in GWA data, and software that implements IBMS is freely available online from http://lbb.ut.ac.ir/Download/LBBsoft/IBMS.

  6. Methods for assessing the quality of data in public health information systems: a critical review.

    PubMed

    Chen, Hong; Yu, Ping; Hailey, David; Wang, Ning

    2014-01-01

    The quality of data in public health information systems can be ensured by effective data quality assessment. To conduct effective data quality assessment, measurable data attributes have to be precisely defined, and reliable and valid measurement methods then have to be used to measure each attribute. We conducted a systematic review of data quality assessment methods for public health using major databases and well-known institutional websites. Thirty-five studies were eligible for inclusion, and a total of 49 attributes of data quality were identified from the literature. Completeness, accuracy, and timeliness were the three most frequently assessed attributes of data quality. Most studies directly examined data values, complemented by examination of either data users' perceptions or documentation quality. However, current data quality assessment methods have limitations: a lack of consensus on which attributes to measure; inconsistent definitions of the data quality attributes; a lack of mixed methods for assessing data quality; and inadequate attention to reliability and validity. Removing these limitations is an opportunity for further improvement. PMID:25087521

  7. A method of construction of information images of the acoustic signals of the human bronchopulmonary system

    NASA Astrophysics Data System (ADS)

    Bureev, A. Sh.; Zhdanov, D. S.; Zemlyakov, I. Yu.; Kiseleva, E. Yu.; Khokhlova, L. A.

    2015-11-01

    The present study focuses on the development of a method for identifying the respiratory sounds and noises of a human in the healthy state and in various pathological conditions. Existing approaches, based on simple frequency- and time-domain signal analysis, have insufficient specificity and efficiency and do not yield an unambiguous interpretation of the results of a clinical study. An algorithm is suggested for phase selection of respiratory cycles and analysis of the respiratory sounds recorded during bronchial examination of a patient; the algorithm is based on phase-timing analysis of bronchophonograms. The resulting phase-frequency algorithm reflects, with high resolution, the temporal position and individual structure of the recorded signals. This allows the proposed method to be used for forming information images (models) of diagnostically significant fragments. A weighting function, whose frequency parameters can be selectively modified, is used for this purpose. The shape of the weighting function is specific to each type of respiratory noise traditionally described by qualitative characteristics (wet or dry noise, crackling, etc.).

  9. A new method to obtain uniform distribution of ground control points based on regional statistical information

    NASA Astrophysics Data System (ADS)

    Ma, Chao; An, Wei; Deng, Xinpu

    2015-10-01

    Ground Control Points (GCPs) are an important source of fundamental data in geometric correction of remote sensing imagery. The quantity, accuracy, and distribution of GCPs are three factors that may affect the accuracy of geometric correction. It is generally required that the distribution of GCPs be uniform, so that they can fully control the accuracy over the mapped region. In this paper, we establish an objective standard for evaluating the uniformity of a GCP distribution based on regional statistical information (RSI) and use it to obtain an optimal distribution of GCPs; this sampling method is called RSIS for short in this work. The numbers of GCPs in different regions, obtained by equally partitioning the image into regions in several manners, are counted, forming a vector called the RSI vector in this work. The uniformity of a GCP distribution can then be evaluated by a scalar quantity derived from the RSI vector, and an optimal distribution of GCPs is obtained by searching for the distribution that minimizes this quantity. In this paper, simulated annealing is employed to search for the optimal GCP distribution. Experiments are carried out to test the proposed method, with simple random sampling and universal kriging model-based sampling as the compared sampling designs. The experiments indicate that the method is well suited as a new GCP sampling design for geometric correction of remotely sensed imagery.
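The search described above can be sketched in a few lines. Here the scalar quantity of the RSI vector is taken to be its variance (an assumption; the abstract does not spell out the measure), and all function names are illustrative.

```python
import math
import random

def rsi_vector(points, extent, n):
    """Counts of points falling in each cell of an n x n partition
    of the image extent (xmin, xmax, ymin, ymax)."""
    xmin, xmax, ymin, ymax = extent
    counts = [0] * (n * n)
    for x, y in points:
        i = min(n - 1, int((x - xmin) / (xmax - xmin) * n))
        j = min(n - 1, int((y - ymin) / (ymax - ymin) * n))
        counts[j * n + i] += 1
    return counts

def nonuniformity(points, extent, n=4):
    """Variance of the RSI vector; 0 means perfectly even cell counts."""
    c = rsi_vector(points, extent, n)
    mean = sum(c) / len(c)
    return sum((v - mean) ** 2 for v in c) / len(c)

def anneal_select(candidates, k, extent, steps=2000, t0=1.0, seed=1):
    """Simulated annealing: swap one selected GCP at a time, accept
    worse subsets with a probability that decays as the temperature
    cools, and keep the best subset ever visited."""
    rng = random.Random(seed)
    current = rng.sample(candidates, k)
    cost = nonuniformity(current, extent)
    best, best_cost = list(current), cost
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9
        trial = list(current)
        trial[rng.randrange(k)] = rng.choice(candidates)
        c = nonuniformity(trial, extent)
        if c < cost or rng.random() < math.exp((cost - c) / t):
            current, cost = trial, c
            if c < best_cost:
                best, best_cost = list(trial), c
    return best, best_cost

# one candidate per cell of a 4 x 4 grid is perfectly uniform
extent = (0.0, 1.0, 0.0, 1.0)
even = [((i + 0.5) / 4, (j + 0.5) / 4) for i in range(4) for j in range(4)]
print(nonuniformity(even, extent))  # 0.0
```

The swap move may occasionally pick a candidate already in the subset; a production version would exclude those, but the sketch keeps the move simple.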

  10. Evaluation of Statistical Rainfall Disaggregation Methods Using Rain-Gauge Information for West-Central Florida

    SciTech Connect

    Murch, Renee Rokicki; Zhang, Jing; Ross, Mark; Ganguly, Auroop R; Nachabe, Mahmood

    2008-01-01

    Rainfall disaggregation in time can be useful for the simulation of hydrologic systems and the prediction of floods and flash floods. Disaggregation of rainfall to timescales of less than 1 h can be especially useful for small urbanized watershed studies, for continuous hydrologic simulations, and when Hortonian or saturation-excess runoff dominates. However, the majority of rain gauges in any region record rainfall in daily time steps or, very often, hourly records have extensive missing data. Also, the convective nature of the rainfall can result in significant differences in the measured rainfall at nearby gauges. This study evaluates several statistical approaches for rainfall disaggregation, specifically from 1 h observations to 15 min records, using data from West-Central Florida, and proposes new methodologies that have the potential to outperform existing approaches. Four approaches are examined. The first is an existing direct scaling method that utilizes observed 15 min rainfall at secondary rain gauges to disaggregate observed 1 h rainfall at more numerous primary rain gauges. The second is an extension of an existing method for continuous rainfall disaggregation through statistical distributional assumptions. The third relies on artificial neural networks for the disaggregation process without sorting, and the fourth extends the neural network methods through statistical preprocessing via new sorting and desorting schemes. The applicability and performance of these methods were evaluated using information from a fairly dense rain gauge network in West-Central Florida. Of the four methods compared, the sorted neural networks and the direct scaling method predicted peak rainfall magnitudes significantly better than the remaining techniques. The study also suggests that desorting algorithms would be useful to randomly replace the artificial hyetograph within a rainfall period.
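The first (direct scaling) approach lends itself to a compact sketch: the hourly depth at the primary gauge is distributed according to the 15 min pattern observed at a nearby secondary gauge. The function below is an illustration under that assumption, not the authors' code.

```python
def direct_scale(hourly_total, secondary_15min):
    """Disaggregate one hour of rainfall at a primary gauge using the
    15 min depths observed at a secondary gauge as the pattern.
    Mass is conserved: the pieces always sum to hourly_total."""
    s = sum(secondary_15min)
    if s == 0:  # dry hour at the secondary gauge: spread uniformly
        return [hourly_total / len(secondary_15min)] * len(secondary_15min)
    return [hourly_total * v / s for v in secondary_15min]

print(direct_scale(10.0, [1.0, 2.0, 3.0, 4.0]))  # [1.0, 2.0, 3.0, 4.0]
```

The weak point, as the abstract notes, is convective rainfall: the secondary gauge's pattern may not match what actually fell at the primary gauge, which motivates the distributional and neural network alternatives.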

  11. A Systematic Evaluation of Different Methods for Calculating Adolescent Vaccination Levels Using Immunization Information System Data

    PubMed Central

    Gowda, Charitha; Dong, Shiming; Potter, Rachel C.; Dombkowski, Kevin J.; Stokley, Shannon

    2013-01-01

    Objective: Immunization information systems (IISs) are valuable surveillance tools; however, population relocation may introduce bias when determining immunization coverage. We explored alternative methods for estimating the vaccine-eligible population when calculating adolescent immunization levels using a statewide IIS. Methods: We performed a retrospective analysis of the Michigan Care Improvement Registry (MCIR) for all adolescents aged 11–18 years registered in the MCIR as of October 2010. We explored four methods for determining denominators: (1) including all adolescents with MCIR records, (2) excluding adolescents with out-of-state residence, (3) further excluding those without MCIR activity ≥10 years prior to the evaluation date, and (4) using a denominator based on U.S. Census data. We estimated state- and county-specific coverage levels for four adolescent vaccines. Results: We found a 20% difference in estimated vaccination coverage between the most inclusive and most restrictive denominator populations. Although there was some variability among the four methods in vaccination coverage at the state level (2%–11%), greater variation occurred at the county level (up to 21%). This variation was substantial enough to potentially impact public health assessments of immunization programs. Generally, vaccines with higher coverage levels had greater absolute variation, as did counties with smaller populations. Conclusion: At the county level, the four denominator calculation methods resulted in substantial differences in estimated adolescent immunization rates that were less apparent when aggregated at the state level. Further research is needed to ascertain the most appropriate method for estimating vaccine coverage levels using IIS data. PMID:24179260
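The sensitivity of coverage estimates to the denominator can be illustrated with hypothetical numbers; the figures below are invented for illustration and are not from the study.

```python
def coverage(vaccinated, denominator):
    """Vaccination coverage as a percentage of the chosen denominator."""
    return 100.0 * vaccinated / denominator

# hypothetical county with 8,000 vaccinated adolescents
denominators = {
    "all IIS records": 12000,   # method 1: every adolescent record
    "in-state only": 11000,     # method 2: out-of-state excluded
    "recently active": 9500,    # method 3: inactive records excluded
    "census estimate": 10000,   # method 4: census-based denominator
}
rates = {name: round(coverage(8000, n), 1)
         for name, n in denominators.items()}
print(rates)
```

Even with the same numerator, the choice of denominator moves the estimate by nearly 20 percentage points in this toy example, mirroring the kind of variation the study reports.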

  12. How much information on permeability can we expect from induced polarization methods? (Invited)

    NASA Astrophysics Data System (ADS)

    Binley, A. M.; Slater, L. D.

    2013-12-01

    Recognizing the significance of permeability heterogeneity for solute transport in groundwater, the determination of qualitative and quantitative information on permeability has been a major focus of hydrogeophysics for some time. This drive has been encouraged by the minimally invasive nature of most geophysical techniques and their ability to produce spatially dense datasets of geophysical properties. Whilst DC resistivity has matured into an extremely robust and flexible technique that is widely used for mapping lithologies, translation of resistivity, as a property, to permeability is extremely limited, principally because of its sensitivity to pore fluid state (e.g., salinity) and grain surface electrical conduction. Induced polarization (IP), in contrast, is sensitive to properties related to the grain surface and/or pore throat geometry, and thus it is intuitive to assume that permeability and the induced polarization response may be closely linked. Spectral IP (SIP) potentially adds further valuable information, given its measure of the distribution of polarization length scales. In fact, IP as a tool for hydrogeological studies has been recognized for over 50 years, although it is only over the past two decades that significant advances have been made in both methodology (e.g., instruments, data inversion) and hydrogeological interpretation. Attempts to link IP (including SIP) and permeability have been explored through laboratory, field, and model studies. Mechanistic models have been proposed, along with several empirical relationships. Despite these efforts, the ability to link permeability to IP measurements remains challenging. Formation-specific relationships have been demonstrated, and yet a universal link continues to be elusive. Here, we discuss the principal constraints, illustrated using laboratory and field datasets from a number of studies. We highlight specific challenges, including

  13. A Review of Data Quality Assessment Methods for Public Health Information Systems

    PubMed Central

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-01-01

    High-quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and the data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and on well-known institutional websites. We found that the data dimension was the most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattention to data use and the data collection process, inconsistency in the definition of attributes of data quality, failure to address data users’ concerns, and a lack of systematic procedures in data quality assessment. This review is limited by the coverage of the databases and the breadth of public health information systems considered. Further research could develop consistent data quality definitions and attributes, and more effort should be given to assessing the quality of data use and of the data collection process. PMID:24830450

  14. Systems and methods for supplemental weather information presentation on a display

    NASA Technical Reports Server (NTRS)

    Bunch, Brian (Inventor)

    2010-01-01

    An embodiment of the supplemental weather display system presents supplemental weather information on a display in a craft. An exemplary embodiment receives the supplemental weather information from a remote source, determines a location of the supplemental weather information relative to the craft, receives weather information from an on-board radar system, and integrates the supplemental weather information with the weather information received from the on-board radar system.

  15. Using three-phase theory-based formative research to explore healthy eating in Australian truck drivers.

    PubMed

    Vayro, Caitlin; Hamilton, Kyra

    2016-03-01

    In Australia, fruit and vegetable consumption is lower than recommended, while discretionary foods (i.e., foods high in fat, sugar, and salt) are eaten in excess. Long-haul truck drivers are a group at risk of unhealthy eating but have received limited attention in the health literature. We aimed to examine long-haul truck drivers' eating decisions in order to develop theory-based and empirically driven health messages to improve their food choices. Drawing on the Theory of Planned Behavior, three-phase formative research was conducted using self-report surveys. Phase 1 (N = 30, Mage = 39.53, SDage = 10.72) identified modal salient beliefs about fruit and vegetable (FV) intake and limiting discretionary choices (DC). Nine behavioral and seven normative beliefs were elicited for both FV and DC, while nine and five control beliefs were elicited for FV and DC, respectively. Phase 2 (N = 148, Mage = 44.23, SDage = 12.08) adopted a prospective design with one-week follow-up to examine the predictors of FV and DC intention and behavior. A variety of behavioral and control beliefs were predictive of FV and DC intention and behavior; normative beliefs were predictive of FV intention and behavior but of DC intention only. Phase 3 (N = 20, Mage = 46.9, SDage = 12.85) elicited reasons why each belief was held, and solutions to negative beliefs, that could be used as health messages. In total, 40 reasons/solutions were identified: 26 for FV and 14 for DC. In summary, we found that specific behavioral, normative, and control beliefs influenced FV and DC eating decisions. These results have implications for truck drivers' health and provide formative research to inform future interventions to improve the food choices of a unique group at risk of unhealthy eating behaviors. PMID:26710674

  16. A sample theory-based logic model to improve program development, implementation, and sustainability of Farm to School programs.

    PubMed

    Ratcliffe, Michelle M

    2012-08-01

    Farm to School programs hold promise to address childhood obesity. These programs may increase students’ access to healthier foods, increase students’ knowledge of and desire to eat these foods, and increase their consumption of them. Implementing Farm to School programs requires the involvement of multiple people, including nutrition services, educators, and food producers. Because these groups have not traditionally worked together and each has different goals, it is important to demonstrate how Farm to School programs that are designed to decrease childhood obesity may also address others’ objectives, such as academic achievement and economic development. A logic model is an effective tool to help articulate a shared vision for how Farm to School programs may work to accomplish multiple goals. Furthermore, there is evidence that programs based on theory are more likely to be effective at changing individuals’ behaviors. Logic models based on theory may help to explain how a program works, aid in efficient and sustained implementation, and support the development of a coherent evaluation plan. This article presents a sample theory-based logic model for Farm to School programs. The presented logic model is informed by the polytheoretical model for food and garden-based education in school settings (PMFGBE). The logic model has been applied to multiple settings, including Farm to School program development and evaluation in urban and rural school districts. This article also includes a brief discussion on the development of the PMFGBE, a detailed explanation of how Farm to School programs may enhance the curricular, physical, and social learning environments of schools, and suggestions for the applicability of the logic model for practitioners, researchers, and policy makers.

  18. 77 FR 25148 - Request for Information Regarding Scope, Methods, and Data Sources for Conducting Study of Pre...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-27

    ... PROTECTION Request for Information Regarding Scope, Methods, and Data Sources for Conducting Study of Pre... identify the appropriate scope of the Study, as well as appropriate methods and sources of data for... help identify the appropriate scope, methods, and sources of data for the Study required by...

  19. Municipal Solid Waste Management using Geographical Information System aided methods: a mini review.

    PubMed

    Khan, Debishree; Samadder, Sukha Ranjan

    2014-11-01

    Municipal Solid Waste Management (MSWM) is one of the major environmental challenges in developing countries. Many efforts have been made to reduce and recover wastes, but land disposal of solid wastes remains the most common practice, and finding an environmentally sound landfill site is a challenging task. This paper presents a mini review of various aspects of MSWM (suitable landfill site selection, route optimization, and public acceptance) using the Geographical Information System (GIS) coupled with other tools. The salient features of each tool integrated with GIS are discussed. The paper also addresses how GIS can help optimize routes for the collection of solid wastes from transfer stations to disposal sites, reducing the overall cost of solid waste management, and presents a detailed approach to studying public acceptance of a proposed landfill site. The study will help municipal authorities identify the most effective MSWM methods.

  20. Physician's Referral Letter Bibliographic Service: A New Method of Disseminating Medical Information *

    PubMed Central

    Lodico, Norma Jean

    1973-01-01

    At the time this paper was written a unique trial project for disseminating medical literature to physicians had been in operation for six months (October 1971-April 1972) at the Virginia Medical Information System. Doctors in the state who referred patients to the Medical College of Virginia received short lists of references relevant to the problems of their patients as described in referral letters sent them by MCV consultants. Doctors receiving such lists were offered free photocopies of the articles cited if they could not obtain them locally. Of some 700 reference lists sent out, VAMIS received 12½ percent direct responses, and 22 percent of respondents to a questionnaire reported obtaining articles elsewhere. Ninety percent of the respondents favored the service and approved of its method. Funding problems necessitated revision of the service in the summer of 1972. Presently the service is provided only on request of the referring physician. PMID:4800294

  1. Clinical simulation: A method for development and evaluation of clinical information systems.

    PubMed

    Jensen, Sanne; Kushniruk, Andre W; Nøhr, Christian

    2015-04-01

    Use of clinical simulation in the design and evaluation of eHealth systems and applications has increased during the last decade. This paper describes a methodological approach for using clinical simulations in the design and evaluation of clinical information systems. The method is based on experiences from more than 20 clinical simulation studies conducted at the ITX-lab in the Capital Region of Denmark during the last 5 years. A ten-step approach to conducting simulations is presented in this paper. To illustrate the approach, a clinical simulation study concerning implementation of Digital Clinical Practice Guidelines in a prototype planning and coordination module is presented. In the case study, potential benefits were assessed in a full-scale simulation test including 18 health care professionals. The results showed that health care professionals can benefit from such a module. Unintended consequences concerning terminology and changes in the division of responsibility amongst healthcare professionals were also identified, and questions were raised concerning future workflow across sector borders. Furthermore, unexpected new possible benefits concerning improved communication, the content of information in discharge letters and quality management emerged during the testing. In addition, new potential groups of users were identified. The case study is used to demonstrate the potential of using the clinical simulation approach described in the paper.

  2. Genetically informative research on adolescent substance use: methods, findings and challenges

    PubMed Central

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

    Objective To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method We present a selective review of genetically informative research strategies, their limitations and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents. Results Adoption, twin and extended family designs have established there is a strong heritable component to liability to nicotine, alcohol and illicit drug dependence in adults. However, shared environmental influences are relatively stronger in youth samples and at earlier stages of substance involvement (e.g., use). There is considerable overlap in the genetic influences associated with abuse/dependence across drug classes, while shared genetic influences also contribute to the commonly observed associations between substance use disorders and both externalizing and, to a lesser extent, internalizing psychopathology. Rapid technological advances have made the identification of specific gene variants that influence risks for substance use disorders feasible, and linkage and association studies (including genome-wide association studies) have identified promising candidate genes implicated in the development of substance use disorders. Conclusions Studies using genetically informative research designs, including those that examine aggregate genetic factors and those examining specific gene variants, individually and in interaction with environmental influences, offer promising avenues not only for delineating genetic effects on substance use disorders but also for understanding the unfolding of risk across development and the interaction between environmental and genetic factors in the etiology of these disorders. PMID:21093770
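
    The variance decomposition that twin designs rely on can be illustrated with Falconer's classic approximation; the twin correlations below are invented for illustration, not taken from the review:

```python
# Falconer's approximation: partition trait variance into additive genetic
# (a2), shared environmental (c2), and unique environmental (e2) components
# from monozygotic (MZ) and dizygotic (DZ) twin correlations.
# The correlations here are hypothetical, not the review's data.
r_mz, r_dz = 0.60, 0.45

a2 = 2 * (r_mz - r_dz)   # heritability: MZ twins share ~2x the DZ genetic overlap
c2 = 2 * r_dz - r_mz     # shared environment: what MZ/DZ similarity has in common
e2 = 1 - r_mz            # unique environment: dissimilarity even between MZ twins

print(a2, c2, e2)        # the three components sum to 1
```

With these illustrative values, genetic and shared-environmental influences are equal (0.30 each), which mirrors the abstract's point that shared environment is relatively stronger in youth samples.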

  3. The method of assessment of solar potential for selected area with use Geographical Information Systems

    NASA Astrophysics Data System (ADS)

    Pietras, M.; Netzel, P.

    2012-10-01

    This paper describes a method for analysing the spatial distribution of solar energy potential based on solar irradiation calculated with a GIS (Geographical Information System). The GRASS GIS software makes it possible to map the spatial distribution of solar radiation while taking into account important elements such as terrain, the atmosphere, pollutants, water and aerosols in the atmosphere, and clouds. The GRASS GIS module r.sun was used to generate the spatial distribution of solar radiation over Lower Silesia (the south-western part of Poland). In this work, the solar potential for producing hot water in individual households was analysed, based on the monthly sums of total solar radiation generated by the r.sun module. The spatial distribution of solar potential was used to classify the Lower Silesia region in terms of the operating efficiency of solar installations. This is very useful because it gives people information about the payback period of the funds invested in the purchase of solar collectors.

  4. Text Messaging as a Method for Health Ministry Leaders to Disseminate Cancer Information.

    PubMed

    Schoenberger, Yu-Mei M; Phillips, Janice M; Mohiuddin, M Omar

    2015-12-01

    Mobile phone-based interventions can play a significant role in decreasing health disparities by enhancing population and individual health. The purpose of this study was to explore health ministry leaders' (HMLs) and congregation members' communication technology usage and to assess the acceptability of mobile technology for delivery of cancer information. Six focus groups were conducted in two urban African-American churches with trained HMLs (n=7) and congregation members (n=37) to determine mobile phone technology usage and identify barriers and facilitators to a mobile phone intervention. All participants were African-American, the majority were female (80% of HMLs; 73% of congregation members), and the mean age was 54 (HMLs) and 41 (congregation members). All of the HMLs and 95% of congregation members indicated owning a mobile phone. All HMLs reported sending/receiving text messages, whereas among the congregation members, 85% sent and 91% received text messages. The facilitators of a text messaging system mentioned by participants included its value as an alternative form of communication, a quick method for disseminating information, and accessibility. The main barriers to using mobile technology reported by both groups included receiving multiple messages, difficulty texting, and cost. Ways to overcome barriers were explored with participants, and education was the most proposed solution. The findings from this study indicate that HMLs and congregation members are interested in receiving text messages to promote healthy lifestyles and cancer awareness. These findings represent the first step in the development of a mobile phone-based program designed to enhance the work of health ministry leaders.

  5. Informing HIV prevention efforts targeting Liberian youth: a study using the PLACE method in Liberia

    PubMed Central

    2013-01-01

    Background Preventing HIV infection among young people is a priority for the Liberian government. Data on the young people in Liberia are scarce but needed to guide HIV programming efforts. Methods We used the Priorities for Local AIDS Control Efforts (PLACE) method to gather information on risk behaviors that young people (ages 14 to 24) engage in or are exposed to that increase their vulnerability to HIV infection. Community informants identified 240 unique venues, of which 150 were visited and verified by research staff. 89 of the 150 venues comprised our sampling frame, and 571 females and 548 males were interviewed in 50 venues using a behavioral survey. Results Ninety-one percent of females and 86% of males reported being sexually active. 56% of females and 47% of males reported they initiated sexual activity before the age of 15. Among the sexually active females, 71% reported they had received money or a gift for sex, and 56% of males reported they had given money or goods for sex. 20% of females and 6% of males reported that their first sexual encounter was forced, and 15% of females and 6% of males reported they had been forced to have sex in the past year. Multiple partnerships were common among both sexes, with 81% of females and 76% of males reporting one or more sex partners in the past four weeks. Less than 1% reported experience with injecting drugs and only 1% of males reported having sex with men. While knowledge of HIV/AIDS was high, prevention behaviors including HIV testing and condom use were low. Conclusion Youth-focused HIV efforts in Liberia need to address transactional sex and multiple and concurrent partnerships. HIV prevention interventions should include efforts to meet the economic needs of youth. PMID:24107301

  6. Retrieval of Aerosol information from UV measurement by using optimal estimation method

    NASA Astrophysics Data System (ADS)

    KIM, M.; Kim, J.; Jeong, U.; Kim, W. V.; Kim, S. K.; Lee, S. D.; Moon, K. J.

    2014-12-01

    An algorithm to retrieve aerosol optical depth (AOD), single scattering albedo (SSA), and aerosol loading height is developed for the GEMS (Geostationary Environment Monitoring Spectrometer) measurement. GEMS is planned to be launched into geostationary orbit in 2018 and employs hyper-spectral imaging with 0.6 nm resolution to observe solar backscatter radiation in the UV and visible range. In the UV range, the low surface contribution to the backscattered radiation and the strong interaction between aerosol absorption and molecular scattering are advantageous for retrieving aerosol information such as AOD and SSA [Torres et al., 2007; Torres et al., 2013; Ahn et al., 2014]. However, the large contribution of atmospheric scattering increases the sensitivity of the backward radiance to the aerosol loading height. Thus, the assumption made about aerosol loading height becomes an important issue for obtaining accurate results. Accordingly, this study focused on the simultaneous retrieval of aerosol loading height together with AOD and SSA by utilizing the optimal estimation method. For the RTM simulation, the aerosol optical properties were analyzed from AERONET inversion data (level 2.0) at 46 AERONET sites over Asia. Also, a 2-channel inversion method is applied to estimate a priori values of the aerosol information used to solve the Levenberg-Marquardt equation. The GEMS aerosol algorithm is tested with an OMI level-1B dataset, a provisional dataset for GEMS measurement, and the result is compared with the OMI standard aerosol product and AERONET values. The retrieved AOD and SSA show a reasonable distribution compared with the OMI products and are well correlated with the values measured by AERONET. However, the retrieval uncertainty in aerosol loading height is relatively larger than that of the other retrieved quantities.
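
    The optimal estimation step above relies on a damped (Levenberg-Marquardt) iteration. As a generic sketch of that update rule only, applied to a toy exponential fit rather than the GEMS retrieval or its radiative transfer model:

```python
import numpy as np

# Levenberg-Marquardt sketch: fit y = a * exp(b * x) by damped Gauss-Newton
# updates p <- p + (J^T J + lam I)^-1 J^T r. Data and damping are invented.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-1.5 * x) + 0.01 * rng.normal(size=x.size)  # truth: a=2, b=-1.5

p = np.array([1.0, 0.0])                          # initial guess for (a, b)
lam = 1e-2                                        # fixed damping factor
for _ in range(100):
    a, b = p
    f = a * np.exp(b * x)
    r = y - f                                     # residual vector
    J = np.column_stack([np.exp(b * x),           # df/da
                         a * x * np.exp(b * x)])  # df/db
    p = p + np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
```

A production implementation would adapt `lam` between steps (growing it when a step fails, shrinking it when it succeeds); the fixed value here keeps the sketch short.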

  7. Delayed Retention of Information Learned to Criterion for Proficiency Modular Instruction in a College Reading Methods Course.

    ERIC Educational Resources Information Center

    Dishner, Ernest K.; And Others

    The purpose of this study was to examine the degree of forgetting of meaningful information learned to 90% criterion by thirty-nine college students in two introductory reading methods classes in the word recognition (exclusive of phonics) module of those classes. The amount of information gained was compared to forgetting to determine the percent…

  8. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case studies in…

  9. One of the Methods of Organizing the Information Storage Unit of a System of Data Retrieval and Processing (SPOD).

    ERIC Educational Resources Information Center

    Askinazi, R. B.; Papina, I. L.

    The paper deals with one method of organizing the storage unit of a descriptor IPS (information retrieval system) of the SPOD type, the information array of which constitutes the totality of uniform documents with ordered disposition of data within each of them. Three categories of data composing the retrieval form of document were defined…

  10. Practices and Methods for Actualization of the Scientific Information in Art Excursions (Excursions and Cultural Heritage in the Contemporary World)

    ERIC Educational Resources Information Center

    Portnova, Tatiana V.

    2016-01-01

    The paper deals with various practices and methods for actualization of the scientific information in art excursions. The modern society is characterized by commitment to information richness. The range of cultural and historical materials used as the basis for art excursions is really immense. However if to consider the number of excursions with…

  11. [Ultrasonic Doppler-cardiography as a method of study of cardiodynamics during flight based on patent and information analysis].

    PubMed

    Bednenko, V S; Kozlov, A N

    1983-01-01

    This paper reviews patent and information data on the methods of cardiac location using ultrasound dopplercardiography, as well as on the design and development of onboard equipment to be used for medical monitoring of aircraft and spacecraft crewmembers inflight. It is emphasized that dopplercardiography, being a very informative, noise-proof and relatively simple technique, holds high promise for operational medical monitoring.

  12. Generalized Cross Entropy Method for estimating joint distribution from incomplete information

    NASA Astrophysics Data System (ADS)

    Xu, Hai-Yan; Kuo, Shyh-Hao; Li, Guoqi; Legara, Erika Fille T.; Zhao, Daxuan; Monterola, Christopher P.

    2016-07-01

    Obtaining a full joint distribution from individual marginal distributions with incomplete information is a non-trivial task that continues to challenge researchers from various domains including economics, demography, and statistics. In this work, we develop a new methodology referred to as the "Generalized Cross Entropy Method" (GCEM) that is aimed at addressing this issue. The objective function is proposed to be a weighted sum of divergences between joint distributions and various references. We show that the solution of the GCEM is unique and globally optimal. Furthermore, we illustrate the applicability and validity of the method by utilizing it to recover the joint distribution of a household profile of a given administrative region. In particular, we estimate the joint distribution of household size, household dwelling type, and household home ownership in Singapore. Results show a high-accuracy estimation of the full joint distribution of the household profile under study. Finally, the impact of the constraints and weights on the estimation of the joint distribution is explored.
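
    A minimal sketch of the underlying marginal-fitting problem, using iterative proportional fitting as a simpler stand-in for the paper's GCEM (the marginals and bin counts below are invented):

```python
import numpy as np

def ipf(seed, row_marginal, col_marginal, iters=200):
    """Iterative proportional fitting: rescale a seed joint distribution
    until its row/column sums match the target marginals."""
    J = seed.astype(float).copy()
    for _ in range(iters):
        J *= (row_marginal / J.sum(axis=1))[:, None]  # fit the row marginal
        J *= (col_marginal / J.sum(axis=0))[None, :]  # fit the column marginal
    return J

# Hypothetical marginals, e.g. household size (3 bins) x dwelling type (2 bins)
rows = np.array([0.5, 0.3, 0.2])
cols = np.array([0.6, 0.4])
joint = ipf(np.ones((3, 2)) / 6, rows, cols)          # uniform seed joint
```

Starting from a uniform seed, IPF converges to the independent joint `rows[i] * cols[j]`; the GCEM differs in that it minimizes a weighted sum of divergences to several reference distributions rather than to a single seed.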

  13. Incorporating level set methods in Geographical Information Systems (GIS) for land-surface process modeling

    NASA Astrophysics Data System (ADS)

    Pullar, D.

    2005-08-01

    Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important for obtaining a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over a Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with a field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
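
    The front-tracking idea can be sketched on a raster grid in a few lines. This is an assumed illustration of the level set update phi_t + F|grad phi| = 0 with unit speed F, not MapScript itself:

```python
import numpy as np

# A front is the zero contour of phi; start with a circle of radius 10
# (phi is its signed distance) and propagate it outward at unit speed.
n, dx, dt = 64, 1.0, 0.5
y, x = np.mgrid[0:n, 0:n]
phi = np.hypot(x - n / 2, y - n / 2) - 10.0

for _ in range(20):                    # 20 steps of dt=0.5 -> radius grows ~10
    gy, gx = np.gradient(phi, dx)      # finite-difference gradient of phi
    phi -= dt * np.hypot(gx, gy)       # phi_t = -F |grad phi|, with F = 1

inside = int((phi < 0).sum())          # grid cells enclosed by the front
```

For landscape processes the constant speed would be replaced by a spatially varying field (e.g. flow velocity or dispersal rate), and a production scheme would use an upwind discretization with periodic reinitialization of phi to a signed distance.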

  14. A Rapid Monitoring and Evaluation Method of Schistosomiasis Based on Spatial Information Technology.

    PubMed

    Wang, Yong; Zhuang, Dafang

    2015-12-01

    Thanks to Spatial Information Technologies (SITs) such as Remote Sensing (RS) and the Geographical Information System (GIS) being quickly developed and updated, SITs are being used more widely in the public health field. The use of SITs to study the characteristics of the temporal and spatial distribution of Schistosoma japonicum and to assess the risk of infection provides methods for the control and prevention of schistosomiasis japonica, and has gradually become a hot topic in the field. The purpose of the present paper was to use RS and GIS technology to develop an efficient method of prediction and assessment of the risk of schistosomiasis japonica. We chose the Yueyang region, close to the east DongTing Lake (Hunan Province, China), as the study area, where a recent serious outbreak of schistosomiasis japonica took place. We monitored and evaluated the transmission risk of schistosomiasis japonica in the region using SITs. Water distribution data were extracted from RS images. The ground temperature, ground humidity and vegetation index were calculated based on RS images. Additionally, the density of oncomelania snails, which are the Schistosoma japonicum intermediate host, was calculated on the basis of RS data and field measurements. The spatial distribution of oncomelania snails was explored using SITs in order to estimate the area surrounding the residents with transmission risk of schistosomiasis japonica. Our results demonstrated: (1) the risk factors for the transmission of schistosomiasis japonica were closely related to the living environment of oncomelania snails. Key factors such as water distribution, ground temperature, ground humidity and vegetation index can be quickly obtained and calculated from RS images; (2) using GIS technology and an RS deduction technique along with statistical regression models, the density distribution model of oncomelania snails could be quickly built; (3) using SITs and analysis with overlaying population

  17. A generic model for data acquisition: Connectionist methods of information processing

    NASA Astrophysics Data System (ADS)

    Ehrlich, Jacques

    1993-06-01

    EDDAKS (Event Driven Data Acquisition Kernel System), a generic model for the quality control of products created in industrial production processes, is proposed. It is capable of acquiring information about discrete event systems by synchronizing to them via the events. EDDAKS consists of EdObjects, forming a hierarchy, which react to EdEvents and perform processing operations on messages. The hierarchy of EdObjects consists (from bottom up) of the Sensor, the Phase, the Extracter, the Dynamic Spreadsheet, and EDDAKS itself. The first three levels contribute to building the internal representation: a state vector characterizing a product in the course of production. The Dynamic Spreadsheet is a processing structure that can be parameterized, used to perform calculations on a set of internal representations in order to deliver the external representation to the user. A system intended for quality control of the products delivered by a concrete production plant was generated by EDDAKS and used to validate the approach. Processing methods using the multilayer perceptron model were also considered, and two contributions aimed at improving the performance of this network are proposed. One consists of implementing a conjugate gradient method, whose effectiveness depends on the determination of an optimum gradient step that is efficiently calculated by a linear search using a secant algorithm. The other is intended to reduce the connectivity of the network by adapting it to the problem to be solved: it consists of identifying links having little or no activity and destroying them. This activity is determined by evaluating the covariance between each of the inputs of a cell and its output. An experiment in which nonlinear prediction is applied to a civil engineering problem is described.
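
    The covariance-based pruning criterion described above can be sketched as follows; the unit, weights, and threshold are hypothetical stand-ins for the paper's perceptron:

```python
import numpy as np

# Rank each input-to-unit link by |cov(input_j, output)| and destroy links
# whose activity is low. Data and weights are invented for illustration:
# links 1 and 3 carry no signal, so their covariance with the output is noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))              # 5000 samples of 4 inputs
w = np.array([2.0, 0.0, 1.0, 0.0])          # true link strengths
out = np.tanh(X @ w)                        # one unit's output

activity = np.abs(np.array([np.cov(X[:, j], out)[0, 1] for j in range(4)]))
keep = activity > 0.1 * activity.max()      # prune low-activity links
```

The threshold (10% of the strongest link's activity) is an arbitrary choice for the sketch; the paper's criterion would be tuned to the network and data at hand.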

  18. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    PubMed Central

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses. Nowadays, people pay more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function is the maximization of the time-satisfaction degree. To overcome these drawbacks of incomplete information and uncertain time, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle only follows the optimal path from the emergency logistics center to the affected point, and solved using the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946
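
    The time-satisfaction objective can be sketched with a simple piecewise-linear function; the breakpoints and travel times below are invented for illustration and are not the paper's model:

```python
def satisfaction(t, t_best=2.0, t_worst=10.0):
    """Time-satisfaction degree: 1 if delivery arrives by t_best,
    0 if it arrives at or after t_worst, linear in between."""
    if t <= t_best:
        return 1.0
    if t >= t_worst:
        return 0.0
    return (t_worst - t) / (t_worst - t_best)

# Hypothetical estimated travel times (hours) for candidate centre-to-point
# paths; the paper would first filter these by grey-theory reliability.
paths = {"A": 3.5, "B": 6.0, "C": 9.0}
best = max(paths, key=lambda p: satisfaction(paths[p]))
```

Path "A" wins here because satisfaction decreases monotonically with travel time; the paper's full model additionally handles multiple centers and commodities.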

  20. Increasing Cervical Cancer Awareness and Screening in Jamaica: Effectiveness of a Theory-Based Educational Intervention.

    PubMed

    Coronado Interis, Evelyn; Anakwenze, Chidinma P; Aung, Maug; Jolly, Pauline E

    2015-12-22

    Despite declines in cervical cancer mortality in developed countries, cervical cancer incidence and mortality rates remain high in Jamaica due to low levels of screening. Effective interventions are needed to decrease barriers to preventive behaviors and increase adoption of behaviors and services to improve prospects of survival. We enrolled 225 women attending health facilities in an intervention consisting of a pre-test, educational presentation and post-test. The questionnaires assessed attitudes, knowledge, risk factors, and symptoms of cervical cancer among women. Changes in knowledge and intention to screen were assessed using paired t-tests and tests for correlated proportions. Participants were followed approximately six months post-intervention to determine cervical cancer screening rates. We found statistically significant increases from pre-test to post-test in the percentage of questions correctly answered and in participants' intention to screen for cervical cancer. The greatest improvements were observed in responses to questions on knowledge, symptoms and prevention, with some items increasing up to 62% from pre-test to post-test. Of the 123 women reached for follow-up, 50 (40.7%) screened for cervical cancer. This theory-based education intervention significantly increased knowledge of and intention to screen for cervical cancer, and may be replicated in similar settings to promote awareness and increase screening rates.
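
    The pre/post comparison described above boils down to a paired t-test on the same participants; a sketch with made-up knowledge scores (not the study's data):

```python
import math
import statistics as st

# Hypothetical pre/post knowledge scores for n = 10 paired participants.
pre  = [4, 5, 3, 6, 4, 5, 2, 4, 3, 5]
post = [7, 8, 5, 9, 6, 8, 5, 6, 5, 8]

diffs = [b - a for a, b in zip(pre, post)]      # within-person change
n = len(diffs)
t_stat = st.mean(diffs) / (st.stdev(diffs) / math.sqrt(n))

# Compare against the two-sided critical value for df = 9 at alpha = 0.05.
significant = t_stat > 2.262
```

Because each person serves as their own control, the test is run on the differences; the same logic applies to the study's test for correlated proportions on intention to screen.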

  1. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  2. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  3. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are among the most dangerous, aiming to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using the Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs.
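
    The record above describes the dendritic cell algorithm only at a high level. As a rough illustration of the danger-theory idea behind it, here is a minimal single-cell sketch in Python. The signal categories (PAMP, danger, safe), the weights, the migration threshold, and the traffic values are all invented for illustration and are not taken from the MDCA paper, which uses populations of cells, antigen sampling, and anomaly-coefficient aggregation.

```python
# Signal weights loosely modeled on published DCA variants; the exact
# values here are invented, not taken from the MDCA paper.
WEIGHTS = {
    "csm":    {"pamp": 2.0, "danger": 1.0, "safe": 2.0},
    "mature": {"pamp": 2.0, "danger": 1.0, "safe": -2.0},
}

def classify(signal_stream, migration_threshold=10.0):
    """Single dendritic cell: accumulate weighted signals until the
    costimulation (csm) value crosses the migration threshold, then
    report the context -- 'mature' flags an anomaly (e.g. flooding
    traffic), 'semi-mature' indicates normal behaviour."""
    csm = mature = 0.0
    for signals in signal_stream:
        csm += sum(WEIGHTS["csm"][k] * v for k, v in signals.items())
        mature += sum(WEIGHTS["mature"][k] * v for k, v in signals.items())
        if csm >= migration_threshold:
            break
    return "mature" if mature > 0 else "semi-mature"

# Hypothetical per-interval signals derived from a node's traffic
flooding = [{"pamp": 3.0, "danger": 2.0, "safe": 0.2}] * 5
normal   = [{"pamp": 0.1, "danger": 0.3, "safe": 2.5}] * 5
print(classify(flooding), classify(normal))  # mature semi-mature
```

    In a full DCA deployment, many such cells sample overlapping windows of antigen (here, node identifiers) and their verdicts are aggregated before an alarm is raised.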

  4. Evaluating clinical simulations for learning procedural skills: a theory-based approach.

    PubMed

    Kneebone, Roger

    2005-06-01

    Simulation-based learning is becoming widely established within medical education. It offers obvious benefits to novices learning invasive procedural skills, especially in a climate of decreasing clinical exposure. However, simulations are often accepted uncritically, with undue emphasis being placed on technological sophistication at the expense of theory-based design. The author proposes four key areas that underpin simulation-based learning, and summarizes the theoretical grounding for each. These are (1) gaining technical proficiency (psychomotor skills and learning theory, the importance of repeated practice and regular reinforcement), (2) the place of expert assistance (a Vygotskian interpretation of tutor support, where assistance is tailored to each learner's needs), (3) learning within a professional context (situated learning and contemporary apprenticeship theory), and (4) the affective component of learning (the effect of emotion on learning). The author then offers four criteria for critically evaluating new or existing simulations, based on the theoretical framework outlined above. These are: (1) Simulations should allow for sustained, deliberate practice within a safe environment, ensuring that recently-acquired skills are consolidated within a defined curriculum which assures regular reinforcement; (2) simulations should provide access to expert tutors when appropriate, ensuring that such support fades when no longer needed; (3) simulations should map onto real-life clinical experience, ensuring that learning supports the experience gained within communities of actual practice; and (4) simulation-based learning environments should provide a supportive, motivational, and learner-centered milieu which is conducive to learning.

  5. Increasing Cervical Cancer Awareness and Screening in Jamaica: Effectiveness of a Theory-Based Educational Intervention

    PubMed Central

    Coronado Interis, Evelyn; Anakwenze, Chidinma P.; Aung, Maug; Jolly, Pauline E.

    2015-01-01

    Despite declines in cervical cancer mortality in developed countries, cervical cancer incidence and mortality rates remain high in Jamaica due to low levels of screening. Effective interventions are needed to decrease barriers to preventive behaviors and increase adoption of behaviors and services to improve prospects of survival. We enrolled 225 women attending health facilities in an intervention consisting of a pre-test, educational presentation and post-test. The questionnaires assessed attitudes, knowledge, risk factors, and symptoms of cervical cancer among women. Changes in knowledge and intention to screen were assessed using paired t-tests and tests for correlated proportions. Participants were followed approximately six months post-intervention to determine cervical cancer screening rates. We found statistically significant increases from pre-test to post-test in the percentage of questions correctly answered and in participants’ intention to screen for cervical cancer. The greatest improvements were observed in responses to questions on knowledge, symptoms and prevention, with some items increasing up to 62% from pre-test to post-test. Of the 123 women reached for follow-up, 50 (40.7%) screened for cervical cancer. This theory-based education intervention significantly increased knowledge of and intention to screen for cervical cancer, and may be replicated in similar settings to promote awareness and increase screening rates. PMID:26703641

  6. The use of theory based semistructured elicitation questionnaires: formative research for CDC's Prevention Marketing Initiative.

    PubMed Central

    Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M

    1996-01-01

    Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153

  7. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs

    PubMed Central

    Bayliss, Elizabeth A.; Powers, J. David; Ellis, Jennifer L.; Barrow, Jennifer C.; Strobel, MaryJo; Beck, Arne

    2016-01-01

    Purpose: Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. Methods: We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ) to develop a predictive model for cost for receiving care in the top 25 percent, then applied cluster analytic techniques to identify different high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. Results: BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0–4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. The largest possible range of predicted probabilities of being in the top 25 percent of cost was 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included those with high morbidity, prior utilization, depression risk and financial constraints; previously uninsured individuals with high morbidity but few financial constraints; and relatively healthy, previously insured individuals with medication needs. Conclusions: Applying sequential predictive modeling and cluster analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery. PMID:27563684
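
    The second stage of the pipeline described above, clustering the predicted high-cost group into subpopulations, can be sketched in miniature. The study clustered on rich BHQ features; the sketch below only clusters invented one-dimensional per-member-per-month costs, to show the idea of splitting the top quartile into distinct subgroups.

```python
def kmeans_1d(values, iters=20):
    """Minimal two-cluster 1-D k-means, seeded at the extremes so the
    result is deterministic (a stand-in for the study's cluster step)."""
    centroids = [min(values), max(values)]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in values:
            nearest = min(range(len(centroids)),
                          key=lambda j: abs(x - centroids[j]))
            clusters[nearest].append(x)
        # Recompute each centroid as its cluster mean (keep it if empty)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Invented per-member-per-month costs for members predicted into the top quartile
costs = [300, 320, 310, 900, 950, 880]
print(kmeans_1d(costs))  # two cost subgroups: [310.0, 910.0]
```

    Each resulting subgroup could then be profiled against the questionnaire responses to suggest targeted interventions.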

  8. Text Messaging as a Method for Health Ministry Leaders to Disseminate Cancer Information.

    PubMed

    Schoenberger, Yu-Mei M; Phillips, Janice M; Mohiuddin, M Omar

    2015-12-01

    Mobile phone-based interventions can play a significant role in decreasing health disparities by enhancing population and individual health. The purpose of this study was to explore health ministry leaders' (HMLs) and congregation members' communication technology usage and to assess the acceptability of mobile technology for delivery of cancer information. Six focus groups were conducted in two urban African-American churches with trained HMLs (n=7) and congregation members (n=37) to determine mobile phone technology usage and identify barriers and facilitators to a mobile phone intervention. All participants were African-American, the majority were female (80% of HMLs; 73% of congregation members), and the mean age was 54 (HMLs) and 41 (congregation members). All of the HMLs and 95% of congregation members reported owning a mobile phone. All HMLs reported sending/receiving text messages, whereas 85% of congregation members sent and 91% received text messages. The facilitators of a text messaging system mentioned by participants included an alternative form of communication, a quick method for disseminating information, and accessibility. The main barriers to using mobile technology reported by both groups were receiving multiple messages, difficulty texting, and cost. Ways to overcome these barriers were explored with participants, and education was the most frequently proposed solution. The findings from this study indicate that HMLs and congregation members are interested in receiving text messages to promote healthy lifestyles and cancer awareness. These findings represent the first step in the development of a mobile phone-based program designed to enhance the work of health ministry leaders. PMID:25355523

  9. Healthcare information systems: data mining methods in the creation of a clinical recommender system

    NASA Astrophysics Data System (ADS)

    Duan, L.; Street, W. N.; Xu, E.

    2011-05-01

    Recommender systems have been extensively studied to present items, such as movies, music and books, that are likely to interest the user. Researchers have indicated that integrated medical information systems are becoming an essential part of modern healthcare systems. Such systems have evolved into integrated enterprise-wide systems; in particular, they are considered a type of enterprise information system, or ERP system, addressing the needs of the healthcare industry sector. As part of these efforts, nursing care plan recommender systems can provide clinical decision support, nursing education and clinical quality control, and serve as a complement to existing practice guidelines. We propose to use correlations among nursing diagnoses, outcomes and interventions to create a recommender system for constructing nursing care plans. In the current study, we used nursing diagnosis data to develop the methodology. Our system utilises a prefix-tree structure common in itemset mining to construct a ranked list of suggested care plan items based on previously-entered items. Unlike common commercial systems, our system makes sequential recommendations based on user interaction, modifying a ranked list of suggested items at each step in care plan construction. We rank items based on traditional association-rule measures such as support and confidence, as well as a novel measure that anticipates which selections might improve the quality of future rankings. Since the multi-step nature of our recommendations presents problems for traditional evaluation measures, we also present a new evaluation method based on average ranking position and use it to test the effectiveness of different recommendation strategies.
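
    The support/confidence ranking step described above can be sketched compactly. This is a simplified illustration, not the authors' system: the care-plan items and historical plans are invented, and the prefix-tree storage and the novel look-ahead measure are omitted; only the classic rule measures remain.

```python
# Invented historical care plans standing in for previously entered item sets
plans = [
    {"pain", "mobility", "education"},
    {"pain", "mobility"},
    {"pain", "education"},
    {"mobility", "education"},
    {"pain", "mobility"},
]

def support(itemset):
    """Fraction of historical plans containing every item in the set."""
    return sum(itemset <= p for p in plans) / len(plans)

def recommend(entered, candidates):
    """Rank each candidate item by the confidence of the rule entered -> item."""
    base = support(entered)
    scored = []
    for item in candidates - entered:
        conf = support(entered | {item}) / base if base else 0.0
        scored.append((item, conf))
    return sorted(scored, key=lambda t: t[1], reverse=True)

items = {"pain", "mobility", "education"}
for item, conf in recommend({"pain"}, items):
    print(item, round(conf, 2))  # mobility 0.75, then education 0.5
```

    The sequential behaviour of the real system corresponds to calling `recommend` again after each user selection, with the chosen item added to `entered`.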

  10. The JPL Tropical Cyclone Information System: Methods for Creating Near Real-Time Science Data Portals

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Li, P.; Vu, Q.; Hristova-Veleva, S. M.; Turk, F. J.; Shen, T.; Poulsen, W. L.; Lambrigtsen, B.

    2013-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The JPL TCIS was made public in 2008 and initially served as a data and plot archive for past storms. More recently, the TCIS has expanded its functionality to provide near real-time (NRT) data portals for specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign in 2010 and ongoing Hurricane and Severe Storm Sentinel (HS3) campaign. These NRT portals allow campaign team members to look at current conditions in the geographical domain of interest. Creating the NRT portals has been particularly challenging due to (1) the wide breadth of data that needs to be collected, (2) the number of data product plots that need to be served to the user, (3) the mechanics of the search and discovery tools, and (4) the issue of how to display multiple data plots at once in a meaningful way. Recently, the TCIS team has been working to redevelop the NRT portals with these challenges in mind. The new architecture we created allows for configurable mission portals that can be created on the fly. In addition to a new database that handles portal configuration, these updated NRT portals also support an improved navigation method that allows users to see what data is available, as well as a resizable visualization area based on the users' client. The integration of the NRT portal with the NASA Earth Observing System Simulators Suite (NEOS3) and a set of new online data analysis tools allows users to compare the observation and model outputs directly and perform statistical analysis with multiple datasets. In this poster, we will present the methods and practices we used to create configurable portals, gather and plot science data with low latencies, design a navigation scheme that supports multiple

  11. A Causal Modelling Approach to the Development of Theory-Based Behaviour Change Programmes for Trial Evaluation

    ERIC Educational Resources Information Center

    Hardeman, Wendy; Sutton, Stephen; Griffin, Simon; Johnston, Marie; White, Anthony; Wareham, Nicholas J.; Kinmonth, Ann Louise

    2005-01-01

    Theory-based intervention programmes to support health-related behaviour change aim to increase health impact and improve understanding of mechanisms of behaviour change. However, the science of intervention development remains at an early stage. We present a causal modelling approach to developing complex interventions for evaluation in…

  12. Effects of a Theory-Based Feedback and Consultation Process on Instruction and Learning in College Classrooms

    ERIC Educational Resources Information Center

    Hampton, Scott E.; Reiser, Robert A.

    2004-01-01

    This study examined how midterm student ratings feedback provided to teaching assistants via a theory-based ratings instrument, combined with consultation on instructional practices, would affect teaching practices, ratings of teaching effectiveness, and student learning and motivation. The student ratings instrument that was employed focused on a…

  13. Information Methods of Human and Veterinary Medical Scientists (HVMS) in Borno State, Nigeria.

    ERIC Educational Resources Information Center

    Nweke, Ken M. C.

    1995-01-01

    Describes results of a survey of human and veterinary medical scientists in Borno State (Nigeria) that was conducted to determine their information-seeking behavior and to examine sources of information used. Problems in information gathering are discussed, including lack of relevant sources, and suggestions for improvements in information…

  14. The Evolution of Library Instruction Delivery in the Chemistry Curriculum Informed by Mixed Assessment Methods

    ERIC Educational Resources Information Center

    Mandernach, Meris A.; Shorish, Yasmeen; Reisner, Barbara A.

    2014-01-01

    As information continues to evolve over time, the information literacy expectations for chemistry students also change. This article examines transformations to an undergraduate chemistry course that focuses on chemical literature and information literacy and is co-taught by a chemistry professor and a chemistry librarian. This article also…

  15. Economic valuation of informal care: lessons from the application of the opportunity costs and proxy good methods.

    PubMed

    van den Berg, Bernard; Brouwer, Werner; van Exel, Job; Koopmanschap, Marc; van den Bos, Geertrudis A M; Rutten, Frans

    2006-02-01

    This paper reports the results of applying the opportunity costs and proxy good methods to determine a monetary value of informal care. We developed a survey in which we asked informal caregivers in The Netherlands to indicate the different types of time forgone (paid work, unpaid work and leisure) in order to be able to provide care. Moreover, we asked informal caregivers how much time they spent on a list of 16 informal care tasks during the week before the interview. Data were obtained from surveys in two different populations: informal caregivers and their care recipients with stroke and with rheumatoid arthritis (RA). A total of 218 care recipients with stroke and their primary informal caregivers completed a survey, as did 147 caregivers and their care recipients with RA. For both methods, measuring care time proved more problematic than valuing it, especially for the opportunity costs method and for the housework component of the proxy good method. More precise guidelines are necessary for the consistent application of both methods in order to ensure comparability of results across economic evaluations of health care.
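
    The arithmetic behind the two valuation methods is straightforward and can be made concrete with a worked example. All hours and hourly rates below are invented for illustration; the study derived them from caregiver surveys.

```python
# Hypothetical week of time forgone by one caregiver (hours)
hours = {"paid_work_forgone": 10, "unpaid_work_forgone": 6, "leisure_forgone": 4}
net_wage = 18.0       # caregiver's own hourly net wage (invented)
shadow_price = 9.0    # shadow price applied to unpaid work and leisure (invented)
market_rate = 12.5    # hourly rate of a professional substitute (invented)

# Opportunity costs method: each hour is valued at what the caregiver gave up
opportunity = (hours["paid_work_forgone"] * net_wage
               + (hours["unpaid_work_forgone"]
                  + hours["leisure_forgone"]) * shadow_price)

# Proxy good method: every care hour is valued at the market substitute's rate
proxy = sum(hours.values()) * market_rate

print(opportunity)  # 10*18 + 10*9 = 270.0
print(proxy)        # 20*12.5   = 250.0
```

    The divergence between the two totals for the same caregiver is exactly why the paper calls for more precise application guidelines.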

  16. [Guideline for the Development of Evidence-based Patient Information: insights into the methods and implementation of evidence-based health information].

    PubMed

    Lühnen, Julia; Albrecht, Martina; Hanßen, Käthe; Hildebrandt, Julia; Steckelberg, Anke

    2015-01-01

    The "Guideline for the Development of Evidence-based Patient Information" project is a novel undertaking. The aim of this project is to enhance the quality of health information. The development and implementation process is guided by national and international standards. Involvement of health information developers plays an essential role. This article provides an insight into the guideline's underlying methodology, using graphics as an example. In addition, the results of a qualitative study exploring the competencies of health information developers are presented. These results will guide the implementation of the guideline. We conducted systematic literature searches (until June 2014), critical appraisal and descriptive analyses applying GRADE for two selected guideline questions. Out of 3,287 hits, 11 RCTs were included in the analysis. The evidence has been rated to be of low to moderate quality. Additional graphics may have a positive effect on cognitive outcomes. However, the relevance of the results is questionable. For graphics, we found some indication that especially pictograms but also bar graphs have a positive effect on cognitive outcomes and meet patients' preferences. In order to prepare for the implementation of the guideline, we conducted a qualitative study to explore the competencies of health information developers using expert interviews. Four telephone interviews were conducted, audio recorded, transcribed and analysed according to Grounded Theory. Six categories were identified: literature search, development of health information, participation of target groups, continuing education and further training of health information developers, cooperation with different institutions, and essential competencies. Levels of competencies regarding the methods of evidence-based medicine and evidence-based health information vary considerably and indicate a need for training. These results have informed the development of a training programme that will support the

  17. Conceptual database modeling: a method for enabling end users (radiologists) to understand and develop their information management applications.

    PubMed

    Hawkins, H; Young, S K; Hubert, K C; Hallock, P

    2001-06-01

    As medical technology advances at a rapid pace, clinicians become further and further removed from the design of their own technological tools. This is particularly evident with information management. For radiologists, clinical histories, patient reports, and other pertinent information require sophisticated tools for data handling. However, as databases grow more powerful and sophisticated, systems require the expertise of programmers and information technology personnel. The radiologist, the clinician end-user, must maintain involvement in the development of system tools to ensure effective information management. Conceptual database modeling is a design method that serves to bridge the gap between the technological aspects of information management and its clinical applications. Conceptual database modeling involves developing information systems in simple language so that anyone can have input into the overall design. This presentation describes conceptual database modeling, using object role modeling, as a means by which end-users (clinicians) may participate in database development.

  18. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, M.A.

    1997-01-07

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.

  19. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, Michael A.

    1997-01-01

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.
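
    The digit-timed collision resolution described in this patent can be modeled roughly as successive rounds of slotting by serial-number prefix: a tag whose prefix is unique in its round answers clearly; colliding tags extend the prefix by one more digit and retry. The sketch below is a toy model with invented serial numbers; it collapses all the RF timing, carrier, and verification details into simple rounds, and assumes (as the patent does) that serial numbers are unique.

```python
def resolve(tags):
    """Toy model of digit-timed collision resolution: each tag answers in a
    slot chosen by a prefix of its serial number; after a collision, the
    colliding tags extend the prefix by one digit and try again.
    Serial numbers are assumed unique, so the loop always terminates."""
    read, pending, depth = set(), list(tags), 1
    while pending:
        slots = {}
        for tag in pending:
            slots.setdefault(tag[:depth], []).append(tag)  # slot = prefix
        for group in slots.values():
            if len(group) == 1:          # sole occupant: a clear response
                read.add(group[0])
        # Tags that collided retry with a longer prefix next round
        pending = [t for g in slots.values() if len(g) > 1 for t in g]
        depth += 1
    return read

tags = ["4021", "4087", "1133", "9902", "9911"]
print(sorted(resolve(tags)))  # all five tags read without permanent collision
```

    In the worst case the whole serial number is consumed, which matches the patent's guarantee that uniqueness ensures a final clear time slot.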

  20. Similarity landscapes: An improved method for scientific visualization of information from protein and DNA database searches

    SciTech Connect

    Dogget, N.; Myers, G.; Wills, C.J.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The authors have used computer simulations and examination of a variety of databases to answer a wide range of evolutionary questions. The authors have found that there is a clear distinction in the evolution of HIV-1 and HIV-2, with the former and more virulent virus evolving more rapidly at a functional level. The authors have discovered highly non-random patterns in the evolution of HIV-1 that can be attributed to a variety of selective pressures. In the course of examining microsatellite DNA (short repeat regions) in microorganisms, the authors have found clear differences between prokaryotes and eukaryotes in their distribution, differences that can be tied to different selective pressures. They have developed a new method (topiary pruning) for enhancing the phylogenetic information contained in DNA sequences. Most recently, the authors have discovered effects in complex rainforest ecosystems that indicate strong frequency-dependent interactions between host species and their parasites, leading to the maintenance of ecosystem variability.

  1. Physicians' Preferred Learning Methods and Sources of Information. Do Self-Identified Independent Learners Differ from Course Participants?

    ERIC Educational Resources Information Center

    Ferguson, Kristi J.; Caplan, Richard M.

    1987-01-01

    To determine whether self-identified independent learners differed significantly from their colleagues regarding preferred learning methods or sources of information, this study assessed physicians who scheduled independent learning activities and physicians who attended a traditional refresher course. Both groups rated learning methods and…

  2. A gene-based information gain method for detecting gene-gene interactions in case-control studies.

    PubMed

    Li, Jin; Huang, Dongli; Guo, Maozu; Liu, Xiaoyan; Wang, Chunyu; Teng, Zhixia; Zhang, Ruijie; Jiang, Yongshuai; Lv, Hongchao; Wang, Limei

    2015-11-01

    Currently, most methods for detecting gene-gene interactions (GGIs) in genome-wide association studies are divided into SNP-based methods and gene-based methods. Generally, gene-based methods can be more powerful than SNP-based methods. Some gene-based entropy methods can only capture the linear relationship between genes. We therefore proposed a nonparametric gene-based information gain method (GBIGM) that can capture both the linear relationship and the nonlinear correlation between genes. Through simulations with different odds ratios, sample sizes and prevalence rates, GBIGM was shown to be valid and more powerful than the classic KCCU method and the SNP-based entropy method. In the analysis of data from 17 genes on rheumatoid arthritis, GBIGM was more effective than the other two methods, as it obtained fewer significant results, which is important for biological verification. Therefore, GBIGM is a suitable and powerful tool for detecting GGIs in case-control studies.
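
    The core quantity behind an information gain method can be shown with a small worked example. This is generic entropy-based information gain on invented XOR-style genotype data, not the authors' exact GBIGM statistic (which operates on whole genes and uses permutation testing); it illustrates why a joint feature can detect an interaction that each feature misses alone.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Drop in label entropy after conditioning on a feature."""
    n = len(labels)
    groups = {}
    for f, y in zip(feature, labels):
        groups.setdefault(f, []).append(y)
    conditional = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - conditional

# Invented XOR-style interaction: neither gene predicts case/control alone,
# but the joint genotype separates the classes perfectly.
gene_a = ["A0", "A0", "A1", "A1", "A0", "A0", "A1", "A1"]
gene_b = ["B0", "B1", "B0", "B1", "B0", "B1", "B0", "B1"]
status = [0, 1, 1, 0, 0, 1, 1, 0]        # 1 = case, 0 = control
pair = [a + "/" + b for a, b in zip(gene_a, gene_b)]

print(information_gain(gene_a, status))  # 0.0 (no marginal effect)
print(information_gain(gene_b, status))  # 0.0 (no marginal effect)
print(information_gain(pair, status))    # 1.0 (pure interaction)
```

    A gene-based method generalizes this by pooling the SNPs of each gene into the conditioning feature before computing the gain.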

  3. Cervical Cancer Screening among University Students in South Africa: A Theory Based Study

    PubMed Central

    Hoque, Muhammad Ehsanu; Ghuman, Shanaz; Coopoosmay, Roger; Van Hal, Guido

    2014-01-01

    Introduction Cervical cancer is a serious public health problem in South Africa. Even though the screening is free in health facilities in South Africa, the Pap smear uptake is very low. The objective of the study is to investigate the knowledge and beliefs of female university students in South Africa. Methods A cross sectional study was conducted among university women in South Africa to elicit information about knowledge and beliefs, and screening history. Results A total of 440 students completed the questionnaire. The average age of the participants was 20.39 years (SD = 1.71 years). Regarding cervical cancer, 55.2% (n = 243) had ever heard about it. Results indicated that only 15% (22/147) of the students who had ever had sex and had heard about cervical cancer had taken a Pap test. Pearson correlation analysis showed that cervical cancer knowledge had a significantly negative relationship with barriers to cervical cancer screening. Susceptibility and seriousness score were significantly moderately correlated with benefit and motivation score as well as barrier score. Self-efficacy score also had a moderate correlation with benefit and motivation score. Students who had had a Pap test showed a significantly lower score in barriers to being screened compared to students who had not had a Pap test. Conclusion This study showed that educated women in South Africa lack complete information on cervical cancer. Students who had had a Pap test had significantly lower barriers to cervical cancer screening than those students who had not had a Pap test. PMID:25387105

  4. Knowledge-based method for determining the meaning of ambiguous biomedical terms using information content measures of similarity.

    PubMed

    McInnes, Bridget T; Pedersen, Ted; Liu, Ying; Melton, Genevieve B; Pakhomov, Serguei V

    2011-01-01

    In this paper, we introduce a novel knowledge-based word sense disambiguation method that determines the sense of an ambiguous word in biomedical text using semantic similarity or relatedness measures. These measures quantify the degree of similarity between concepts in the Unified Medical Language System (UMLS). The objective of this work was to develop a method that can disambiguate terms in biomedical text by exploiting similarity information extracted from the UMLS and to evaluate the efficacy of information content-based semantic similarity measures, which augment path-based information with probabilities derived from biomedical corpora. We show that information content-based measures obtain a higher disambiguation accuracy than path-based measures because they weight the path based on where it exists in the taxonomy coupled with the probability of the concepts occurring in a corpus of text.
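
    The information-content idea in the record above can be made concrete with a small sketch: IC(c) = -log p(c), and Resnik similarity is the IC of the two concepts' most informative common ancestor. The taxonomy, concept names, and probabilities below are all invented for illustration (the paper uses the UMLS and corpus-derived probabilities); Resnik's measure stands in for the family of IC-based measures evaluated.

```python
from math import log

# Tiny invented is-a taxonomy (child -> parent) standing in for the UMLS
parent = {
    "cold_illness": "disorder", "cold_temp": "sensation",
    "disorder": "root", "sensation": "root",
    "flu": "cold_illness",
}
# Invented corpus-derived concept probabilities
prob = {"root": 1.0, "disorder": 0.4, "sensation": 0.3,
        "cold_illness": 0.1, "cold_temp": 0.2, "flu": 0.05}

def ancestors(c):
    """The concept itself plus everything above it in the taxonomy."""
    out = {c}
    while c in parent:
        c = parent[c]
        out.add(c)
    return out

def ic(c):
    """Information content: rarer concepts carry more information."""
    return -log(prob[c])

def resnik(a, b):
    """IC of the most informative common ancestor of a and b."""
    return max(ic(c) for c in ancestors(a) & ancestors(b))

# Disambiguate "cold" by picking the sense most similar to a context concept
senses = ["cold_illness", "cold_temp"]
best = max(senses, key=lambda s: resnik(s, "flu"))
print(best)  # cold_illness
```

    Path-based measures would count edges instead; weighting the shared ancestor by its corpus probability is what the paper credits for the accuracy gain.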

  5. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.

  6. Integrating Safety Assessment Methods using the Risk Informed Safety Margins Characterization (RISMC) Approach

    SciTech Connect

    Curtis Smith; Diego Mandelli

    2013-03-01

Safety is central to the design, licensing, operation, and economics of nuclear power plants (NPPs). As the current light water reactor (LWR) NPPs age beyond 60 years, there are possibilities for increased frequency of systems, structures, and components (SSC) degradations or failures that initiate safety significant events, reduce existing accident mitigation capabilities, or create new failure modes. Plant designers commonly “over-design” portions of NPPs and provide robustness in the form of redundant and diverse engineered safety features to ensure that, even in the case of well-beyond design basis scenarios, public health and safety will be protected with a very high degree of assurance. This form of defense-in-depth is a reasoned response to uncertainties and is often referred to generically as “safety margin.” Historically, specific safety margin provisions have been formulated primarily based on engineering judgment backed by a set of conservative engineering calculations. The ability to better characterize and quantify safety margin is important to improved decision making about LWR design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development (R&D) in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the RISMC Pathway provides methods and tools that enable mitigation options known as margins management strategies. The purpose of the RISMC Pathway R&D is to support plant decisions for risk-informed

  7. Number Needed to Benefit From Information (NNBI): Proposal From a Mixed Methods Research Study With Practicing Family Physicians

    PubMed Central

    Pluye, Pierre; Grad, Roland M.; Johnson-Lafleur, Janique; Granikov, Vera; Shulha, Michael; Marlow, Bernard; Ricarte, Ivan Luiz Marques

    2013-01-01

PURPOSE We wanted to describe family physicians’ use of information from an electronic knowledge resource for answering clinical questions, and their perception of subsequent patient health outcomes; and to estimate the number needed to benefit from information (NNBI), defined as the number of patients for whom clinical information was retrieved for 1 to benefit. METHODS We undertook a mixed methods research study, combining quantitative longitudinal and qualitative research studies. Participants were 41 family physicians from primary care clinics across Canada. Physicians were given access to 1 electronic knowledge resource on a handheld computer in 2008–2009. For the outcome assessment, participants rated their searches using a validated method. Rated searches were examined during interviews guided by log reports that included ratings. Cases were defined as clearly described searches where clinical information was used for a specific patient. For each case, interviewees described information-related patient health outcomes. For the mixed methods data analysis, quantitative and qualitative data were merged into clinical vignettes (each vignette describing a case). We then estimated the NNBI. RESULTS In 715 of 1,193 searches for information conducted during an average of 86 days, the search objective was directly linked to a patient. Of those searches, 188 were considered to be cases. In 53 cases, participants associated the use of information with at least 1 patient health benefit. This finding suggested an NNBI of 14 (715/53). CONCLUSION The NNBI may be used in further experimental research to compare electronic knowledge resources. A low NNBI can encourage clinicians to search for information more frequently. If all searches had benefits, the NNBI would be 1. In addition to patient benefits, learning and knowledge reinforcement outcomes are frequently reported. PMID:24218380
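The NNBI arithmetic is straightforward; a minimal sketch using the figures reported above, rounding up in the style of number-needed-to-treat:

```python
import math

def nnbi(patient_linked_searches, benefit_cases):
    """NNBI: patient-linked searches per one reported benefit,
    rounded up in the style of number-needed-to-treat."""
    return math.ceil(patient_linked_searches / benefit_cases)

# Figures from the study: 715 searches linked to a patient, 53 with
# at least one reported health benefit -> NNBI = ceil(715/53) = 14.
print(nnbi(715, 53))  # 14
```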

  8. Using Pop Culture to Teach Information Literacy: Methods to Engage a New Generation

    ERIC Educational Resources Information Center

    Behen, Linda D.

    2006-01-01

    Building on the information needs and the learning style preferences of today's high school students, the author builds a case for using pop culture (TV shows, fads, and current technology) to build integrated information skills lessons for students. Chapters include a rationale, a review of the current literature, and examples of units of study…

  9. Emerging Information Literacy and Research-Method Competencies in Urban Community College Psychology Students

    ERIC Educational Resources Information Center

    Wolfe, Kate S.

    2015-01-01

    This article details an assignment developed to teach students at urban community colleges information-literacy skills. This annotated bibliography assignment introduces students to library research skills, helps increase information literacy in beginning college students, and helps psychology students learn research methodology crucial in…

  10. Professional Identity Development among Graduate Library and Information Studies Online Learners: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Croxton, Rebecca A.

    2015-01-01

    This study explores how factors relating to fully online Master of Library and Information Studies (MLIS) students' connectedness with peers and faculty may impact their professional identity development as library and information studies professionals. Participants include students enrolled in a fully online MLIS degree program in the…

  11. Consumer Health Information Behavior in Public Libraries: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Yi, Yong Jeong

    2012-01-01

    Previous studies indicated inadequate health literacy of American adults as one of the biggest challenges for consumer health information services provided in public libraries. Little attention, however, has been paid to public users' health literacy and health information behaviors. In order to bridge the research gap, the study aims to…

  12. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  13. Accidental Discovery of Information on the User-Defined Social Web: A Mixed-Method Study

    ERIC Educational Resources Information Center

    Lu, Chi-Jung

    2012-01-01

    Frequently interacting with other people or working in an information-rich environment can foster the "accidental discovery of information" (ADI) (Erdelez, 2000; McCay-Peet & Toms, 2010). With the increasing adoption of social web technologies, online user-participation communities and user-generated content have provided users the…

  14. The Routines-Based Interview: A Method for Gathering Information and Assessing Needs

    ERIC Educational Resources Information Center

    McWilliam, R. A.; Casey, Amy M.; Sims, Jessica

    2009-01-01

    There are multiple ways to gather information from families receiving early intervention services (J. J. Woods & D. P. Lindeman, 2008). In this article, we discuss a specific strategy for doing this through information-gathering conversations with families. The routines-based interview (RBI; R. A. McWilliam, 1992, 2005a) was developed to meet a…

  15. Discriminating Micropathogen Lineages and Their Reticulate Evolution through Graph Theory-Based Network Analysis: The Case of Trypanosoma cruzi, the Agent of Chagas Disease

    PubMed Central

    Arnaud-Haond, Sophie; Moalic, Yann; Barnabé, Christian; Ayala, Francisco José; Tibayrenc, Michel

    2014-01-01

Micropathogens (viruses, bacteria, fungi, parasitic protozoa) share a common trait, which is partial clonality, with wide variance in the respective influence of clonality and sexual recombination on the dynamics and evolution of taxa. The discrimination of distinct lineages and the reconstruction of their phylogenetic history are key information to infer their biomedical properties. However, the phylogenetic picture is often clouded by occasional events of recombination across divergent lineages, limiting the relevance of classical phylogenetic analysis and dichotomic trees. We have applied a network analysis based on graph theory to illustrate the relationships among genotypes of Trypanosoma cruzi, the parasitic protozoan responsible for Chagas disease, to identify major lineages and to unravel their past history of divergence and possible recombination events. At the scale of T. cruzi subspecific diversity, graph theory-based networks applied to 22 isoenzyme loci (262 distinct Multi-Locus-Enzyme-Electrophoresis -MLEE) and 19 microsatellite loci (66 Multi-Locus-Genotypes -MLG) fully confirm the high clustering of genotypes into major lineages or “near-clades”. The release of the dichotomic constraint associated with phylogenetic reconstruction usually applied to Multilocus data allows identifying putative hybrids and their parental lineages. Reticulate topology suggests a slightly different history for some of the main “near-clades”, and a possibly more complex origin for the putative hybrids than hitherto proposed. Finally, the sub-network of the near-clade T. cruzi I (28 MLG) shows a clustering subdivision into three differentiated lesser near-clades (“Russian doll pattern”), which confirms the hypothesis recently proposed by other investigators. The present study broadens and clarifies the hypotheses previously obtained from classical markers on the same sets of data, which demonstrates the added value of this approach. This underlines the

  16. Discriminating micropathogen lineages and their reticulate evolution through graph theory-based network analysis: the case of Trypanosoma cruzi, the agent of Chagas disease.

    PubMed

    Arnaud-Haond, Sophie; Moalic, Yann; Barnabé, Christian; Ayala, Francisco José; Tibayrenc, Michel

    2014-01-01

Micropathogens (viruses, bacteria, fungi, parasitic protozoa) share a common trait, which is partial clonality, with wide variance in the respective influence of clonality and sexual recombination on the dynamics and evolution of taxa. The discrimination of distinct lineages and the reconstruction of their phylogenetic history are key information to infer their biomedical properties. However, the phylogenetic picture is often clouded by occasional events of recombination across divergent lineages, limiting the relevance of classical phylogenetic analysis and dichotomic trees. We have applied a network analysis based on graph theory to illustrate the relationships among genotypes of Trypanosoma cruzi, the parasitic protozoan responsible for Chagas disease, to identify major lineages and to unravel their past history of divergence and possible recombination events. At the scale of T. cruzi subspecific diversity, graph theory-based networks applied to 22 isoenzyme loci (262 distinct Multi-Locus-Enzyme-Electrophoresis -MLEE) and 19 microsatellite loci (66 Multi-Locus-Genotypes -MLG) fully confirm the high clustering of genotypes into major lineages or "near-clades". The release of the dichotomic constraint associated with phylogenetic reconstruction usually applied to Multilocus data allows identifying putative hybrids and their parental lineages. Reticulate topology suggests a slightly different history for some of the main "near-clades", and a possibly more complex origin for the putative hybrids than hitherto proposed. Finally, the sub-network of the near-clade T. cruzi I (28 MLG) shows a clustering subdivision into three differentiated lesser near-clades ("Russian doll pattern"), which confirms the hypothesis recently proposed by other investigators. The present study broadens and clarifies the hypotheses previously obtained from classical markers on the same sets of data, which demonstrates the added value of this approach. This underlines the potential of graph
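The core of such a network analysis can be sketched in a few lines: connect genotypes whose pairwise distance falls at or below a cutoff, then read off the connected components, which play the role of well-separated clusters. All genotype names and allele codes below are invented placeholders; real analyses use dedicated graph tools and much richer statistics.

```python
from collections import deque

# Toy multilocus genotypes (hypothetical allele codes at 4 loci).
genotypes = {
    "g1": (1, 1, 2, 0), "g2": (1, 1, 2, 1), "g3": (1, 2, 2, 1),
    "g4": (5, 4, 0, 3), "g5": (5, 4, 1, 3),
}

def distance(a, b):
    """Number of loci at which two genotypes differ (a simple metric)."""
    return sum(x != y for x, y in zip(a, b))

def build_graph(gtypes, max_dist):
    """Connect genotypes whose pairwise distance is <= max_dist."""
    graph = {name: set() for name in gtypes}
    names = list(gtypes)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if distance(gtypes[a], gtypes[b]) <= max_dist:
                graph[a].add(b)
                graph[b].add(a)
    return graph

def components(graph):
    """Connected components via BFS; analogues of well-separated clusters."""
    seen, comps = set(), []
    for start in graph:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            if node in comp:
                continue
            comp.add(node)
            queue.extend(graph[node] - comp)
        seen |= comp
        comps.append(sorted(comp))
    return sorted(comps)

print(components(build_graph(genotypes, max_dist=1)))
# [['g1', 'g2', 'g3'], ['g4', 'g5']]
```

Note how g1 and g3 land in the same component without being directly linked: the network, unlike a strict dichotomic tree, tolerates such reticulate paths.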

  17. The Swedish strategy and method for development of a national healthcare information architecture.

    PubMed

    Rosenälv, Jessica; Lundell, Karl-Henrik

    2012-01-01

    "We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted the National e-Health - the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for development of National Healthcare Information Architecture is to achieve high level semantic interoperability for clinical content and clinical contexts. High level semantic interoperability requires consistently structured clinical data and other types of data with coherent traceability to be mapped to reference clinical models. Archetypes that are formal definitions of the clinical and demographic concepts and some administrative data were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. Generic clinical process model was made concrete and analyzed. For each decision

  19. SFAPS: an R package for structure/function analysis of protein sequences based on informational spectrum method.

    PubMed

    Deng, Su-Ping; Huang, De-Shuang

    2014-10-01

The R package SFAPS has been developed for structure/function analysis of protein sequences based on the informational spectrum method. The informational spectrum method employs the electron-ion interaction potential parameter as the numerical representation for the protein sequence, and obtains the characteristic frequency of a particular protein interaction after computing the Discrete Fourier Transform for protein sequences. The informational spectrum method is often used to analyze protein sequences, so we developed this software tool, which is implemented as an add-on package to the freely available and widely used statistical language R. Our package is distributed as open source code for Linux, Unix and Microsoft Windows. It is released under the GNU General Public License. The R package along with its source code and additional material are freely available at http://mlsbl.tongji.edu.cn/DBdownload.asp.
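The pipeline the abstract describes (EIIP encoding, then a DFT, then reading off the characteristic frequency) can be sketched outside R as well. The EIIP values below are as commonly tabulated in the literature and should be verified against the original sources; the sequence is an artificial periodic toy chosen so the peak is predictable.

```python
import cmath

# Electron-ion interaction potential (EIIP) values for a few residues,
# as commonly tabulated in the literature (verify before real use).
EIIP = {"A": 0.0373, "G": 0.0050, "L": 0.0000, "K": 0.0371,
        "F": 0.0946, "T": 0.0941, "E": 0.0058, "V": 0.0057}

def informational_spectrum(seq):
    """Encode a sequence as EIIP values and return the DFT power spectrum
    for frequencies 0 .. 0.5 (in cycles per residue)."""
    x = [EIIP[aa] for aa in seq]
    n = len(x)
    spectrum = []
    for k in range(n // 2 + 1):
        coeff = sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                    for j in range(n))
        spectrum.append(abs(coeff) ** 2)
    return spectrum

spec = informational_spectrum("AGLKFTEV" * 2)  # 16 residues, period 8
# The characteristic frequency is the spectrum's peak away from zero;
# an exactly periodic toy sequence peaks at its repeat frequency.
peak = max(range(1, len(spec)), key=spec.__getitem__)
print(peak / 16)  # 0.125, i.e. the 8-residue repeat
```

In the real method the peak frequency is compared across interacting proteins rather than read off a single sequence.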

  20. Scenario-based design: A method for connecting information system design with public health operations and emergency management

    PubMed Central

    Reeder, Blaine; Turner, Anne M

    2011-01-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Methods: Using semi-structured interviews we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Results: Interview analysis identified twenty-five information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create twenty-five scenarios of use and a public health manager persona. Scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. Conclusion: The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and to validate an information system design. PMID:21807120

  1. The deformation behavior of solid polymers and modeling with the viscoplasticity theory based on overstress

    NASA Astrophysics Data System (ADS)

    Khan, Fazeel Jilani

    theory based on overstress (VBO), a state variable model consisting of a set of non-linear differential equations. Material constants and model predictions for HDPE and PPO have been generated. Curved unloading and the aforementioned rate reversal behavior, however, appear to fall beyond the purview of the existing formulation. Potential modifications to the model are discussed.

  2. Comparison of key informant and survey methods for ascertainment of childhood epilepsy in West Bengal, India.

    PubMed

    Pal, D K; Das, T; Sengupta, S

    1998-08-01

Epilepsy is the most important neurological problem in developing countries, with a total of 50 million people worldwide having the condition; 33 million of these are children in developing countries, of whom 90% are untreated. To determine the number of children with active epilepsy in a given developing country community, previous studies have ascertained information either directly from key informants in the community or through more broad-based two-stage surveys. Findings are reported from a study conducted in 46 villages of district 24 Parganas South, a rural district south of Calcutta, comparing the two approaches' sensitivity, efficacy, and costs. Village leaders, health workers, and students were interviewed as key informants, while house-to-house surveys were conducted in 15,000 households. The survey was 4 times as sensitive as the key informant approach, although the approaches had similar positive predictive values. The survey had an absolute sensitivity of 59%. Case identification by key informants strongly predicted successful treatment outcomes. The cost of finding 1 case was US$11 and US$14, and of finding one successful treatment outcome US$35 and US$67 for informants and survey, respectively. The use of key informants was essential to attaining longer-term program objectives. PMID:9758124
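The comparison rests on three simple quantities; a minimal sketch with illustrative counts chosen only to roughly echo the reported ratios, not the study's raw data:

```python
def sensitivity(true_found, all_true):
    """Fraction of all true cases in the community that a method detects."""
    return true_found / all_true

def positive_predictive_value(true_found, all_flagged):
    """Fraction of flagged cases that turn out to be real."""
    return true_found / all_flagged

def cost_per_case(total_cost_usd, true_found):
    """Programme cost divided by the number of true cases found."""
    return total_cost_usd / true_found

# Illustrative counts: a survey finding 60 of 100 true cases vs key
# informants finding 15, at hypothetical programme costs.
print(sensitivity(60, 100), sensitivity(15, 100))            # 0.6 0.15
print(cost_per_case(840, 60), cost_per_case(165, 15))        # 14.0 11.0
```

The sketch makes the trade-off explicit: the survey finds far more cases, but each case found costs more.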

  3. A randomised trial of three methods of giving information about prenatal testing.

    PubMed Central

    Thornton, J. G.; Hewison, J.; Lilford, R. J.; Vail, A.

    1995-01-01

    OBJECTIVE--To test the effect of extra non-directive information about prenatal testing, given individually or in a class. SETTING--Antenatal clinics in a district general hospital and a university hospital. DESIGN--Randomised controlled trial; participants allocated to control group or offer of extra information individually or in class. SUBJECTS--1691 women booking antenatal care before 15 weeks' gestation. INTERVENTIONS--All participants received the usual information about prenatal tests from hospital staff. Individual participants were offered a separate session with a research midwife in which prenatal screening was described in detail. Class participants were offered the same extra information in an early prenatal class. MAIN OUTCOME MEASURES--Attendance at extra information sessions; uptake rates of prenatal tests; levels of anxiety, understanding, and satisfaction with decisions. RESULTS--Attendance at classes was lower than at individual sessions (adjusted odds ratio 0.45; 95% confidence interval 0.35 to 0.58). Ultrasonography was almost universally accepted (99%) and was not affected by either intervention. Uptake of cystic fibrosis testing, high in controls (79%), was lowered in the individual group (0.44; 0.20 to 0.97) and classes (0.39; 0.18 to 0.86). Uptake of screening for Down's syndrome, already low (34%) in controls, was not further depressed by extra information in classes (0.99; 0.70 to 1.39) and was slightly higher in the individual group (1.45; 1.04 to 2.02). Women offered extra information had improved understanding and were more satisfied with information received; satisfaction with decisions about prenatal testing was unchanged. The offer of individual information reduced anxiety later in pregnancy. CONCLUSIONS--Ultrasonography is valued for non-medical reasons and chosen even by fully informed people who eschew prenatal diagnosis. The offer of extra information has no overall adverse effects on anxiety and reduces uptake of blood tests
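Adjusted odds ratios like those above come with 95% confidence intervals; for orientation, here is the unadjusted Wald calculation on a hypothetical 2×2 table. The counts are invented, and the study's own estimates were adjusted, so this is illustrative only.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% CI.
    Intervention arm: a events, b non-events; control: c events, d non-events."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 40/100 accepted a screening test after extra
# information vs 60/100 among controls.
or_, lo, hi = odds_ratio_ci(40, 60, 60, 40)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.44 0.25 0.78
```

A CI entirely below 1, as here, is what statements like "uptake ... was lowered (0.44; 0.20 to 0.97)" express.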

  4. [Visualization and analysis of drug information on adverse reactions using data mining method, and its clinical application].

    PubMed

    Kawakami, Junko

    2014-01-01

Sources of drug information such as package inserts (PIs) and interview forms (IFs) and existing drug information databases provide primarily document-based and numerical information. For this reason, it is not easy to obtain a complete picture of the information concerning many drugs with similar effects or to understand differences among drugs. The visualization of drug information may help provide a large amount of information in a short period, relieve the burden on medical workers, facilitate a comprehensive understanding and comparison of drugs, and contribute to improvements in patients' QOL. At our department, we are developing an approach to convert information on side effects obtained from the PIs of many drugs with similar effects into visual maps reflecting the data structure through competitive learning using the self-organizing map (SOM) technique of Kohonen, which is a powerful method for pattern recognition, to facilitate a grasp of all available information and of the differences among drugs, and to anticipate the appearance of side effects; we are also evaluating the possibility of its clinical application. In this paper, this approach is described by taking the examples of antibiotics, antihypertensive drugs, and diabetes drugs.
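A minimal one-dimensional Kohonen SOM can be sketched as follows. The drug names and binary side-effect profiles are invented placeholders, and the authors' system is far richer; this only shows the competitive-learning core, where the winning node and its neighbours move toward each input so that similar profiles end up on nearby nodes.

```python
import math
import random

def train_som(data, grid_size, epochs=200, lr0=0.5, radius0=None, seed=0):
    """Minimal 1-D self-organizing map: grid nodes hold weight vectors;
    the best-matching node and its neighbours are pulled toward each input."""
    rng = random.Random(seed)
    dim = len(data[0])
    radius0 = radius0 or grid_size / 2
    nodes = [[rng.random() for _ in range(dim)] for _ in range(grid_size)]
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1 - frac)                     # learning rate decays
        radius = max(radius0 * (1 - frac), 0.5)   # neighbourhood shrinks
        x = rng.choice(data)
        bmu = min(range(grid_size),
                  key=lambda i: sum((w - v) ** 2
                                    for w, v in zip(nodes[i], x)))
        for i in range(grid_size):
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
            nodes[i] = [w + lr * h * (v - w) for w, v in zip(nodes[i], x)]
    return nodes

def bmu_of(nodes, x):
    """Index of the node closest to profile x."""
    return min(range(len(nodes)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(nodes[i], x)))

# Hypothetical binary side-effect profiles (1 = listed in the PI).
drugs = {"drugA": [1, 1, 0, 0], "drugB": [1, 1, 0, 1], "drugC": [0, 0, 1, 1]}
som = train_som(list(drugs.values()), grid_size=5)
positions = {name: bmu_of(som, vec) for name, vec in drugs.items()}
# Similar profiles (drugA, drugB) should land on the same or nearby nodes,
# which is what makes the resulting map readable at a glance.
```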

  5. Theory-based analysis of clinical efficacy of triptans using receptor occupancy

    PubMed Central

    2014-01-01

Background Triptans, serotonin 5-HT1B/1D receptor agonists used for the treatment of migraine attacks, exert their action by targeting serotonin 5-HT1B/1D receptors. Presently, 5 different triptans, namely sumatriptan, zolmitriptan, eletriptan, rizatriptan, and naratriptan, are marketed in Japan. In the present study, we retrospectively analyzed the relationships between clinical efficacy (headache relief) in Japanese patients and 5-HT1B/1D receptor occupancy (Φ1B and Φ1D). Receptor occupancies were calculated from both the pharmacokinetic and pharmacodynamic data of triptans. Methods To evaluate the total amount of exposure to drug, we calculated the area under the plasma concentration-time curve (AUCcp) and the areas under the time curves for Φ1B and Φ1D (AUCΦ1B and AUCΦ1D). Moreover, parameters expressing drug transfer and binding rates (Acp, AΦ1B, AΦ1D) were calculated. Results Our calculations showed that Φmax1B and Φmax1D were relatively high, at 32.0-89.4% and 68.4-96.2%, respectively, suggesting that a high occupancy is likely necessary to attain the clinical effect. In addition, the relationships between therapeutic effect and AUCcp, AUCΦ1B, AUCΦ1D, and Acp·AUCcp differed with each drug and administered form, whereas a significant relationship that was not affected by the drug and the form of administration was found between the therapeutic effect and AΦ1B·AUCΦ1B or AΦ1D·AUCΦ1D. Conclusions These results suggest that receptor occupancy can be used as a common index to evaluate the therapeutic effect. We consider that the present findings provide useful information to support the proper use of triptans. PMID:25488888
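The study derives occupancy from full PK/PD time courses; for orientation, the standard single-site competitive-binding relation is shown below. This is a sketch, not the paper's actual calculation, and the concentration and Ki values are invented.

```python
def occupancy(conc_nM, ki_nM):
    """Fractional receptor occupancy under simple 1:1 binding:
    phi = C / (C + Ki), with C the free drug concentration."""
    return conc_nM / (conc_nM + ki_nM)

# Hypothetical numbers: a drug at 20 nM free concentration acting on a
# receptor with Ki = 5 nM occupies 80% of receptors.
print(round(100 * occupancy(20, 5)))  # 80
```

Integrating this quantity over the concentration-time profile is what yields area-under-the-curve occupancy measures of the kind the abstract denotes AUCΦ1B and AUCΦ1D.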

  6. Strategies and methods for aligning current and best medical practices. The role of information technologies.

    PubMed Central

    Schneider, E C; Eisenberg, J M

    1998-01-01

    Rapid change in American medicine requires that physicians adjust established behaviors and acquire new skills. In this article, we address three questions: What do we know about how to change physicians' practices? How can physicians take advantage of new and evolving information technologies that are likely to have an impact on the future practice of medicine? and What strategic educational interventions will best enable physicians to show competencies in information management and readiness to change practice? We outline four guiding principles for incorporating information systems tools into both medical education and practice, and we make eight recommendations for the development of a new medical school curriculum. This curriculum will produce a future medical practitioner who is capable of using information technologies to systematically measure practice performance, appropriateness, and effectiveness while updating knowledge efficiently. PMID:9614787

  7. The effect of theory-based interventions on physical activity participation among overweight/obese individuals: a systematic review.

    PubMed

    Bélanger-Gravel, A; Godin, G; Vézina-Im, L-A; Amireault, S; Poirier, P

    2011-06-01

Little attention has been paid to the evaluation of the long-term impact of theory-based interventions on physical activity participation among overweight/obese individuals after the interventions have ended. The primary aim of this systematic review was to investigate the long-term effectiveness of theory-based interventions for increasing physical activity and identify the most effective techniques for behaviour change among overweight/obese individuals. The secondary aim was to investigate the effect of these interventions on theoretical variables. Eighteen studies were reviewed. Among these studies, three reported significant short-term and two long-term effects of interventions on physical activity participation. Most of the studies observed a significant short- or long-term effect of time on this behaviour. Theoretical frameworks most often applied included the Behavioural Model and the Social Learning/Cognitive Theory. However, few of the studies reported any impact on theoretical variables. The most prevalent techniques consisted of providing opportunities for social comparison and instruction as well as self-monitoring. Leading techniques differentiating the experimental group from the control group included prompting practice, intention formation, and barrier identification. Although the combination of these three techniques appears successful, the long-term impact of theory-based interventions remains ambiguous.

  8. Infobuttons and classification models: a method for the automatic selection of on-line information resources to fulfill clinicians’ information needs

    PubMed Central

    Del Fiol, Guilherme; Haug, Peter J.

    2008-01-01

    Objective Infobuttons are decision support tools that offer links to information resources based on the context of the interaction between a clinician and an electronic medical record (EMR) system. The objective of this study was to explore machine learning and web usage mining methods to produce classification models for the prediction of information resources that might be relevant in a particular infobutton context. Design Classification models were developed and evaluated with an infobutton usage dataset. The performance of the models was measured and compared with a reference implementation in a series of experiments. Measurements Level of agreement (kappa) between the models and the resources that clinicians actually used in each infobutton session. Results The classification models performed significantly better than the reference implementation (p<0.0001). The performance of these models tended to decrease over time, probably due to a phenomenon known as concept drift. However, the performance of the models remained stable when concept drift handling techniques were used. Conclusion The results suggest that classification models are a promising method for the prediction of information resources that a clinician would use to answer patient care questions. PMID:18249041
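Level of agreement between a model's predicted resource and the resource clinicians actually used can be measured with Cohen's kappa; a minimal sketch with hypothetical resource labels:

```python
from collections import Counter

def cohens_kappa(pred, actual):
    """Cohen's kappa: observed agreement between two labelings,
    corrected for the agreement expected by chance."""
    n = len(pred)
    observed = sum(p == a for p, a in zip(pred, actual)) / n
    pc, ac = Counter(pred), Counter(actual)
    expected = sum(pc[k] * ac[k] for k in pc) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical sessions: which resource a model predicted vs which one
# the clinician actually opened.
pred   = ["uptodate", "uptodate", "pubmed", "pubmed", "uptodate", "pubmed"]
actual = ["uptodate", "uptodate", "pubmed", "uptodate", "uptodate", "pubmed"]
print(round(cohens_kappa(pred, actual), 3))  # 0.667
```

Kappa of 1.0 means perfect agreement and 0 means chance-level agreement, which is why it is preferred over raw accuracy when one resource dominates the usage log.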

  9. 75 FR 8817 - Annual Submission of Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... Shortfall Allocation Method ACTION: Final rule. SUMMARY: The Surface Transportation Board (Board) is... Allocation Method (RSAM). RSAM is one of three benchmarks that together are used to determine the... Method, STB Ex Parte No. 646 (Sub-No. 2) (STB served May 11, 2009) (RSAM Taxes). Specifically,...

  10. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers aged 21-35 years who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching for an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high-capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. Conclusions We

  11. Extending Value of Information Methods to Include the Co-Net Benefits of Earth Observations

    NASA Astrophysics Data System (ADS)

    Macauley, M.

    2015-12-01

    The widening relevance of Earth observations information across the spectrum of natural and environmental resources markedly enhances the value of these observations. An example is observations of forest extent, species composition, health, and change; this information can help in assessing carbon sequestration, biodiversity and habitat, watershed management, fuelwood potential, and other ecosystem services, as well as inform the opportunity cost of forest removal for alternative land uses such as agriculture, pasture, or development. These "stacked" indicators or co-net benefits add significant value to Earth observations. In part because of reliance on case studies, much previous research about the value of information from Earth observations has assessed individual applications rather than aggregating across applications, thus tending to undervalue the observations. Aggregating across applications is difficult, however, because it requires common units of measurement; controlling for spatial, spectral, and temporal attributes of the observations; and consistent application of value of information techniques. This paper will discuss general principles of co-net benefit aggregation and illustrate its application to attributing value to Earth observations.

  12. Development of a Simple 12-Item Theory-Based Instrument to Assess the Impact of Continuing Professional Development on Clinical Behavioral Intentions

    PubMed Central

    Légaré, France; Borduas, Francine; Freitas, Adriana; Jacques, André; Godin, Gaston; Luconi, Francesca; Grimshaw, Jeremy

    2014-01-01

    Background Decision-makers in organizations providing continuing professional development (CPD) have identified the need for routine assessment of its impact on practice. We sought to develop a theory-based instrument for evaluating the impact of CPD activities on health professionals' clinical behavioral intentions. Methods and Findings Our multipronged study had four phases. 1) We systematically reviewed the literature for instruments that used socio-cognitive theories to assess healthcare professionals' clinically-oriented behavioral intentions and/or behaviors; we extracted items relating to the theoretical constructs of an integrated model of healthcare professionals' behaviors and removed duplicates. 2) A committee of researchers and CPD decision-makers selected a pool of items relevant to CPD. 3) An international group of experts (n = 70) reached consensus on the most relevant items using electronic Delphi surveys. 4) We created a preliminary instrument with the items found most relevant and assessed its factorial validity, internal consistency and reliability (weighted kappa) over a two-week period among 138 physicians attending a CPD activity. Out of 72 potentially relevant instruments, 47 were analyzed. Of the 1218 items extracted from these, 16% were discarded as improperly phrased and 70% discarded as duplicates. Mapping the remaining items onto the constructs of the integrated model of healthcare professionals' behaviors yielded a minimum of 18 and a maximum of 275 items per construct. The partnership committee retained 61 items covering all seven constructs. Two iterations of the Delphi process produced consensus on a provisional 40-item questionnaire. Exploratory factorial analysis following test-retest resulted in a 12-item questionnaire. Cronbach's coefficients for the constructs varied from 0.77 to 0.85. Conclusion A 12-item theory-based instrument for assessing the impact of CPD activities on health professionals' clinical behavioral

  13. A new EEG synchronization strength analysis method: S-estimator based normalized weighted-permutation mutual information.

    PubMed

    Cui, Dong; Pu, Weiting; Liu, Jing; Bian, Zhijie; Li, Qiuli; Wang, Lei; Gu, Guanghua

    2016-10-01

    Synchronization is an important mechanism for understanding information processing in normal or abnormal brains. In this paper, we propose a new method called normalized weighted-permutation mutual information (NWPMI) for two-variable signal synchronization analysis and combine NWPMI with the S-estimator measure to generate a new method named S-estimator based normalized weighted-permutation mutual information (SNWPMI) for analyzing multi-channel electroencephalographic (EEG) synchronization strength. The performance of the NWPMI, including the effects of time delay, embedding dimension, coupling coefficient, signal-to-noise ratio (SNR) and data length, is evaluated using a coupled Hénon map model. The results show that the NWPMI is superior in describing synchronization compared with the normalized permutation mutual information (NPMI). Furthermore, the proposed SNWPMI method is applied to analyze scalp EEG data from 26 amnestic mild cognitive impairment (aMCI) subjects and 20 age-matched controls with normal cognitive function, all of whom suffer from type 2 diabetes mellitus (T2DM). The proposed NWPMI and SNWPMI methods are suggested to be effective indices for estimating synchronization strength.
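    The ordinal-pattern machinery behind the NWPMI can be sketched in a few lines. The fragment below computes the simpler unweighted variant, normalized permutation mutual information, for two signals; the amplitude weighting and the S-estimator aggregation described in the abstract are omitted, and all function names are illustrative:

```python
import math
from itertools import permutations

import numpy as np

def ordinal_patterns(x, m=3, tau=1):
    """Map a 1-D signal to ordinal-pattern indices (embedding
    dimension m, time delay tau)."""
    lookup = {p: i for i, p in enumerate(permutations(range(m)))}
    n = len(x) - (m - 1) * tau
    return np.array([lookup[tuple(np.argsort(x[t:t + m * tau:tau]))]
                     for t in range(n)])

def npmi(x, y, m=3, tau=1):
    """Normalized permutation mutual information:
    I(X;Y) / sqrt(H(X) * H(Y)) over ordinal-pattern distributions."""
    px, py = ordinal_patterns(x, m, tau), ordinal_patterns(y, m, tau)
    n = min(len(px), len(py))
    k = math.factorial(m)
    joint = np.zeros((k, k))
    for a, b in zip(px[:n], py[:n]):
        joint[a, b] += 1.0
    joint /= n

    def H(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    hx, hy, hxy = H(joint.sum(1)), H(joint.sum(0)), H(joint.ravel())
    return (hx + hy - hxy) / math.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0
```

    For identical signals the index equals 1; for independent signals it approaches 0 as the series lengthens.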

  14. 14 CFR 39.21 - Where can I get information about FAA-approved alternative methods of compliance?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Where can I get information about FAA-approved alternative methods of compliance? 39.21 Section 39.21 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS DIRECTIVES § 39.21 Where can I...

  16. Diagnostic Instability of "DSM-IV" ADHD Subtypes: Effects of Informant Source, Instrumentation, and Methods for Combining Symptom Reports

    ERIC Educational Resources Information Center

    Valo, Shana; Tannock, Rosemary

    2010-01-01

    Using data from 123 children (aged 6-12 years) referred consecutively to a pediatric neuropsychiatry clinic by community physicians for assessment of Attention-Deficit/Hyperactivity Disorder (ADHD) and related problems, we investigated the effects of informant (parent, teacher), tool (interview, rating scale), and method for combining symptom…

  17. Following Experts at Work in Their Own Information Spaces: Using Observational Methods To Develop Tools for the Digital Library.

    ERIC Educational Resources Information Center

    Gorman, Paul; Lavelle, Mary; Delcambre, Lois; Maier, David

    2002-01-01

    Offers an overview of the authors' experience using several observational methods to better understand one class of users, expert clinicians treating patients in hospital settings. Shows the evolution of understanding of the users and their information-handling tasks based on observations made in the field by a multidisciplinary research team, and…

  18. Study on the digitized and quantified evaluating method for super information characteristics of herbal preparation by infrared spectrum fingerprints.

    PubMed

    Sun, Guoxiang; Li, Lifeng; Li, Yanfei; Song, Aihua

    2014-10-01

    This paper aims to establish the infrared spectrum fingerprint (IRFP) in the 4,000-400 cm(-1) absorption region and the first-derivative infrared spectrum fingerprint (d-IRFP) of ginkgo tablets (GT), and to set up a digitized, quantified method for evaluating the super information characteristics of traditional Chinese medicine (TCM) IRFPs, consisting of the IRFP index, information index, fluctuation index, information fluctuation index and the quantified infrared fingerprint method (QIFM). A direct tabletting method was applied to collect the IRFPs of 14 batches of GTs on a Fourier transform infrared spectrometer. In terms of the digitized features, the QIFM and similarity analysis of the d-IRFP, samples S4 and S7 were evaluated as suspected outliers, the quality of S1, S2, S6 and S12 was poorer, and the rest were relatively good. The assessment approach makes the expression and processing of the superposed information in TCM IRFPs digitized, simple and effective. Moreover, applying the QIFM based on the IR technique provides a rapid, simple and accurate test of the total chemical content in the complex TCM system. The quantitative, digitized infrared fingerprinting method is thus established as a novel approach to evaluating TCM quality.

  19. Unpacking (In)formal Learning in an Academic Development Programme: A Mixed-Method Social Network Perspective

    ERIC Educational Resources Information Center

    Rienties, Bart; Hosein, Anesa

    2015-01-01

    How and with whom academics develop and maintain formal and informal networks for reflecting on their teaching practice has received limited attention even though academic development (AD) programmes have become an almost ubiquitous feature of higher education. The primary goal of this mixed-method study is to unpack how 114 academics in an AD…

  20. Sexual Health Information Seeking Online: A Mixed-Methods Study among Lesbian, Gay, Bisexual, and Transgender Young People

    ERIC Educational Resources Information Center

    Magee, Joshua C.; Bigelow, Louisa; DeHaan, Samantha; Mustanski, Brian S.

    2012-01-01

    The current study used a mixed-methods approach to investigate the positive and negative aspects of Internet use for sexual health information among lesbian, gay, bisexual, and transgender (LGBT) young people. A diverse community sample of 32 LGBT young people (aged 16-24 years) completed qualitative interviews focusing on how, where, and when…

  1. Estimation Method of Information Understanding in Communication by Nasal Skin Thermogram

    NASA Astrophysics Data System (ADS)

    Nozawa, Akio; Uchida, Masafumi; Ide, Hideto

    Information understanding in communication is thought to be related to emotion. The purpose of this study is to estimate information understanding from the nasal skin thermogram, which reflects human emotion. Local fractal dimension analysis was applied to the nasal skin thermogram. Based on the relationship between the local fractal dimension and the subjective understanding measured in English listening tests, information understanding was estimated by linear multiple regression. As a result, the estimated understanding clearly separated into two groups according to the difficulty of the test. Additionally, we have shown that the local fractal dimension of the nasal skin thermogram is an individual-independent factor for estimating information understanding.

  2. Novel classification method for remote sensing images based on information entropy discretization algorithm and vector space model

    NASA Astrophysics Data System (ADS)

    Xie, Li; Li, Guangyao; Xiao, Mang; Peng, Lei

    2016-04-01

    Various kinds of remote sensing image classification algorithms have been developed to adapt to the rapid growth of remote sensing data. Conventional methods typically have restrictions in either classification accuracy or computational efficiency. To overcome these difficulties, a new solution for remote sensing image classification is presented in this study. A discretization algorithm based on information entropy is applied to extract features from the data set, and a vector space model (VSM) method is employed as the feature representation algorithm. Because of the simple structure of the feature space, the training rate is accelerated. The performance of the proposed method is compared with two other algorithms: the back propagation neural network (BPNN) method and the ant colony optimization (ACO) method. Experimental results confirm that the proposed method is superior to the other algorithms in terms of classification accuracy and computational efficiency.
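    The entropy-based discretization step can be illustrated with a supervised cut-point search: pick the threshold that minimizes the weighted class entropy of the induced two-bin partition (equivalently, maximizes information gain). A minimal sketch under that assumption, not the authors' exact algorithm:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * np.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Return (cut_point, weighted_child_entropy) for the threshold
    that minimizes class entropy of the induced two-bin partition."""
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    n = len(v)
    best = (None, entropy(y))          # no cut beats the parent entropy yet
    for i in range(1, n):
        if v[i] == v[i - 1]:
            continue                   # only cut between distinct values
        h = (i / n) * entropy(y[:i]) + ((n - i) / n) * entropy(y[i:])
        if h < best[1]:
            best = ((v[i - 1] + v[i]) / 2, h)
    return best
```

    Applying the search recursively to each resulting bin (with a stopping rule such as MDL) yields a full multi-interval discretization.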

  3. Knowledge and information needs of young people with epilepsy and their parents: Mixed-method systematic review

    PubMed Central

    2010-01-01

    Background Young people with neurological impairments such as epilepsy are known to receive less adequate services compared to young people with other long-term conditions. The time (age 13-19 years) around transition to adult services is particularly important in facilitating young people's self-care and ongoing management. There are epilepsy-specific, biological and psycho-social factors that act as barriers and enablers to information exchange and nurturing of self-care practices. Review objectives were to identify what is known to be effective in delivering information to young people age 13-19 years with epilepsy and their parents, to describe their experiences of information exchange in healthcare contexts, and to identify factors influencing positive and negative healthcare communication. Methods The Evidence for Policy and Practice Information Coordinating Centre systematic mixed-method approach was adapted to locate, appraise, extract and synthesise evidence. We used Ley's cognitive hypothetical model of communication and subsequently developed a theoretical framework explaining information exchange in healthcare contexts. Results Young people and parents believed that healthcare professionals were only interested in medical management. Young people felt that discussions about their epilepsy primarily occurred between professionals and parents. Epilepsy information that young people obtained from parents or from their own efforts increased the risk of epilepsy misconceptions. Accurate epilepsy knowledge aided psychosocial adjustment. There is some evidence that interventions, when delivered in a structured, psycho-educational, age-appropriate way, increased young people's epilepsy knowledge, with a positive trend toward improving quality of life. We used mainly qualitative and mixed-method evidence to develop a theoretical framework explaining information exchange in clinical encounters. Conclusions There is a paucity of evidence reporting effective interventions

  4. Methods of extending signatures and training without ground information. [data processing, pattern recognition

    NASA Technical Reports Server (NTRS)

    Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.

    1975-01-01

    Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large-area wheat surveys. Two methods were developed: ASC and MASC. Two other methods, Ratio and RADIFF, previously used with aircraft data, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.

  5. The Use and Misuse of Anthropological Methods in Library and Information Science Research.

    ERIC Educational Resources Information Center

    Sandstrom, Alan R.; Sandstrom, Pamela Effrein

    1995-01-01

    Notes an antiscientific bias in writings that proclaim the value of qualitative approaches for studying information problems. Discusses the following methodological concerns: (1) scientific versus nonscientific traditions; (2) distinction between emic and etic perspectives; (3) artificial divide between qualitative and quantitative techniques; (4)…

  6. Pathfinding in the Research Forest: The Pearl Harvesting Method for Effective Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert

    2006-01-01

    Knowledge of empirical research has become important for everyone involved in education and special education. Policy, practice, and informed reporting rely on locating and understanding unfiltered, original source material. Although access to vast amounts of research has been greatly facilitated by online databases, such as ERIC and PsychInfo,…

  7. Graph-Based Weakly-Supervised Methods for Information Extraction & Integration

    ERIC Educational Resources Information Center

    Talukdar, Partha Pratim

    2010-01-01

    The variety and complexity of potentially-related data resources available for querying--webpages, databases, data warehouses--has been growing ever more rapidly. There is a growing need to pose integrative queries "across" multiple such sources, exploiting foreign keys and other means of interlinking data to merge information from diverse…

  8. Multifunction extension of simplex optimization method for mutual information-based registration of ultrasound volumes

    NASA Astrophysics Data System (ADS)

    Zagrodsky, Vladimir; Shekhar, Raj; Cornhill, J. Fredrick

    2001-07-01

    Mutual information has been demonstrated to be an accurate and reliable criterion function to perform registration of medical data. Due to speckle noise, ultrasound volumes do not provide a smooth mutual information function. Consequently the optimization technique used must be robust enough to avoid local maxima and converge on the desired global maximum. While the well-known downhill simplex optimization uses a single criterion function, our extension to multi-function optimization uses three criterion functions, namely mutual information computed at three levels of intensity quantization and hence three degrees of noise suppression. Registration was performed with rigid as well as simple non-rigid transformation modes for real-time 3D ultrasound datasets of the left ventricle. Pairs of frames corresponding to the most stationary end-diastolic cardiac phase were chosen, and an initial misalignment was artificially introduced between them. The multi-function simplex optimization reduced the failure rate by a factor of two in comparison to the standard simplex optimization, while the average accuracy for the successful cases was unchanged. A more robust registration resulted from the parallel use of criterion functions. The additional computational cost was negligible, as each of the three implementations of the mutual information used the same joint histogram and required no extra spatial transformation.
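    The criterion itself is straightforward to sketch: estimate mutual information from a joint intensity histogram, then evaluate it at several quantization levels to obtain the multiple criterion functions the abstract describes. A minimal illustration, with function names and level choices assumed:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information (bits) of two intensity arrays, estimated
    from their joint histogram at the given quantization level."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                      # skip empty cells (0 * log 0 = 0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])))

def mi_at_levels(a, b, levels=(16, 32, 64)):
    """One criterion per quantization level: coarser histograms smooth
    speckle noise at the cost of intensity resolution."""
    return {k: mutual_information(a, b, k) for k in levels}
```

    In a registration loop, each candidate transformation would be scored by these functions and the optimizer would seek the alignment maximizing them jointly.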

  9. The Hybrid Application of an Inductive Learning Method and a Neural Network for Intelligent Information Retrieval.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.; And Others

    1995-01-01

    Proposes an information retrieval system based on a hybrid model consisting of an inductive learning and neural network system. Evaluates the system's responses to incomplete queries and inconsistent indexing. Query terms, discriminant descriptors, and the American Documentation Institute's query and document titles used in the evaluation are…

  10. Query Methods in Information Retrieval--Criteria for Selection and Application.

    ERIC Educational Resources Information Center

    Goldenson, A.F.; Cardwell, D.W.

    This report studies and classifies, according to distinct user-oriented features, the various computer-aided systems developed for storage and retrieval of information in a wide range of fields. Such features are described and evaluated to determine the nature of characteristics that have a strong bearing on the relative success of various query…

  11. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... environment. The EPA is developing guidelines for the assessment of cumulative risk as defined and..., characterization, and possible quantification of the combined risks to health or the environment from multiple... development of regulations and permits. This notice solicits information and citations pertaining...

  12. 42 CFR 423.888 - Payment methods, including provision of necessary information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... historical data and generally accepted actuarial principles) of the difference between such gross costs and... necessary information. (a) Basis. The provisions of § 423.301 through § 423.343, including requirements to... annual basis, as elected by the plan sponsor under guidance specified by CMS, unless CMS determines...

  13. An Informal Reading Readiness Inventory: A Diagnostic Method of Predicting First Grade Reading Achievement.

    ERIC Educational Resources Information Center

    Anderson, Carolyn C.; Koenke, Karl

    A study was undertaken to create and validate a diagnostic, task-based Informal Reading Readiness Inventory (IRRI). IRRI subtests were created to reflect four areas found to be important in reading readiness: awareness of self and media, language experience, reasoning, and phonics. Prereading curriculum components formed the basis for test item…

  14. A Hierarchy Fuzzy MCDM Method for Studying Electronic Marketing Strategies in the Information Service Industry.

    ERIC Educational Resources Information Center

    Tang, Michael T.; Tzeng, Gwo-Hshiung

    In this paper, the impacts of Electronic Commerce (EC) on the international marketing strategies of information service industries are studied. In seeking to blend humanistic concerns in this research with technological development by addressing challenges for deterministic attitudes, the paper examines critical environmental factors relevant to…

  15. Bootstrap rank-ordered conditional mutual information (broCMI): A nonlinear input variable selection method for water resources modeling

    NASA Astrophysics Data System (ADS)

    Quilty, John; Adamowski, Jan; Khalil, Bahaa; Rathinasamy, Maheswaran

    2016-03-01

    The input variable selection problem has recently garnered much interest in the time series modeling community, especially within water resources applications, demonstrating that information theoretic (nonlinear)-based input variable selection algorithms such as partial mutual information (PMI) selection (PMIS) provide an improved representation of the modeled process when compared to linear alternatives such as partial correlation input selection (PCIS). PMIS is a popular algorithm for water resources modeling problems considering nonlinear input variable selection; however, this method requires the specification of two nonlinear regression models, each with parametric settings that greatly influence the selected input variables. Other attempts to develop input variable selection methods using conditional mutual information (CMI) (an analog to PMI) have been formulated under different parametric pretenses such as k nearest-neighbor (KNN) statistics or kernel density estimates (KDE). In this paper, we introduce a new input variable selection method based on CMI that uses a nonparametric multivariate continuous probability estimator based on Edgeworth approximations (EA). We improve the EA method by considering the uncertainty in the input variable selection procedure by introducing a bootstrap resampling procedure that uses rank statistics to order the selected input sets; we name our proposed method bootstrap rank-ordered CMI (broCMI). We demonstrate the superior performance of broCMI when compared to CMI-based alternatives (EA, KDE, and KNN), PMIS, and PCIS input variable selection algorithms on a set of seven synthetic test problems and a real-world urban water demand (UWD) forecasting experiment in Ottawa, Canada.
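    The rank-ordering idea can be sketched with a simple plug-in CMI estimator standing in for the paper's Edgeworth approximation. The fragment below discretizes the variables, computes I(X;Y|Z) from joint entropies, and averages each candidate's rank over bootstrap resamples; all names and the histogram estimator are illustrative simplifications of broCMI:

```python
import numpy as np

def joint_entropy(*cols, bins=8):
    """Plug-in joint entropy (bits) of equal-width-discretized columns."""
    d = np.stack([np.digitize(c, np.histogram_bin_edges(c, bins)[1:-1])
                  for c in cols])
    _, counts = np.unique(d, axis=1, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def cmi(x, y, z, bins=8):
    """Conditional mutual information via the entropy identity
    I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)."""
    return (joint_entropy(x, z, bins=bins) + joint_entropy(y, z, bins=bins)
            - joint_entropy(z, bins=bins) - joint_entropy(x, y, z, bins=bins))

def bootstrap_rank_cmi(candidates, y, z, n_boot=50, seed=0):
    """Average each candidate input's CMI rank over bootstrap resamples;
    a lower mean rank indicates a more consistently informative input."""
    rng = np.random.default_rng(seed)
    names = list(candidates)
    mean_rank = dict.fromkeys(names, 0.0)
    n = len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)            # resample with replacement
        scores = {k: cmi(candidates[k][idx], y[idx], z[idx]) for k in names}
        for r, k in enumerate(sorted(names, key=scores.get, reverse=True)):
            mean_rank[k] += r / n_boot
    return sorted(mean_rank.items(), key=lambda kv: kv[1])
```

    The bootstrap layer is what distinguishes this from a single CMI pass: inputs that rank highly only by chance in one sample tend to fall back in the aggregate ordering.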

  16. Patent information analysis methods and their effective use : A study through activities of PAT-LIST Research Workshop adviser

    NASA Astrophysics Data System (ADS)

    Nakamura, Sakae

    For the effective use of technical information, various analytical tools and methods (e.g., patent map analysis) have been proposed. It was against this background that the “PAT-LIST Research Workshop” (supported by Raytec Co., Ltd.) was established in 2006. This article discusses, as an example, an actual research subject that the author, as an adviser to the workshop, has studied through its activities over the past six years, focusing on the subject for 2010 (unveiling the intellectual property strategies of specific enterprises from the results of technical information analysis). Practically useful analysis methods are proposed, with notes on points to consider when applying them. Also introduced are macroanalysis using text-mining tools and the significance of controlled technical classification in a problem/solution map for identifying critical fields.

  17. Testing for measurement invariance and latent mean differences across methods: interesting incremental information from multitrait-multimethod studies

    PubMed Central

    Geiser, Christian; Burns, G. Leonard; Servera, Mateu

    2014-01-01

    Models of confirmatory factor analysis (CFA) are frequently applied to examine the convergent validity of scores obtained from multiple raters or methods in so-called multitrait-multimethod (MTMM) investigations. We show that interesting incremental information about method effects can be gained from including mean structures and tests of MI across methods in MTMM models. We present a modeling framework for testing MI in the first step of a CFA-MTMM analysis. We also discuss the relevance of MI in the context of four more complex CFA-MTMM models with method factors. We focus on three recently developed multiple-indicator CFA-MTMM models for structurally different methods [the correlated traits-correlated (methods – 1), latent difference, and latent means models; Geiser et al., 2014a; Pohl and Steyer, 2010; Pohl et al., 2008] and one model for interchangeable methods (Eid et al., 2008). We demonstrate that some of these models require or imply MI by definition for a proper interpretation of trait or method factors, whereas others do not, and explain why MI may or may not be required in each model. We show that in the model for interchangeable methods, testing for MI is critical for determining whether methods can truly be seen as interchangeable. We illustrate the theoretical issues in an empirical application to an MTMM study of attention deficit and hyperactivity disorder (ADHD) with mother, father, and teacher ratings as methods. PMID:25400603

  18. Development of a Theory-Based Intervention to Increase Prescription of Inspiratory Muscle Training by Health Professionals in the Management of People with Chronic Obstructive Pulmonary Disease

    PubMed Central

    Li, Linda C.; Reid, W. Darlene

    2011-01-01

    ABSTRACT Purpose: The purpose of this paper is twofold: (1) to provide an overview of the literature on barriers to evidence-based practice (EBP) and the effectiveness of implementation interventions in health care; and (2) to outline the development of an implementation intervention for improving the prescription of inspiratory muscle training (IMT) by physical therapists and other health professionals for people with chronic obstructive pulmonary disease (COPD). Summary of Key Points: Individuals, organizations, and the research itself present barriers to EBP in physical therapy. Despite the evidence supporting the use of IMT, this treatment continues to be under-used in managing COPD. Current health services research shows that traditional information-based approaches to implementation, such as didactic lectures, do not adequately address the challenges health professionals face when trying to make changes in practice. We propose the development of a theory-based intervention to improve health professionals' use of IMT in the management of COPD. It is postulated that a behavioural intervention, based on the theory of planned behaviour (TPB), may be more effective than an information-based strategy in increasing the prescription of IMT by health professionals. Conclusion: TPB may be used to understand the antecedents of health professionals' behaviour and to guide the development of implementation interventions. Further research is needed to evaluate the effectiveness of this proposed intervention in the management of people with COPD. PMID:22654237

  19. An Information Retrieval Model Based on Vector Space Method by Supervised Learning.

    ERIC Educational Resources Information Center

    Tai, Xiaoying; Ren, Fuji; Kita, Kenji

    2002-01-01

    Proposes a method to improve retrieval performance of the vector space model by using users' relevance feedback. Discusses the use of singular value decomposition and the latent semantic indexing model, and reports the results of two experiments that show the effectiveness of the proposed method. (Author/LRW)
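As a rough illustration of the latent semantic indexing idea mentioned in this abstract (not code from the paper), the sketch below builds a toy term-document matrix, truncates its singular value decomposition, and compares documents in the reduced space. The matrix values are invented.

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# (Counts are illustrative; a real system would use weighted frequencies.)
A = np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 0, 1],
    [0, 0, 1, 2],
], dtype=float)

# Truncated SVD: keep the k largest singular values to obtain the
# latent semantic space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Documents are compared in the reduced k-dimensional space.
doc_vecs = (np.diag(s[:k]) @ Vt[:k, :]).T  # one k-dim vector per document
sim_01 = cosine(doc_vecs[0], doc_vecs[1])
```

Relevance feedback, as discussed in the article, would then move query vectors toward documents the user marks as relevant; the choice of k controls how aggressively noise and synonymy are smoothed out.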

  20. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  1. Longitudinal Methods as Tools for Evaluating Vocational Education. Information Series No. 155.

    ERIC Educational Resources Information Center

    Pucel, David J.

One of a series of sixteen knowledge transformation papers, this paper examines the advantages and disadvantages of longitudinal studies as a method for evaluating vocational programs. First, longitudinal methods are defined, and the differences between them and cross-sectional studies are established. The results of a literature search conducted…

  2. Method and apparatus for optimizing a train trip using signal information

    DOEpatents

    Kumar, Ajith Kuttannair; Daum, Wolfgang; Otsubo, Tom; Hershey, John Erik; Hess, Gerald James

    2014-06-10

    A system is provided for operating a railway network including a first railway vehicle during a trip along track segments. The system includes a first element for determining travel parameters of the first railway vehicle, a second element for determining travel parameters of a second railway vehicle relative to the track segments to be traversed by the first vehicle during the trip, a processor for receiving information from the first and the second elements and for determining a relationship between occupation of a track segment by the second vehicle and later occupation of the same track segment by the first vehicle and an algorithm embodied within the processor having access to the information to create a trip plan that determines a speed trajectory for the first vehicle. The speed trajectory is responsive to the relationship and further in accordance with one or more operational criteria for the first vehicle.

  3. Groundwater Potential Assessment Using Geographic Information Systems and Ahp Method (case Study: Baft City, Kerman, Iran)

    NASA Astrophysics Data System (ADS)

    Zeinolabedini, M.; Esmaeily, A.

    2015-12-01

The purpose of the present study is to use Geographical Information Systems (GIS) to determine the areas of Baft city with the greatest groundwater potential. To achieve this objective, parameters such as precipitation, slope, fault, vegetation, land cover and lithology were used. Because these parameters differ in their relative influence, the Analytic Hierarchy Process (AHP) was used to weight them. After developing informational layers in GIS and weighting each of them, a model was developed, and the final map of groundwater potential was calculated from it. Applying the developed model, four classes of area were distinguished: high, average, and low potential, and no potential. Results of this research indicated that 0.74, 41.23 and 45.63 percent of the area had high, average and low potential, respectively. Moreover, 12.38% of the area had no potential. These results can be useful in management plans for groundwater resources and in preventing excessive exploitation.
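The AHP weighting step described above can be sketched as follows. The pairwise comparison matrix, the four criteria it compares, and the judgments are hypothetical, not the study's.

```python
import numpy as np

# Pairwise comparison matrix for four hypothetical criteria
# (e.g. precipitation, slope, lithology, vegetation). Entry [i, j] states how
# much more important criterion i is than criterion j on Saaty's 1-9 scale.
P = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

# Priority weights: principal eigenvector of P, normalised to sum to 1.
vals, vecs = np.linalg.eig(P)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w = w / w.sum()

# Consistency ratio (CR): Saaty recommends CR < 0.1 before using the weights.
n = P.shape[0]
ci = (vals.real[i] - n) / (n - 1)   # consistency index
cr = ci / 0.90                      # 0.90 is Saaty's random index for n = 4
```

The resulting weights would then multiply the corresponding GIS layers before they are summed into the potential map.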

  4. Method for the evaluation of structure-activity relationship information associated with coordinated activity cliffs.

    PubMed

    Dimova, Dilyana; Stumpfe, Dagmar; Bajorath, Jürgen

    2014-08-14

    Activity cliffs are generally defined as pairs of active compounds having a large difference in potency. Although this definition of activity cliffs focuses on compound pairs, the vast majority of cliffs are formed in a coordinated manner. This means that multiple highly and weakly potent compounds form series of activity cliffs, which often overlap. In activity cliff networks, coordinated cliffs emerge as disjoint activity cliff clusters. Recently, we have identified all cliff clusters from current bioactive compounds and analyzed their topologies. For structure-activity relationship (SAR) analysis, activity cliff clusters are of high interest, since they contain more SAR information than cliffs that are individually considered. For medicinal chemistry applications, a key question becomes how to best extract SAR information from activity cliff clusters. This represents a challenging problem, given the complexity of many activity cliff configurations. Herein we introduce a generally applicable methodology to organize activity cliff clusters on the basis of structural relationships, prioritize clusters, and systematically extract SAR information from them. PMID:25014781

  5. A mutual-information-based mining method for marine abnormal association rules

    NASA Astrophysics Data System (ADS)

    Cunjin, Xue; Wanjiao, Song; Lijuan, Qin; Qing, Dong; Xiaoyang, Wen

    2015-03-01

Long time series of remote sensing images are a key source of data for exploring large-scale marine abnormal association patterns, but pose significant challenges for traditional approaches to spatiotemporal analysis. This paper proposes a mutual-information-based quantitative association rule-mining algorithm (MIQarma) to address these challenges. MIQarma comprises three key steps. First, MIQarma calculates the asymmetrical mutual information between items with one scan of the database, and extracts pair-wise related items according to the user-specified information threshold. Second, a linking-pruning-generating recursive loop generates (k+1)-dimensional candidate association rules from k-dimensional rules on the basis of the user-specified minimum support threshold; this step is repeated until no more candidate association rules are generated. Finally, strong association rules are generated according to the user-specified minimum evaluation indicators. To demonstrate the feasibility and efficiency of MIQarma, we present two case studies: one considers performance analysis and the other identifies marine abnormal association relationships.
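The pair-wise first step of such mining can be illustrated with a small sketch. This computes a standard symmetric presence/absence mutual information over an invented transaction set; it is not MIQarma's asymmetrical variant, and the item names are hypothetical.

```python
from math import log2

# Toy transaction database: each transaction is the set of "items"
# (e.g. discretised marine-environment anomalies) observed together.
transactions = [
    {"sst_high", "chl_low"},
    {"sst_high", "chl_low", "wind_high"},
    {"sst_high", "chl_low"},
    {"wind_high"},
    {"sst_high"},
    {"chl_low", "wind_high"},
]

def mutual_information(a, b, db):
    """MI between the presence/absence indicators of items a and b (bits)."""
    n = len(db)
    mi = 0.0
    for xa in (True, False):
        for xb in (True, False):
            pxy = sum((a in t) == xa and (b in t) == xb for t in db) / n
            px = sum((a in t) == xa for t in db) / n
            py = sum((b in t) == xb for t in db) / n
            if pxy > 0:
                mi += pxy * log2(pxy / (px * py))
    return mi

# Items whose MI exceeds a user-specified threshold become candidate pairs.
mi = mutual_information("sst_high", "chl_low", transactions)
```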

  6. Method for combining information from white matter fiber tracking and gray matter parcellation.

    PubMed

Park, Hae-Jeong; Kubicki, Marek; Westin, Carl-Fredrik; Talos, Ion-Florin; Brun, Anders; Peiper, Steve; Kikinis, Ron; Jolesz, Ferenc A; McCarley, Robert W; Shenton, Martha E

    2004-09-01

    We introduce a method for combining fiber tracking from diffusion-tensor (DT) imaging with cortical gray matter parcellation from structural high-spatial-resolution 3D spoiled gradient-recalled acquisition in the steady state images. We applied this method to a tumor case to determine the impact of the tumor on white matter architecture. We conclude that this new method for combining structural and DT imaging data is useful for understanding cortical connectivity and the localization of fiber tracts and their relationship with cortical anatomy and brain abnormalities.

  7. Evaluation of non-animal methods for assessing skin sensitisation hazard: A Bayesian Value-of-Information analysis.

    PubMed

    Leontaridou, Maria; Gabbert, Silke; Van Ierland, Ekko C; Worth, Andrew P; Landsiedel, Robert

    2016-07-01

This paper offers a Bayesian Value-of-Information (VOI) analysis for guiding the development of non-animal testing strategies, balancing information gains from testing with the expected social gains and costs from the adoption of regulatory decisions. Testing is assumed to have value if, and only if, the information revealed from testing triggers a welfare-improving decision on the use (or non-use) of a substance. As an illustration, our VOI model is applied to a set of five individual non-animal prediction methods used for skin sensitisation hazard assessment, seven battery combinations of these methods, and 236 sequential 2-test and 3-test strategies. Their expected values are quantified and compared to the expected value of the local lymph node assay (LLNA) as the animal method. We find that battery and sequential combinations of non-animal prediction methods reveal a significantly higher expected value than the LLNA. This holds for the entire range of prior beliefs. Furthermore, our results illustrate that the testing strategy with the highest expected value does not necessarily have to follow the order of key events in the sensitisation adverse outcome pathway (AOP). PMID:27494625
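The core VOI logic, that a test has value only if its result can change the use/non-use decision, can be sketched with a single hypothetical test. All losses, priors, and test characteristics below are invented for illustration; none come from the paper.

```python
# Prior belief and (illustrative) social losses of the two wrong decisions.
p_sens = 0.3          # prior probability the substance is a sensitiser
loss_use_bad = 100.0  # loss from using a sensitiser
loss_ban_good = 20.0  # loss from banning a safe substance
sensitivity, specificity = 0.85, 0.80  # assumed test characteristics

def expected_loss(p):
    """Loss of the best decision under belief p, without further testing."""
    return min(p * loss_use_bad,         # decide "use"
               (1 - p) * loss_ban_good)  # decide "ban"

# Posterior beliefs after a positive / negative test result (Bayes' rule).
p_pos = p_sens * sensitivity + (1 - p_sens) * (1 - specificity)
post_pos = p_sens * sensitivity / p_pos
post_neg = p_sens * (1 - sensitivity) / (1 - p_pos)

# Expected loss with testing = result-weighted loss of the posterior decisions;
# the test's value is the reduction in expected loss it buys.
loss_with_test = (p_pos * expected_loss(post_pos)
                  + (1 - p_pos) * expected_loss(post_neg))
voi = expected_loss(p_sens) - loss_with_test
```

Batteries and sequential strategies extend this by chaining posteriors through several tests before the decision is made.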

  9. A sub-domain based regularization method with prior information for human thorax imaging using electrical impedance tomography

    NASA Astrophysics Data System (ADS)

    In Kang, Suk; Khambampati, Anil Kumar; Jeon, Min Ho; Kim, Bong Seok; Kim, Kyung Youn

    2016-02-01

Electrical impedance tomography (EIT) is a non-invasive imaging technique that can be used as a bed-side monitoring tool for human thorax imaging. EIT has high temporal resolution but suffers from poor spatial resolution due to the ill-posedness of the inverse problem. Regularization methods are often used as a penalty term in the cost function to stabilize sudden changes in resistivity. In human thorax monitoring, conventional Tikhonov-type regularization smooths the reconstructed image between the heart and the lungs, making it difficult to distinguish their exact boundaries. Structural information about the object, obtained in advance, can sometimes be incorporated into the regularization method to improve the spatial resolution and help create clear, distinct boundaries between objects. However, the boundary of the heart changes rapidly over the cardiac cycle, so no exact prior information on the heart boundary is available. Therefore, to improve the spatial resolution of human thorax monitoring during the cardiac cycle, this paper proposes a sub-domain based regularization method that assumes the lungs and part of the background region are known. In the proposed method, the regularization matrix is modified anisotropically to include sub-domains as prior information, and the regularization parameter is assigned a different weight in each sub-domain. Numerical simulations and phantom experiments for 2D human thorax monitoring are performed to evaluate the performance of the proposed regularization method. The results show better reconstruction performance with the proposed regularization method.
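A toy linear version of sub-domain regularization can be sketched as follows. This is a generic Tikhonov least-squares problem with a different regularization weight per region, not the paper's EIT formulation; the problem size, sensitivity matrix, and weights are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: J maps resistivity changes x to measurements b.
n = 40                        # number of pixels (flattened mesh)
J = rng.normal(size=(30, n))  # under-determined, ill-posed sensitivity matrix
x_true = np.zeros(n)
x_true[10:20] = 1.0           # "heart" sub-domain with changed resistivity
b = J @ x_true + 0.01 * rng.normal(size=30)

# Sub-domain weighted Tikhonov: a small weight where the anatomy is uncertain
# (the freely varying sub-domain), a large weight where prior structure is
# trusted (lungs / background).
lam = np.full(n, 10.0)        # strongly regularised background
lam[10:20] = 0.1              # weakly regularised sub-domain
L = np.diag(lam)              # anisotropic (diagonal) regularisation matrix

# Closed-form solution of  min ||J x - b||^2 + ||L x||^2
x_hat = np.linalg.solve(J.T @ J + L.T @ L, J.T @ b)
```

Letting the weight vary by region keeps the background stable while allowing the uncertain region to follow the data.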

  10. Precision calibration method for binocular vision measurement systems based on arbitrary translations and 3D-connection information

    NASA Astrophysics Data System (ADS)

    Yang, Jinghao; Jia, Zhenyuan; Liu, Wei; Fan, Chaonan; Xu, Pengtao; Wang, Fuji; Liu, Yang

    2016-10-01

    Binocular vision systems play an important role in computer vision, and high-precision system calibration is a necessary and indispensable process. In this paper, an improved calibration method for binocular stereo vision measurement systems based on arbitrary translations and 3D-connection information is proposed. First, a new method for calibrating the intrinsic parameters of binocular vision system based on two translations with an arbitrary angle difference is presented, which reduces the effect of the deviation of the motion actuator on calibration accuracy. This method is simpler and more accurate than existing active-vision calibration methods and can provide a better initial value for the determination of extrinsic parameters. Second, a 3D-connection calibration and optimization method is developed that links the information of the calibration target in different positions, further improving the accuracy of the system calibration. Calibration experiments show that the calibration error can be reduced to 0.09%, outperforming traditional methods for the experiments of this study.

  11. Mending Metacognitive Illusions: A Comparison of Mnemonic-Based and Theory-Based Procedures

    ERIC Educational Resources Information Center

    Koriat, Asher; Bjork, Robert A.

    2006-01-01

    Previous research indicated that learners experience an illusion of competence during learning (termed foresight bias) because judgments of learning (JOLs) are made in the presence of information that will be absent at test. The authors examined the following 2 procedures for alleviating foresight bias: enhancing learners' sensitivity to…

  12. New Term Weighting Formulas for the Vector Space Method in Information Retrieval

    SciTech Connect

    Chisholm, E.; Kolda, T.G.

    1999-03-01

The goal in information retrieval is to enable users to automatically and accurately find data relevant to their queries. One possible approach to this problem is to use the vector space model, which models documents and queries as vectors in the term space. The components of the vectors are determined by the term weighting scheme, a function of the frequencies of the terms in the document or query as well as throughout the collection. We discuss popular term weighting schemes and present several new schemes that offer improved performance.
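For example, one widely used member of this family of schemes, log-tf × idf, can be computed as follows. This is a generic illustration over an invented three-document corpus, not one of the report's new schemes.

```python
import math
from collections import Counter

# Small corpus; each document is a bag of terms.
docs = [
    "information retrieval vector space model".split(),
    "term weighting improves retrieval".split(),
    "vector space term weighting".split(),
]

N = len(docs)
df = Counter(t for d in docs for t in set(d))  # document frequency per term

def weight(term, doc):
    """Log-tf x idf: damped within-document frequency times collection rarity."""
    tf = doc.count(term)
    if tf == 0:
        return 0.0
    return (1 + math.log(tf)) * math.log(N / df[term])

# A document vector over the collection vocabulary.
vocab = sorted(df)
vec0 = [weight(t, docs[0]) for t in vocab]
```

Rare terms get larger idf factors, so they dominate the similarity between query and document vectors.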

  13. [Exchange of medical imaging and data information in radiotherapy: needs, methods and current limits].

    PubMed

    Manens, J P

    1997-01-01

Extension of the image network within radiotherapy departments provides the technical infrastructure made necessary by the rapid evolution of diagnostic and treatment techniques in radiotherapy. The system is aimed at managing the whole set of data (textual data and images) needed for the planning and control of treatments. The radiotherapy network addresses two objectives: managing the information necessary for treatment planning (target volume definition, planning dosimetry) and controlling all parameters involved during the patient's treatment at the treatment unit. The major challenge is to improve the quality of treatment. Multimodal imaging is a major advance, as it allows the use of new dosimetry and simulation techniques. The need for standards to exchange medical imaging information is now recognized by all the institutions and a majority of users and manufacturers, and it is widely accepted that the lack of a standard has been one of the fundamental obstacles to the deployment of operational "Picture Archiving and Communication Systems". The International Standards Organisation Open System Interconnection model is the standard reference model used to describe network protocols. The network is based on Ethernet and the TCP/IP protocol, which provide the means to interconnect imaging devices, workstations dedicated to specific image processing, and machines used in radiotherapy. The network uses Ethernet cabled on twisted pair (10BaseT) or optical fibres in a star-shaped physical layout. DICOM V3.0 supports the fundamental network interactions: transfer of images (computerized tomography, magnetic resonance imaging), query and retrieval of images, printing on network-attached cameras, support of HIS/RIS-related interfacing, and image management. The supplement to the DICOM standard, DICOM RT, specifies five data objects, known in DICOM as Information Object Definitions, relevant to radiotherapy. DICOM RT objects can provide a means for

  15. Empirical studies on informal patient payments for health care services: a systematic and critical review of research methods and instruments

    PubMed Central

    2010-01-01

Background Empirical evidence demonstrates that informal patient payments are an important feature of many health care systems. However, the study of these payments is a challenging task because of their potentially illegal and sensitive nature. The aim of this paper is to provide a systematic review and analysis of key methodological difficulties in measuring informal patient payments. Methods The systematic review was based on the following eligibility criteria: English-language publications that reported on empirical studies measuring informal patient payments. There were no limitations with regard to the year of publication. The content of the publications was analysed qualitatively and the results were organised in the form of tables. Data sources were Econlit, Econpapers, Medline, PubMed, ScienceDirect, SocINDEX. Results Informal payments for health care services are most often investigated in studies involving patients or the general public, but providers and officials are also sample units in some studies. The majority of the studies apply a single mode of data collection that involves either face-to-face interviews or group discussions. One of the main methodological difficulties reported in the publications concerns the inability of some respondents to distinguish between official and unofficial payments. Another complication is associated with the refusal of some respondents to answer questions on informal patient payments. We do not exclude the possibility that we have missed studies reported in non-English-language journals as well as very recent studies that are not yet published. Conclusions Given the recent evidence from research on survey methods, a self-administered questionnaire during a face-to-face interview could be a suitable mode of collecting sensitive data, such as data on informal patient payments. PMID:20849658

  16. Designing Health Websites Based on Users’ Web-Based Information-Seeking Behaviors: A Mixed-Method Observational Study

    PubMed Central

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon

    2016-01-01

    Background Laypeople increasingly use the Internet as a source of health information, but finding and discovering the right information remains problematic. These issues are partially due to the mismatch between the design of consumer health websites and the needs of health information seekers, particularly the lack of support for “exploring” health information. Objective The aim of this research was to create a design for consumer health websites by supporting different health information–seeking behaviors. We created a website called Better Health Explorer with the new design. Through the evaluation of this new design, we derive design implications for future implementations. Methods Better Health Explorer was designed using a user-centered approach. The design was implemented and assessed through a laboratory-based observational study. Participants tried to use Better Health Explorer and another live health website. Both websites contained the same content. A mixed-method approach was adopted to analyze multiple types of data collected in the experiment, including screen recordings, activity logs, Web browsing histories, and audiotaped interviews. Results Overall, 31 participants took part in the observational study. Our new design showed a positive result for improving the experience of health information seeking, by providing a wide range of information and an engaging environment. The results showed better knowledge acquisition, a higher number of page reads, and more query reformulations in both focused and exploratory search tasks. In addition, participants spent more time to discover health information with our design in exploratory search tasks, indicating higher engagement with the website. Finally, we identify 4 design considerations for designing consumer health websites and health information–seeking apps: (1) providing a dynamic information scope; (2) supporting serendipity; (3) considering trust implications; and (4) enhancing interactivity

  17. A simple method for estimating basin-scale groundwater discharge by vegetation in the basin and range province of Arizona using remote sensing information and geographic information systems

    USGS Publications Warehouse

    Tillman, F.D.; Callegary, J.B.; Nagler, P.L.; Glenn, E.P.

    2012-01-01

    Groundwater is a vital water resource in the arid to semi-arid southwestern United States. Accurate accounting of inflows to and outflows from the groundwater system is necessary to effectively manage this shared resource, including the important outflow component of groundwater discharge by vegetation. A simple method for estimating basin-scale groundwater discharge by vegetation is presented that uses remote sensing data from satellites, geographic information systems (GIS) land cover and stream location information, and a regression equation developed within the Southern Arizona study area relating the Enhanced Vegetation Index from the MODIS sensors on the Terra satellite to measured evapotranspiration. Results computed for 16-day composited satellite passes over the study area during the 2000 through 2007 time period demonstrate a sinusoidal pattern of annual groundwater discharge by vegetation with median values ranging from around 0.3 mm per day in the cooler winter months to around 1.5 mm per day during summer. Maximum estimated annual volume of groundwater discharge by vegetation was between 1.4 and 1.9 billion m3 per year with an annual average of 1.6 billion m3. A simplified accounting of the contribution of precipitation to vegetation greenness was developed whereby monthly precipitation data were subtracted from computed vegetation discharge values, resulting in estimates of minimum groundwater discharge by vegetation. Basin-scale estimates of minimum and maximum groundwater discharge by vegetation produced by this simple method are useful bounding values for groundwater budgets and groundwater flow models, and the method may be applicable to other areas with similar vegetation types.
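The bounding arithmetic described above can be sketched in a few lines: the maximum estimate treats all vegetation evapotranspiration (ET) as groundwater discharge, and the minimum credits coincident precipitation first. The 16-day values below are invented, not the study's data.

```python
# Per-16-day-composite values in mm, illustrative only.
et_veg = [8.0, 12.0, 20.0, 24.0, 18.0, 10.0]  # vegetation ET from the EVI regression
precip = [15.0, 5.0, 2.0, 0.0, 30.0, 6.0]     # coincident precipitation

# Upper bound: assume all vegetation ET is supplied by groundwater.
max_gw = sum(et_veg)

# Lower bound: precipitation satisfies vegetation demand first; only the
# remainder is attributed to groundwater (never negative).
min_gw = sum(max(et - p, 0.0) for et, p in zip(et_veg, precip))
```

Multiplying such per-area rates by vegetated area and summing over the year yields basin-scale volumes like those reported in the abstract.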

  18. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
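The two resolution-reduction strategies the abstract mentions, a variable number of bits per value and a different sub-set of data elements, can be sketched generically. This is an illustrative toy, not the patented implementation, and the array contents are invented.

```python
import numpy as np

# "File" contents: a full-resolution 1-D array of simulation values.
full = np.linspace(0.0, 1.0, 1024, dtype=np.float64)

# Replicas at lower resolution, stored alongside the original:
replicas = {
    "half_elements": full[::2],                # keep every second data element
    "low_precision": full.astype(np.float16),  # fewer bits per element
}

# A reader can pick the cheapest replica whose accuracy suffices; here we
# check the worst-case error introduced by the low-precision replica.
approx_err = float(np.abs(full - replicas["low_precision"].astype(np.float64)).max())
```

Semantic information about the file (e.g. which variables tolerate lossy storage) would drive which replicas are generated and where they are placed.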

  19. Complementary medicine use among cancer patients receiving radiotherapy and chemotherapy: methods, sources of information and the need for counselling.

    PubMed

    Pihlak, R; Liivand, R; Trelin, O; Neissar, H; Peterson, I; Kivistik, S; Lilo, K; Jaal, J

    2014-03-01

Complementary medicine (CM) use is common among cancer patients. However, little is known about CM products that are utilised during radiotherapy and/or chemotherapy. Out of 62 cancer patients who completed a specialised survey, 35 (56%) consumed some type of CM during active anti-cancer therapy. Cancer patients reported the use of herbal teas (52%), vitamins and other dietary supplements (45%), vegetables and juices (39%), special diets (19%), herbal medicines, including Chinese medicines (19%), and 'immunomodulators' (3%). Most patients (86%) consumed CM products every day. However, nearly 47% of CM users did not admit this to their oncologists. The majority of CM users (85%) were convinced that supplementary products increase the efficacy of standard anti-cancer therapy and prolong their survival. Information about CM was mainly obtained through internet sources (36%) and books and brochures (25%). Although most CM users (82%) trusted the received information, 73% of them admitted that additional information about CM methods would be necessary. Patients would like to receive additional information through a specialised consultation (60%), but also from brochures (44%) and the internet (20%). Adequate counselling of patients is of paramount importance, since some CM methods may cause significant side effects and decrease the efficacy of radiotherapy and/or chemotherapy.

  20. Assessment of occupational exposure to uranium by indirect methods needs information on natural background variations.

    PubMed

    Muikku, M; Heikkinen, T; Puhakainen, M; Rahola, T; Salonen, L

    2007-01-01

Urine monitoring is the preferred method to determine exposure to soluble compounds of uranium in workplaces. The interpretation of uranium contents in workers' bioassay samples requires knowledge of uranium excretion and its dependence on dietary intake. Exceptionally high concentrations of natural uranium in private drinking water sources have been measured in the granite areas of Southern Finland. Consequently, high concentrations of natural uranium have been observed in the urine and hair samples of people using water from their own drilled wells. The natural uranium content in urine and hair samples of family members who use uranium-rich household water has been analyzed using ICP-MS. The uranium concentrations in both the urine and hair samples of the study subjects were significantly higher than the worldwide average values. In addition, gamma-spectrometric methods have been tested for determining uranium in hair samples; this method can be used only for samples with highly elevated uranium concentrations.

  1. Methods for semi-automated indexing for high precision information retrieval

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65%, with means of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in the three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  2. Methods for Semi-automated Indexing for High Precision Information Retrieval

    PubMed Central

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    Objective. To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Design. Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. Participants. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Measurements. Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. Results. Compared with manual methods, ISAID greatly decreased indexing times. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Summary. Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy. PMID:12386114
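
    The inter-indexer consistency reported above can be illustrated with a pairwise-overlap (Jaccard) calculation over each indexer's set of index term tuples. The paper does not give its exact formula, so this sketch, and the example term tuples, are assumptions:

```python
from itertools import combinations

def inter_indexer_consistency(term_sets):
    """Mean pairwise Jaccard overlap between indexers' term-tuple sets."""
    pairs = list(combinations(term_sets, 2))
    if not pairs:
        return 0.0
    scores = [len(a & b) / len(a | b) for a, b in pairs]
    return sum(scores) / len(scores)

# Three hypothetical indexers' term tuples for one document
a = {("asthma", "treatment"), ("asthma", "diagnosis"), ("copd", "treatment")}
b = {("asthma", "treatment"), ("copd", "treatment")}
c = {("asthma", "treatment"), ("asthma", "prognosis")}

print(round(inter_indexer_consistency([a, b, c]), 2))  # → 0.42
```

    A value in this range is comparable to the 31-41% mean consistencies the study reports, though the study's own metric may differ.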

  3. Method and apparatus for optimizing a train trip using signal information

    DOEpatents

    Kumar, Ajith Kuttannair; Daum, Wolfgang; Otsubo, Tom; Hershey, John Erik; Hess, Gerald James

    2013-02-05

    One embodiment of the invention includes a system for operating a railway network comprising a first railway vehicle (400) during a trip along track segments (401/412/420). The system comprises a first element (65) for determining travel parameters of the first railway vehicle (400), a second element (65) for determining travel parameters of a second railway vehicle (418) relative to the track segments to be traversed by the first vehicle during the trip, a processor (62) for receiving information from the first (65) and the second (65) elements and for determining a relationship between occupation of a track segment (401/412/420) by the second vehicle (418) and later occupation of the same track segment by the first vehicle (400) and an algorithm embodied within the processor (62) having access to the information to create a trip plan that determines a speed trajectory for the first vehicle (400), wherein the speed trajectory is responsive to the relationship and further in accordance with one or more operational criteria for the first vehicle (400).
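
    The occupancy relationship the patent describes, that the first vehicle's trip plan must respect when the second vehicle clears each shared track segment, can be sketched as a toy schedule. This is not the patented algorithm (which produces a full speed trajectory against operational criteria); segment lengths, clearance times, and the speed limit are hypothetical:

```python
def trip_plan(segments, clear_times, v_max):
    """Earliest conflict-free schedule: the first train enters each
    segment only after the second train has cleared it (simplified)."""
    t, log = 0.0, []
    for length, clear in zip(segments, clear_times):
        entry = max(t, clear)            # wait (or pace) until the segment is free
        exit_ = entry + length / v_max   # traverse at the speed limit
        log.append((entry, exit_))
        t = exit_
    return log

# Two 10 km segments at 100 km/h; the leading train clears them at t = 0.0 h and 0.25 h
print(trip_plan([10.0, 10.0], [0.0, 0.25], v_max=100.0))  # (entry, exit) times in hours
```

    A real trip optimizer would slow the first train over the first segment rather than idle at the boundary, which is the point of planning a speed trajectory in advance.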

  4. The development of systematic quality control method using laboratory information system and unity program.

    PubMed

    Min, Won-Ki; Lee, Woochang; Park, Hyosoon

    2002-01-01

    Quality control (QC) processes are performed to detect and correct errors in the laboratory; systematic errors in particular recur and affect all subsequent laboratory processes, making it necessary for every laboratory to detect and correct them effectively and efficiently. We developed an on-line quality assurance system for the detection and correction of systematic error, and linked it to Unity Plus/Pro (Bio-Rad Laboratories, Irvine, USA), a commercially available quality management system. The laboratory information system, based on the client-server paradigm, was developed using an NCR3600 (NCR, West Columbia, USA) as the server; the server database was Oracle 7.2 (Oracle, Belmont, USA) and the development tool was Powerbuilder (Powersoft, Burlington, UK). Each QC material is registered, receives its own identification number, and is tested in the same way as a patient sample. The resulting QC data are entered into the Unity Plus/Pro program by an in-house data-entry program or by manual input. With the implementation of the in-house laboratory information system (LIS) and its link to Unity Plus/Pro, we could apply Westgard's multi-rules for a higher error detection rate, resulting in more systematic and precise quality assurance of laboratory products, complementary to conventional external quality assessment. PMID:12755272
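
    Two of the Westgard multi-rules mentioned above can be sketched directly on control z-scores. This is a minimal illustration, not the authors' LIS implementation; the control values, target mean, and SD are hypothetical:

```python
def westgard_flags(values, mean, sd):
    """Flag QC runs with two common Westgard rules:
    1-3s: one control exceeds mean +/- 3 SD (rejection)
    2-2s: two consecutive controls exceed the same mean +/- 2 SD limit
    """
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s"))
        if i > 0 and z[i - 1] > 2 and zi > 2:
            flags.append((i, "2-2s"))
        if i > 0 and z[i - 1] < -2 and zi < -2:
            flags.append((i, "2-2s"))
    return flags

# Hypothetical glucose control values (mg/dL), target 100 +/- 5
print(westgard_flags([101, 112, 111, 99, 117], mean=100, sd=5))
# → [(2, '2-2s'), (4, '1-3s')]
```

    The 2-2s hit signals a systematic shift (two consecutive values beyond +2 SD), which is exactly the error class the on-line system was built to catch.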

  6. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in the simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks, as well as transitions in system functionality due to modifications in system architecture, system environment, and system components. A dynamic core model is developed; the term dynamic core denotes a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism for analyzing the migration of specific functions in biosystems that undergo structural transitions induced by the environment; the term delocalization describes these processes of migration. We constructed a holographic model with self-poietic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for the statistical manifolds which represent biological networks. These constraints can provide insight into the processes of degeneration and recovery that take place in large-scale networks. We suggest that therapies able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. We also mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution: any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  7. Retrieval practice is an efficient method of enhancing the retention of anatomy and physiology information.

    PubMed

    Dobson, John L

    2013-06-01

    Although a great deal of empirical evidence has indicated that retrieval practice is an effective means of promoting learning and memory, very few studies have investigated the strategy in the context of an actual class. The primary purpose of this study was to determine if a series of very brief retrieval quizzes could significantly improve the retention of previously tested information throughout an anatomy and physiology course. A second purpose was to determine if there were any significant differences between expanding and uniform patterns of retrieval that followed a standardized initial retrieval delay. Anatomy and physiology students were assigned to either a control group or groups that were repeatedly prompted to retrieve a subset of previously tested course information via a series of quizzes that were administered on either an expanding or a uniform schedule. Each retrieval group completed a total of 10 retrieval quizzes, and the series of quizzes required (only) a total of 2 h to complete. Final retention of the exam subset material was assessed during the last week of the semester. There were no significant differences between the expanding and uniform retrieval groups, but both retained an average of 41% more of the subset material than did the control group (ANOVA, F = 129.8, P = 0.00, ηp(2) = 0.36). In conclusion, retrieval practice is a highly efficient and effective strategy for enhancing the retention of anatomy and physiology material.
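
    The two retrieval schedules compared above, both starting from a standardized initial delay, can be sketched as follows. The study used 10 quizzes; the gap sizes and growth factor here are illustrative assumptions, not the study's actual schedule:

```python
def uniform_schedule(n_quizzes, gap_days, initial_delay):
    """Quiz days after the initial test: constant spacing."""
    return [initial_delay + gap_days * i for i in range(n_quizzes)]

def expanding_schedule(n_quizzes, first_gap_days, initial_delay, factor=2):
    """Quiz days after the initial test: each gap grows by `factor`."""
    days, day, gap = [], initial_delay, first_gap_days
    for _ in range(n_quizzes):
        days.append(day)
        day += gap
        gap *= factor
    return days

print(uniform_schedule(4, 7, 2))    # → [2, 9, 16, 23]
print(expanding_schedule(4, 3, 2))  # → [2, 5, 11, 23]
```

    Both schedules share the same first quiz day and total span here, isolating the spacing pattern, which is the comparison the study found made no significant difference.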

  8. Extracting land use information from the earth resources technology satellite data by conventional interpretation methods

    NASA Technical Reports Server (NTRS)

    Vegas, P. L.

    1974-01-01

    A procedure for obtaining land use data from satellite imagery by the use of conventional interpretation methods is presented. The satellite is described briefly, and the advantages of various scales and multispectral scanner bands are discussed. Methods for obtaining satellite imagery and the sources of this imagery are given. Equipment used in the study is described, and samples of land use maps derived from satellite imagery are included together with the land use classification system used. Accuracy percentages are cited and are compared to those of a previous experiment using small scale aerial photography.

  9. Finding a needle in a haystack: toward a psychologically informed method for aviation security screening.

    PubMed

    Ormerod, Thomas C; Dando, Coral J

    2015-02-01

    Current aviation security systems identify behavioral indicators of deception to assess risks to flights, but they lack a strong psychological basis or empirical validation. We present a new method that tests the veracity of passenger accounts. In an in vivo double-blind randomized controlled trial conducted in international airports, security agents detected 66% of deceptive passengers using the veracity test method, compared with less than 5% using behavioral indicator recognition. As well as revealing the advantages of veracity testing over behavioral indicator identification, the study provides the highest levels to date of deception detection in a realistic setting where the known base rate of deceptive individuals is low. PMID:25365531

  11. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  12. Methods, systems, and apparatus for storage, transfer and/or control of information via matter wave dynamics

    NASA Technical Reports Server (NTRS)

    Vestergaard Hau, Lene (Inventor)

    2012-01-01

    Methods, systems and apparatus for generating atomic traps, and for storing, controlling and transferring information between first and second spatially separated phase-coherent objects, or using a single phase-coherent object. For plural objects, both phase-coherent objects have a macroscopic occupation of a particular quantum state by identical bosons or identical BCS-paired fermions. The information may be optical information, and the phase-coherent object(s) may be Bose-Einstein condensates, superfluids, or superconductors. The information is stored in the first phase-coherent object at a first storage time and recovered from the second phase-coherent object, or the same first phase-coherent object, at a second revival time. In one example, an integrated silicon wafer-based optical buffer includes an electrolytic atom source to provide the phase-coherent object(s), a nanoscale atomic trap for the phase-coherent object(s), and semiconductor-based optical sources to cool the phase-coherent object(s) and provide coupling fields for storage and transfer of optical information.

  13. Prospect theory based estimation of drivers' risk attitudes in route choice behaviors.

    PubMed

    Zhou, Lizhen; Zhong, Shiquan; Ma, Shoufeng; Jia, Ning

    2014-12-01

    This paper applied prospect theory (PT) to describe drivers' route choice behavior under Variable Message Signs (VMS), which present visual traffic information to assist drivers in making route choice decisions. A rich set of empirical data from questionnaires and field observations was used to estimate the parameters of PT. To make the parameters better reflect drivers' attitudes, drivers were classified into types according to the significant factors influencing their behavior. Based on the travel time distributions of the alternative routes and the route choices reported in the questionnaire, the parameterized value function of each category was estimated, representing drivers' risk attitudes and choice characteristics. The empirical verification showed that the estimates were acceptable and effective, and that drivers' risk attitudes and route choice characteristics can be captured by PT under real-time information shown on VMS. For practical application, once drivers' route choice characteristics and parameters are identified, their route choice behavior under different road conditions can be predicted accurately, which is the basis for formulating and implementing traffic guidance measures for targeted traffic management. Moreover, the heterogeneity of risk attitudes among drivers should be considered when releasing traffic information and regulating traffic flow.
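
    The value function being parameterized here has a standard prospect-theory form: concave for gains, convex for losses, with losses weighted more heavily. The sketch below uses the Tversky-Kahneman (1992) median parameters as placeholders, not the estimates from this paper:

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: v(x) = x^alpha for gains,
    v(x) = -lam * (-x)^beta for losses, with loss aversion lam > 1."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# Losing 10 minutes of travel time hurts more than saving 10 minutes helps
print(round(pt_value(10), 2), round(pt_value(-10), 2))
```

    The asymmetry between the two printed values is what lets a fitted value function capture risk attitudes in route choice: a driver classified with a larger `lam` avoids routes with a chance of large delays even when their expected travel time is lower.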

  14. Game Theory Based Security in Wireless Body Area Network with Stackelberg Security Equilibrium.

    PubMed

    Somasundaram, M; Sivakumar, R

    2015-01-01

    Wireless Body Area Networks (WBANs) are used effectively in healthcare to increase both the quality of the patient's life and the quality of healthcare services. However, the biosensor-based approach in medical care systems makes it difficult to respond to patients within a minimal response time, and because medical care units do not deploy ubiquitous broadband connections full time, the level of security is not always high. Security issues also arise in monitoring records of users' body functions, and most systems on a Wireless Body Area Network are not effective in facing security deployment issues. To access patients' information with higher security on a WBAN, Game Theory with Stackelberg Security Equilibrium (GTSSE) is proposed in this paper. The GTSSE mechanism takes all players into account: patients are monitored by initially placing the power position authority, and the position authority in GTSSE is the organizer, to whose decisions all other players react. Based on the proposed approach, experiments were conducted on factors such as the security ratio based on patients' health information, system flexibility level, energy consumption rate, and information loss rate. Stackelberg Security considerably improves the strength of the solution and yields higher security. PMID:26759829
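
    The leader-follower structure underlying a Stackelberg security equilibrium can be sketched by enumeration over pure strategies: the organizer (leader) commits first, each other player (follower) best-responds, and the leader picks the commitment that maximizes its own payoff. This is a generic textbook sketch, not the GTSSE mechanism itself, and the payoff matrices are hypothetical:

```python
def stackelberg(leader_payoff, follower_payoff):
    """Pure-strategy Stackelberg equilibrium by enumeration.
    Returns (leader strategy, follower best response, leader payoff)."""
    best = None
    for l, row in enumerate(follower_payoff):
        f = max(range(len(row)), key=lambda j: row[j])  # follower's best response
        if best is None or leader_payoff[l][f] > best[2]:
            best = (l, f, leader_payoff[l][f])
    return best

# Hypothetical defender (leader) vs attacker (follower) payoff matrices
leader = [[4, 1], [3, 2]]
follower = [[1, 3], [2, 0]]
print(stackelberg(leader, follower))  # → (1, 0, 3)
```

    Note the leader's equilibrium payoff (3) beats what it would get if both moved simultaneously into the follower's preferred cell of row 0: commitment power is the point of the Stackelberg framing.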

  16. Background information for the Leaching environmental Assessment Framework (LEAF) test methods

    EPA Science Inventory

    The U.S. Environmental Protection Agency Office of Resource Conservation and Recovery has initiated the review and validation process for four leaching tests under consideration for inclusion into SW-846: Method 1313 "Liquid-Solid Partitioning as a Function of Extract pH for Co...

  17. Provision of assistive technology services method (ATSM) according to evidence-based information and knowledge management.

    PubMed

    Elsaesser, Linda-Jeanne; Bauer, Stephen M

    2011-01-01

    PURPOSE. This article develops a standardised method for assistive technology service (ATS) provision and a logical basis for research to improve health care quality. The method is 'interoperable' across disabilities, disciplines, assistive technology devices and ATSs. BACKGROUND. Absence of a standardised and interoperable method for ATS provision results in ineffective communication between providers, manufacturers, researchers, policy-makers and individuals with disabilities (IWD), a fragmented service delivery system, inefficient resource allocation and sub-optimal outcomes. OBJECTIVES. Synthesise a standardised, interoperable AT service method (ATSM) fully consistent with key guidelines, systems, models and Federal legislation. Express the ATSM using common and unambiguous language. RESULTS. Guidelines, systems, models and Federal legislation relevant to ATS provision are reviewed. These include the RESNA Guidelines for Knowledge and Skills for Provision of Assistive Technology Products and Services (RESNA Guidelines), IMPACT2 model, international classification of functioning, disability and health (ICF) and AT device classification (ATDC). Federal legislation includes the Assistive Technology Act of 2004, Americans with Disabilities Act of 2008 and Social Security Act. Based on these findings, the ATSM is synthesised and translated into common and accessible language. CONCLUSION. ATSM usage will improve communication between stakeholders, service delivery coherence, resource allocation and intervention outcomes. PMID:21345000

  19. Mixed Methods Analysis and Information Visualization: Graphical Display for Effective Communication of Research Results

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Dickinson, Wendy B.

    2008-01-01

    In this paper, we introduce various graphical methods that can be used to represent data in mixed research. First, we present a broad taxonomy of visual representation. Next, we use this taxonomy to provide an overview of visual techniques for quantitative data display and qualitative data display. Then, we propose what we call "crossover" visual…

  20. Understanding the Effects of Different Study Methods on Retention of Information and Transfer of Learning

    ERIC Educational Resources Information Center

    Egan, Rylan G.

    2012-01-01

    Introduction: The following study investigates relationships between spaced practice (re-studying after a delay) and transfer of learning; specifically, it examines the impact on learners' ability to transfer learning after participating in spaced model-building or unstructured study of narrated text. Method: Subjects were randomly assigned either to a…

  1. A new method of CCD dark current correction via extracting the dark Information from scientific images

    NASA Astrophysics Data System (ADS)

    Ma, Bin; Shang, Zhaohui; Hu, Yi; Liu, Qiang; Wang, Lifan; Wei, Peng

    2014-07-01

    We have developed a new method to correct dark current at relatively high temperatures for Charge-Coupled Device (CCD) images when dark frames cannot be obtained on the telescope. For images taken with the Antarctic Survey Telescopes (AST3) in 2012, due to the low cooling efficiency, the median CCD temperature was -46°C, resulting in a high dark current level of about 3e-/pix/sec, comparable even to the sky brightness (10e-/pix/sec). If not corrected, the nonuniformity of the dark current could outweigh the photon noise of the sky background. However, dark frames could not be obtained during the observing season because the camera was operated in frame-transfer mode without a shutter, and the telescope was unattended in winter. Here we present an alternative, but simple and effective, method to derive the dark current frame from the scientific images themselves. We can then scale this dark frame to the temperature at which each scientific image was taken, and apply the dark frame correction to the scientific images. We have applied this method to the AST3 data and demonstrated that it reduces the noise to a level roughly as low as the photon noise of the sky brightness, solving the high noise problem and improving the photometric precision. This method will also be helpful for other projects that suffer from similar issues.
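
    The temperature-scaling step of such a correction can be sketched as below. The paper derives both the dark frame and its temperature dependence empirically from the science images; the exponential doubling law and the 6°C doubling interval here are generic CCD assumptions, not AST3's measured values:

```python
import numpy as np

def scale_dark(dark_frame, t_dark, t_sci, doubling_temp=6.0):
    """Scale a dark frame from temperature t_dark to t_sci, assuming
    dark current roughly doubles every `doubling_temp` degrees C."""
    return dark_frame * 2.0 ** ((t_sci - t_dark) / doubling_temp)

dark = np.full((4, 4), 3.0)                  # e-/pix/s at -46 C (illustrative)
print(scale_dark(dark, -46.0, -40.0)[0, 0])  # doubled at -40 C → 6.0
```

    The scaled frame is then subtracted from each science image, so pixels with unusually high dark current are corrected even when the CCD temperature drifts between exposures.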

  2. Death Valley regional groundwater flow model calibration using optimal parameter estimation methods and geoscientific information systems

    USGS Publications Warehouse

    D'Agnese, F. A.; Faunt, C.C.; Hill, M.C.; Turner, A.K.

    1996-01-01

    A three-layer Death Valley regional groundwater flow model was constructed to evaluate potential regional groundwater flow paths in the vicinity of Yucca Mountain, Nevada. Geoscientific information systems were used to characterize the complex surface and subsurface hydrogeological conditions of the area, and this characterization was used to construct likely conceptual models of the flow system. The high contrasts and abrupt contacts of the different hydrogeological units in the subsurface make zonation the logical choice for representing the hydraulic conductivity distribution. Hydraulic head and spring flow data were used to test different conceptual models by using nonlinear regression to determine parameter values that currently provide the best match between the measured and simulated heads and flows.
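
    The calibration idea, adjusting parameter values until simulated heads best match measured ones, can be illustrated with a toy one-zone, one-dimensional Darcy model. The actual study fitted zoned hydraulic conductivities to heads and spring flows by nonlinear regression in a three-layer regional model; every number below is synthetic:

```python
import numpy as np

# Toy calibration: find the hydraulic conductivity K of a single zone so that
# simulated heads best match measured heads (1D Darcy flow, fixed flux q).
def simulate_heads(K, q=2.0, h0=100.0, x=np.array([10.0, 20.0, 30.0])):
    return h0 - q * x / K                     # h(x) = h0 - q*x/K

measured = np.array([96.0, 92.0, 88.0])       # synthetic observations (K_true = 5)
Ks = np.linspace(1, 10, 901)                  # candidate conductivities
sse = [np.sum((simulate_heads(K) - measured) ** 2) for K in Ks]
best_K = Ks[int(np.argmin(sse))]
print(round(float(best_K), 2))                # → 5.0
```

    A grid search stands in here for the nonlinear regression; with many parameters and zones, gradient-based regression is used instead, but the objective, minimizing squared head and flow residuals, is the same.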

  3. Improving Perceived and Actual Text Difficulty for Health Information Consumers using Semi-Automated Methods

    PubMed Central

    Leroy, Gondy; Endicott, James E.; Mouradi, Obay; Kauchak, David; Just, Melissa L.

    2012-01-01

    We are developing algorithms for semi-automated simplification of medical text. Based on lexical and grammatical corpus analysis, we identified a new metric, term familiarity, to help estimate text difficulty. We developed an algorithm that uses term familiarity to identify difficult text and select easier alternatives from lexical resources such as WordNet, UMLS and Wiktionary. Twelve sentences were simplified to measure perceived difficulty using a 5-point Likert scale. Two documents were simplified to measure actual difficulty by posing questions with and without the text present (information understanding and retention). We conducted a user study by inviting participants (N=84) via Amazon Mechanical Turk. There was a significant effect of simplification on perceived difficulty (p<.001). We also saw slightly improved understanding with better question-answering for simplified documents but the effect was not significant (p=.097). Our results show how term familiarity is a valuable component in simplifying text in an efficient and scalable manner. PMID:23304324
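
    The term-familiarity step described above can be sketched as a frequency threshold plus synonym lookup. The `FREQ` and `EASIER` dictionaries below are tiny hypothetical stand-ins for corpus frequencies and for WordNet/UMLS/Wiktionary lookups, and the cutoff is arbitrary:

```python
# Term familiarity approximated by corpus frequency: rare (unfamiliar) terms
# are flagged and replaced with easier alternatives when one is available.
FREQ = {"swelling": 9500, "edema": 120, "high": 20000, "elevated": 800,
        "blood": 15000, "pressure": 12000}
EASIER = {"edema": "swelling", "elevated": "high"}

def simplify(tokens, familiarity_cutoff=1000):
    out = []
    for tok in tokens:
        if FREQ.get(tok, 0) < familiarity_cutoff and tok in EASIER:
            out.append(EASIER[tok])   # difficult term -> familiar synonym
        else:
            out.append(tok)
    return out

print(simplify(["elevated", "blood", "pressure", "edema"]))
# → ['high', 'blood', 'pressure', 'swelling']
```

    Keeping familiar terms untouched is what makes the approach semi-automated and scalable: only the low-familiarity tokens need candidate substitutions (and, in practice, human review).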

  4. An evaluation of information-theoretic methods for detecting structural microbial biosignatures.

    PubMed

    Wagstaff, Kiri L; Corsetti, Frank A

    2010-05-01

    The first observations of extraterrestrial environments will most likely be in the form of digital images. Given an image of a rock that contains layered structures, is it possible to determine whether the layers were created by life (biogenic)? While conclusive judgments about biogenicity are unlikely to be made solely on the basis of image features, an initial assessment of the importance of a given sample can inform decisions about follow-up searches for other types of possible biosignatures (e.g., isotopic or chemical analysis). In this study, we evaluated several quantitative measures that capture the degree of complexity in visible structures, in terms of compressibility (to detect order) and the entropy (spread) of their intensity distributions. Computing complexity inside a sliding analysis window yields a map of each of these features that indicates how they vary spatially across the sample. We conducted experiments on both biogenic and abiogenic terrestrial stromatolites and on laminated structures found on Mars. The degree to which each feature separated biogenic from abiogenic samples (separability) was assessed quantitatively. None of the techniques provided a consistent, statistically significant distinction between all biogenic and abiogenic samples. However, the PNG compression ratio provided the strongest distinction (2.80 in standard deviation units) and could inform future techniques. Increasing the analysis window size or the magnification level, or both, improved the separability of the samples. Finally, data from all four Mars samples plotted well outside the biogenic field suggested by the PNG analyses, although we caution against a direct comparison of terrestrial stromatolites and martian non-stromatolites.
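
    The two feature families evaluated above, entropy of the intensity distribution and compressibility, can be sketched on a 1-D intensity window. The study used PNG compression over a sliding 2-D window; this sketch uses zlib (the DEFLATE codec underlying PNG) on a single window, with made-up intensity data:

```python
import math
import zlib
from collections import Counter

def entropy(window):
    """Shannon entropy (bits) of the intensity distribution."""
    n = len(window)
    return -sum(c / n * math.log2(c / n) for c in Counter(window).values())

def compression_ratio(window):
    """Compressed/raw size: ordered layering compresses well (low ratio)."""
    raw = bytes(window)
    return len(zlib.compress(raw)) / len(raw)

ordered = [10, 200] * 128          # regular "laminae": two alternating intensities
noisy = list(range(256))           # intensities spread over the full range
print(entropy(ordered), entropy(noisy))                       # → 1.0 8.0
print(compression_ratio(ordered) < compression_ratio(noisy))  # → True
```

    Computing both measures inside a sliding analysis window, as the study does, turns them into spatial maps, so ordered (possibly biogenic) laminae stand out as low-ratio, low-entropy regions.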

  5. The Personalized Reminder Information and Social Management System (PRISM) Trial: Rationale, Methods and Baseline Characteristics

    PubMed Central

    Czaja, Sara J.; Boot, Walter R.; Charness, Neil; Rogers, Wendy; Sharit, Joseph; Fisk, Arthur D.; Lee, Chin Chin; Nair, Sankaran N.

    2014-01-01

    Technology holds promise in terms of providing support to older adults. To date there have been limited robust systematic efforts to evaluate the psychosocial benefits of technology for older people and to identify factors that influence both the usability and the uptake of technology systems. In response to these issues we developed the Personal Reminder Information and Social Management System (PRISM), a software application designed for older adults to support social connectivity, memory, knowledge about topics, leisure activities and access to resources. This trial is evaluating the impact of access to the PRISM system on outcomes such as social isolation, social support and connectivity. This paper reports on the approach used to design the PRISM system, the study design and methodology, and baseline data for the trial. The trial is a multi-site randomized field trial, in which PRISM is being compared to a Binder condition where participants received a binder containing content similar to that found on PRISM. The sample includes 300 older adults, aged 65 - 98 years, who lived alone and were at risk of being isolated. The primary outcome measures for the trial include indices of social isolation, support and well-being. Secondary outcome measures include indices of computer proficiency, technology uptake and attitudes towards technology. Follow-up assessments occurred at 6 and 12 months post-randomization. The results of this study will yield important information about the potential value of technology for older adults. The study also demonstrates how a user-centered iterative design approach can be incorporated into the design and evaluation of an intervention protocol. PMID:25460342

  6. Early assessment of medical technologies to inform product development and market access: a review of methods and applications.

    PubMed

    Ijzerman, Maarten J; Steuten, Lotte M G

    2011-09-01

    Worldwide, billions of dollars are invested in medical product development, and there is increasing pressure to maximize the revenues of these investments: governments need to be informed about the benefits of spending public resources, companies need more information to manage their product development portfolios, and even universities may need to direct their research programmes so as to maximize societal benefit. Assuming that all medical products must at some point be adopted by the heavily regulated healthcare market, it is worthwhile to examine the logic behind healthcare decision making, specifically decisions on the coverage of medical products and on the use of these products under competing and uncertain conditions. With the growing tension between leveraging economic growth through R&D spending on the one hand and stricter control of healthcare budgets on the other, several attempts have been made to apply health technology assessment (HTA) methodology to earlier stages of technology development and implementation. For instance, horizon scanning was introduced to systematically assess emerging technologies in order to inform health policy. Others have introduced iterative economic evaluation, e.g. economic evaluations in earlier stages of clinical research. However, most of these methods are primarily intended to support governments in making decisions about potentially expensive new medical products. They do not really inform biomedical product developers about the probability of return on investment, nor about market needs and the specific requirements of technologies in development. It is precisely this aspect that is increasingly receiving attention: can HTA tools and methods be used to inform biomedical product development and to anticipate further development and market access? Several such methods have been used in previous decades but have never been compiled in a comprehensive review.

  7. Methods for providing probe position and temperature information on MR images during interventional procedures.

    PubMed

    Patel, K C; Duerk, J L; Zhang, Q; Chung, Y C; Williams, M; Kaczynski, K; Wendt, M; Lewin, J S

    1998-10-01

    Interventional magnetic resonance imaging (MRI) can be defined as the use of MR images for guiding and monitoring interventional procedures (e.g., biopsy, drainage) or minimally invasive therapy (e.g., thermal ablation). This work describes the development of a prototype graphical user interface and the appropriate software methods to accurately overlay a representation of a rigid interventional device [e.g., biopsy needle, radio-frequency (RF) probe] onto an MR image given only the probe's spatial position and orientation as determined from a three-dimensional (3-D) localizer used for interactive scan plane definition. This permits 1) "virtual tip tracking," where the probe tip location is displayed on the image without the use of separate receiver coils or a "road map" image data set, and, 2) "extending" the probe to predict its path if it were directly moved forward toward the target tissue. Further, this paper describes the design and implementation of a method to facilitate the monitoring of thermal ablation procedures by displaying and overlaying temperature maps from temperature sensitive MR acquisitions. These methods provide rapid graphical updates of probe position and temperature changes to aid the physician during the actual interventional MRI procedures without altering the usual operation of the MR imager. PMID:9874304
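The overlay geometry described above amounts to mapping a tracked 3-D probe position into the coordinate frame of the current scan plane, plus a linear extrapolation of the probe axis for path prediction. A minimal sketch in Python, assuming the scan plane is given by an origin point and two orthonormal in-plane axes (hypothetical helper names chosen for illustration, not the authors' interface code):

```python
def project_to_plane(point, origin, u, v):
    """Map a 3-D probe position into 2-D scan-plane coordinates.

    origin is a point on the current scan plane; u and v are orthonormal
    in-plane axis vectors (assumed, as the paper's interactive scan plane
    definition provides the plane geometry)."""
    d = [p - o for p, o in zip(point, origin)]

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    return (dot(d, u), dot(d, v))


def extend_probe(tip, direction, length):
    """Predict where the rigid probe tip would be if advanced straight
    along its axis by `length` toward the target tissue."""
    return [t + length * d for t, d in zip(tip, direction)]
```

With an axial plane through the origin, a probe at (1, 2, 5) projects to image coordinates (1, 2); the out-of-plane component would drive a "distance to plane" cue in a real interface.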

  8. Determining of Factors Influencing the Success and Failure of Hospital Information System and Their Evaluation Methods: A Systematic Review

    PubMed Central

    Sadoughi, Farahnaz; Kimiafar, Khalil; Ahmadi, Maryam; Shakeri, Mohammad Taghi

    2013-01-01

    Background: Nowadays, the use of new information technology (IT) has provided remarkable opportunities to decrease medical errors, support health care specialists, and increase the efficiency and even the quality of patient care and safety. Objectives: The purpose of this study was the identification of Hospital Information System (HIS) success and failure factors and the evaluation methods for these factors. This research emphasizes the need for a comprehensive evaluation of HISs that considers a wide range of success and failure factors in these systems. Materials and Methods: We searched for relevant English-language studies based on keywords in title and abstract, using PubMed, Ovid Medline (by applying MeSH terms), Scopus, ScienceDirect and Embase (earliest entry to March 17, 2012). Studies which considered success models and success or failure factors, or which studied evaluation models of HISs, were chosen. Since the studies used in this systematic review were heterogeneous, the combination of extracted data was carried out using the narrative synthesis method. Results: We found 16 articles which required detailed analysis. Finally, the suggested framework includes 12 main factors (functional, organizational, behavioral, cultural, management, technical, strategy, economy, education, legal, ethical and political factors), 67 sub-factors, and 33 suggested methods for the evaluation of these sub-factors. Conclusions: The results of the present research indicate that the emphasis of HIS evaluation is moving from technical subjects to human and organizational subjects, and from objective to subjective issues. This shift entails greater familiarity with qualitative evaluation methods. In most of the reviewed studies, the main focus has been on the necessity of using multi-method approaches and combining methods to obtain more comprehensive and useful results. PMID:24693386

  9. A theory-based primary health care intervention for women who have left abusive partners.

    PubMed

    Ford-Gilboe, Marilyn; Merritt-Gray, Marilyn; Varcoe, Colleen; Wuest, Judith

    2011-01-01

    Although intimate partner violence is a significant global health problem, few tested interventions have been designed to improve women's health and quality of life, particularly beyond the crisis of leaving. The Intervention for Health Enhancement After Leaving is a comprehensive, trauma informed, primary health care intervention, which builds on the grounded theory Strengthening Capacity to Limit Intrusion and other research findings. Delivered by a nurse and a domestic violence advocate working collaboratively with women through 6 components (safeguarding, managing basics, managing symptoms, cautious connecting, renewing self, and regenerating family), this promising intervention is in the early phases of testing. PMID:21654310

  10. A fuzzy-set-theory-based approach to analyse species membership in DNA barcoding.

    PubMed

    Zhang, A-B; Muster, C; Liang, H-B; Zhu, C-D; Crozier, R; Wan, P; Feng, J; Ward, R D

    2012-04-01

    Reliable assignment of an unknown query sequence to its correct species remains a methodological problem for the growing field of DNA barcoding. While great advances have been achieved recently, species identification from barcodes can still be unreliable if the relevant biodiversity has been insufficiently sampled. We here propose a new notion of species membership for DNA barcoding, fuzzy membership based on fuzzy set theory, and illustrate its successful application to four real data sets (bats, fishes, butterflies and flies) with more than 5000 random simulations. Two of the data sets comprise especially dense species/population-level samples. In comparison with current DNA barcoding methods, the newly proposed minimum distance (MD) plus fuzzy set approach, together with another computationally simple method, 'best close match', outperforms two computationally sophisticated Bayesian and BootstrapNJ methods. The new method proposed here has great power in reducing false-positive species identification compared with other methods when conspecifics of the query are absent from the reference database.
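The minimum-distance-plus-fuzzy-set idea can be illustrated with a toy membership function: rather than a hard yes/no assignment, the query receives a graded degree of membership in each candidate species based on its barcode distance. The linear ramp below is a hypothetical form chosen for illustration only; the paper's actual membership function differs in detail:

```python
def fuzzy_membership(query_dist, max_intraspecific, min_interspecific):
    """Illustrative fuzzy membership of a query in a candidate species.

    Returns 1.0 when the query is no farther than known conspecific
    variation, 0.0 when it is at least as far as the nearest other
    species, and a linear ramp in between (a hypothetical form, not the
    authors' exact function)."""
    if query_dist <= max_intraspecific:
        return 1.0
    if query_dist >= min_interspecific:
        return 0.0
    return (min_interspecific - query_dist) / (min_interspecific - max_intraspecific)
```

A query at distance 0.06 with intraspecific variation up to 0.02 and the nearest heterospecific at 0.10 gets membership 0.5, an ambiguous case a hard MD threshold would silently force to one side.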

  11. SYMPHONY, an information-theoretic method for gene-gene and gene-environment interaction analysis of disease syndromes.

    PubMed

    Knights, J; Yang, J; Chanda, P; Zhang, A; Ramanathan, M

    2013-06-01

    We develop an information-theoretic method for gene-gene (GGI) and gene-environmental interactions (GEI) analysis of syndromes, defined as a phenotype vector comprising multiple quantitative traits (QTs). The K-way interaction information (KWII), an information-theoretic metric, was derived for multivariate normal distributed phenotype vectors. The utility of the method was challenged with three simulated data sets, the Genetic Association Workshop-15 (GAW15) rheumatoid arthritis data set, a high-density lipoprotein (HDL) and atherosclerosis data set from a mouse QT locus study, and the 1000 Genomes data. The dependence of the KWII on effect size, minor allele frequency, linkage disequilibrium, population stratification/admixture, as well as the power and computational time requirements of the novel method was systematically assessed in simulation studies. In these studies, phenotype vectors containing two and three constituent multivariate normally distributed QTs were used and the KWII was found to be effective at detecting GEI associated with the phenotype. High KWII values were observed for variables and variable combinations associated with the syndrome phenotype compared with uninformative variables not associated with the phenotype. The KWII values for the phenotype-associated combinations increased monotonically with increasing effect size values. The KWII also exhibited utility in simulations with non-linear dependence between the constituent QTs. Analysis of the HDL and atherosclerosis data set indicated that the simultaneous analysis of both phenotypes identified interactions not detected in the analysis of the individual traits. The information-theoretic approach may be useful for non-parametric analysis of GGI and GEI of complex syndromes.
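For discrete variables, the KWII can be written as an alternating sum of joint entropies over all subsets of the variable set; for two variables it reduces to the ordinary mutual information, and for three it measures synergy beyond pairwise effects. A small plug-in-entropy sketch of that computation (the paper's closed-form derivation for multivariate normal phenotype vectors is not reproduced here):

```python
from itertools import combinations
from collections import Counter
from math import log2


def entropy(columns, data):
    """Empirical joint entropy (bits) of the given column indices."""
    if not columns:
        return 0.0
    counts = Counter(tuple(row[i] for i in columns) for row in data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())


def kwii(columns, data):
    """K-way interaction information via the alternating sum over subsets:
    KWII = -sum over subsets T of S of (-1)^(|S|-|T|) * H(T).
    For two variables this reduces to the mutual information I(X;Y)."""
    k = len(columns)
    total = 0.0
    for r in range(k + 1):
        for sub in combinations(columns, r):
            total += (-1) ** (k - r) * entropy(sub, data)
    return -total
```

On an XOR relationship (phenotype determined jointly but not marginally by two loci) the three-way KWII is 1 bit, the classic signature of a pure interaction with no marginal effect.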

  12. New PAPR Reduction in OFDM System Using Hybrid of PTS-APPR Methods with Coded Side Information Technique

    NASA Astrophysics Data System (ADS)

    Pradabpet, Chusit; Yoshizawa, Shingo; Miyanaga, Yoshikazu; Dejhan, Kobchai

    In this paper, we propose a new PAPR reduction scheme using a hybrid of the partial transmit sequences (PTS) and adaptive peak power reduction (APPR) methods with a coded side information (SI) technique. These methods are used in an Orthogonal Frequency Division Multiplexing (OFDM) system. OFDM employs orthogonal sub-carriers for data modulation, and these sub-carriers can unexpectedly produce a large Peak to Average Power Ratio (PAPR) in some cases. In order to reduce PAPR, the sequence of input data is rearranged by PTS. The APPR method is also used to control the peak level of the modulation signals with an adaptive algorithm. The proposed reduction method combines these two methods and realizes the advantages of both at the same time. Finding the optimum PTS condition for PAPR reduction demands a very large calculation cost, making the true optimum impractical to obtain. In the proposed method, by using a pseudo-optimum condition with a coded SI technique, the total calculation cost is drastically reduced. In simulation results, the proposed method shows an improvement in PAPR and also achieves good bit error rate (BER) performance in an OFDM system.
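The cost problem the abstract refers to can be seen in a small sketch: computing the PAPR of an OFDM symbol requires an inverse transform, and exhaustive PTS must repeat it for every phase combination, so the search grows exponentially with the number of sub-blocks. The code below is an illustrative brute-force PTS under assumed parameters (2 blocks, phases of +/-1), i.e. the optimum search whose cost the paper's coded-SI scheme is designed to avoid:

```python
import cmath
import math
from itertools import product


def ifft_naive(X):
    """Naive inverse DFT (fine for illustration; a real system uses an FFT)."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]


def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(s) ** 2 for s in x]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))


def pts_search(X, num_blocks=2, phases=(1, -1)):
    """Exhaustive PTS: partition the subcarriers into contiguous blocks,
    try every phase combination, and keep the lowest-PAPR one. The cost
    is len(phases)**num_blocks transforms, which is what makes the true
    optimum impractical for realistic block counts."""
    N = len(X)
    size = N // num_blocks
    best_papr, best_combo = float("inf"), None
    for combo in product(phases, repeat=num_blocks):
        Xp = [X[k] * combo[k // size] for k in range(N)]
        p = papr_db(ifft_naive(Xp))
        if p < best_papr:
            best_papr, best_combo = p, combo
    return best_papr, best_combo
```

An all-ones subcarrier vector of length 8 is a worst case (all energy lands in one time sample, PAPR of 10*log10(8), about 9.03 dB); even a two-block PTS search lowers it.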

  13. Parents of children with eating disorders: developing theory-based health communication messages to promote caregiver well-being.

    PubMed

    Patel, Sheetal; Shafer, Autumn; Brown, Jane; Bulik, Cynthia; Zucker, Nancy

    2014-01-01

    Parents of children with eating disorders experience extreme emotional burden because of the intensity and duration of the recovery process. While parental involvement in a child's eating disorder treatment improves outcomes, parents often neglect their own well-being, which can impede their child's recovery. This study extends the research on caregivers and on health theory in practice by conducting formative research to develop a theory-based communication intervention encouraging parents to engage in adaptive coping and self-care behaviors. The Transactional Model of Stress and Coping and the Transtheoretical Model guided qualitative assessments of the determinants of parents' coping behaviors. Three focus groups with 19 parents of children with eating disorders and 19 semi-structured interviews with experts specializing in eating disorders were conducted. Findings indicate that parents and experts see parents' need for permission to take time for themselves as the main barrier to self-care. The main motivator for parents to engage in coping behaviors is awareness of a connection between self-care and their child's health outcomes. Participant evaluation of six potential messages for main themes and effectiveness revealed that theory-based elements, such as certain processes of change within the Transtheoretical Model, were important to changing health behavior.

  14. A theory-based newsletter nutrition education program reduces nutritional risk and improves dietary intake for congregate meal participants.

    PubMed

    Francis, Sarah L; MacNab, Lindsay; Shelley, Mack

    2014-01-01

    At-risk older adults need community-based nutrition programs that improve nutritional status and practices. This 6-month study assessed the impact of the traditional Chef Charles (CC) program (Control) compared to a theory-based CC program (Treatment) on nutritional risk (NR), dietary intakes, self-efficacy (SE), food security (FS), and program satisfaction for congregate meal participants. Participants were mostly educated, single, "food secure" White females. NR change for the treatment group was significantly higher (P = 0.042) than for the control group. No differences were noted between groups in SE or FS change or in program satisfaction. The overall distribution of FS classification levels changed significantly (P < .001) from pre to post. Over half (n = 46, 76.7%) reported making dietary changes and the majority (n = 52, 86.7%) rated CC as good to excellent. Results suggest the theory-based CC program (treatment) is more effective in reducing NR and improving dietary practices than the traditional CC program (control).

  15. Testing a social cognitive theory-based model of indoor tanning: implications for skin cancer prevention messages.

    PubMed

    Noar, Seth M; Myrick, Jessica Gall; Zeitany, Alexandra; Kelley, Dannielle; Morales-Pico, Brenda; Thomas, Nancy E

    2015-01-01

    The lack of a theory-based understanding of indoor tanning is a major impediment to the development of effective messages to prevent or reduce this behavior. This study applied the Comprehensive Indoor Tanning Expectations (CITE) scale in an analysis of indoor tanning behavior among sorority women (total N = 775). Confirmatory factor analyses indicated that CITE positive and negative expectations were robust, multidimensional factors and that a hierarchical structure fit the data well. Social cognitive theory-based structural equation models demonstrated that appearance-oriented variables were significantly associated with outcome expectations. Outcome expectations were, in turn, significantly associated with temptations to tan, intention to tan indoors, and indoor tanning behavior. The implications of these findings for the development of messages to prevent and reduce indoor tanning behavior are discussed in two domains: (a) messages that attempt to change broader societal perceptions about tan skin, and (b) messages that focus more narrowly on indoor tanning-challenging positive expectations, enhancing negative expectations, and encouraging substitution of sunless tanning products.

  16. A Robust Apnea Period Detection Method in Changing Sleep Posture by Average Mutual Information of Heartbeat and Respiration

    NASA Astrophysics Data System (ADS)

    Kurihara, Yosuke; Watanabe, Kajiro; Kobayashi, Kazuyuki; Tanaka, Tanaka

    Sleep disorders disturb recovery from mental and physical fatigue, one of the functions of sleep. The majority of those with such disorders suffer from Sleep Apnea Syndrome (SAS). Continuous hypoxia during sleep due to SAS causes circulatory disturbances, such as hypertension and ischemic heart disease, malfunction of the autonomic nervous system, and other severe complications, sometimes proving fatal. To prevent this, it is important to detect SAS at an early stage by monitoring respiration during sleep and to provide appropriate treatment at medical institutions. In this paper, a pneumatic method to detect apnea periods during sleep is proposed. The pneumatic method measures both heartbeat and respiration signals. The respiration signal can be regarded as noise superimposed on the heartbeat signal, and the decrease in the respiration signal during apnea increases the average mutual information of the heartbeat. The result of a scaling analysis of the average mutual information is used as the threshold to detect apnea periods. The root mean square error between the lengths of apnea measured by a strain gauge used as a reference and those measured by the proposed method was 3.1 seconds, and the error between the number of apnea episodes judged by a physician and by the proposed method in OSAS patients was 3.3 episodes.
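The detection principle rests on average mutual information (AMI): when the respiratory component riding on the heartbeat fades during apnea, the heartbeat becomes more self-predictable and its AMI rises. A histogram-based AMI estimator is easy to sketch; the bin count and lag below are arbitrary illustrative choices, not the paper's parameters:

```python
from collections import Counter
from math import log2


def average_mutual_information(signal, lag, bins=8):
    """Histogram estimate of I(x_t ; x_{t+lag}) in bits.

    A clean, quasi-periodic signal (heartbeat with little respiratory
    'noise', as during apnea) yields high AMI; added broadband noise
    drives it down. Thresholding this quantity over sliding windows is
    the kind of decision the paper's scaling analysis calibrates."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0
    q = [min(int((v - lo) / width), bins - 1) for v in signal]
    pairs = list(zip(q, q[lag:]))
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(a for a, _ in pairs)
    py = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())
```

On a perfectly cyclic sequence the lag-1 AMI approaches the full entropy of the signal, while on a pseudo-random sequence of the same range it stays near zero, which is the contrast an apnea threshold exploits.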

  17. Application of the phylogenetic informativeness method to chloroplast markers: a test case of closely related species in tribe Hydrangeeae (Hydrangeaceae).

    PubMed

    Granados Mendoza, Carolina; Wanke, Stefan; Salomo, Karsten; Goetghebeur, Paul; Samain, Marie-Stéphanie

    2013-01-01

    In evolutionary biology, appropriate marker selection for the reconstruction of solid phylogenetic hypotheses is fundamental. One of the most challenging tasks is the appropriate choice of genomic regions in studies of closely related species. Robust phylogenetic frameworks are central to studies dealing with questions ranging from evolutionary and conservation biology to biogeography and plant breeding. Phylogenetic informativeness profiles provide a quantitative measure of the phylogenetic signal in markers and therefore a method for locus prioritization. The present work profiles the phylogenetic informativeness of mostly non-coding chloroplast regions in an angiosperm lineage of closely related species: the popular ornamental tribe Hydrangeeae (Hydrangeaceae, Cornales, Asterids). A recent phylogenetic study reported a contrast in resolution between the two strongly supported clades within tribe Hydrangeeae. We evaluate the phylogenetic signal of 13 highly variable plastid markers for estimating relationships within and among the currently recognized monophyletic groups of this tribe. A selection of combined loci based on their phylogenetic informativeness retrieved more robust phylogenetic hypotheses than simply combining the individual markers performing best with respect to resolution, nodal support and accuracy, or those presenting the highest number of parsimony-informative characters. We propose the rpl32-ndhF intergenic spacer (IGS), trnV-ndhC IGS, trnL-rpl32 IGS, psbT-petB region and ndhA intron as the best candidates for future phylogenetic studies in Hydrangeeae and potentially in other Asterids. We also contrasted the phylogenetic informativeness of coded indels against substitutions, concluding that, despite their low phylogenetic informativeness, coded indels provide additional phylogenetic signal that is nearly free of noise. Phylogenetic relationships obtained from our total combined analyses showed improved resolution and nodal support with respect

  18. General theory based on fluctuational electrodynamics for van der Waals interactions in colloidal systems

    SciTech Connect

    Yannopapas, Vassilios

    2007-12-15

    A rigorous theory for the determination of the van der Waals interactions in colloidal systems is presented. The method is based on fluctuational electrodynamics and a multiple-scattering method which provides the electromagnetic Green's tensor. In particular, expressions for the Green's tensor are presented for arbitrary, finite collections of colloidal particles, for infinitely periodic or defected crystals, as well as for finite slabs of crystals. The presented formalism allows for ab initio calculations of the van der Waals interactions in colloidal systems since it takes fully into account retardation, many-body, multipolar, and near-field effects.

  19. An ensemble method based on uninformative variable elimination and mutual information for spectral multivariate calibration.

    PubMed

    Tan, Chao; Wang, Jinyue; Wu, Tong; Qin, Xin; Li, Menglong

    2010-12-01

    Based on the combination of uninformative variable elimination (UVE), bootstrap and mutual information (MI), a simple ensemble algorithm, named ESPLS, is proposed for spectral multivariate calibration (MVC). In ESPLS, uninformative variables are first removed; a preparatory training set is then produced by bootstrap, on which an MI spectrum of the retained variables is calculated. The variables that exhibit higher MI than a defined threshold form a subspace on which a candidate partial least-squares (PLS) model is constructed. This process is repeated; after a number of candidate models have been obtained, a subset of them is selected to form an ensemble model by simple or weighted averaging. Four near/mid-infrared (NIR/MIR) spectral datasets concerning the determination of six components are used to verify the proposed ESPLS. The results indicate that ESPLS is superior to UVEPLS and to its combination with MI-based variable selection (SPLS) in terms of both accuracy and robustness. Moreover, from the perspective of end-users, ESPLS enhances calibration performance without increasing its complexity.
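The bootstrap-plus-MI screening at the heart of ESPLS can be sketched in a few lines: resample the training objects, compute the MI of each retained wavelength against the response, and keep wavelengths above a threshold as the subspace for one candidate model. Only this screening step is shown; the candidate PLS models and the final simple/weighted averaging of the ensemble are omitted, and the equal-width discretization is an assumed implementation detail:

```python
import random
from collections import Counter
from math import log2


def mutual_information(x, y, bins=4):
    """Equal-width discretized MI (bits) between a variable and the response."""
    def disc(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((u - lo) / w), bins - 1) for u in v]
    xd, yd = disc(x), disc(y)
    n = len(xd)
    pxy = Counter(zip(xd, yd))
    px, py = Counter(xd), Counter(yd)
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())


def mi_screen(X_cols, y, threshold, rng):
    """One ESPLS-style screening round: bootstrap the samples, compute the
    MI spectrum of the variables, and keep indices of those above threshold."""
    n = len(y)
    idx = [rng.randrange(n) for _ in range(n)]
    yb = [y[i] for i in idx]
    return [j for j, col in enumerate(X_cols)
            if mutual_information([col[i] for i in idx], yb) > threshold]
```

Repeating `mi_screen` with fresh bootstrap draws yields the varying variable subsets on which the candidate PLS members would be trained.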

  20. Methods for evaluating Lyme disease risks using geographic information systems and geospatial analysis.

    PubMed

    Nicholson, M C; Mather, T N

    1996-09-01

    Lyme disease is a tick-transmitted borreliosis of humans and domestic animals emerging as one of the most significant threats to public health in north temperate regions of the world. However, despite a myriad of studies into the symptomology, causes, and treatment of the disease, few researchers have addressed the spatial aspects of Lyme disease transmission. Using statewide data collected in Rhode Island (United States) as a test case, we demonstrated that exposure to deer ticks and the risk of contracting Lyme disease occur mostly in the peridomestic environment. A Geographic Information System model was developed indicating a strong association among Lyme disease in humans, the abundance of nymphal blacklegged ticks, Ixodes scapularis Say, in the environment, and the prevalence of Borrelia burgdorferi infection in ticks. In contrast, occurrence of plant communities suitable for sustaining I. scapularis populations (forests) was not predictive of Lyme disease risk. Instead, we observed a highly significant spatial trend of decreasing numbers of ticks and incident cases of Lyme disease with increasing latitude. Geostatistics were employed for modeling spatial autocorrelation of tick densities. These findings were combined to create a model that predicts Lyme disease transmission risk, thereby demonstrating the utility of incorporating geospatial modeling techniques in studying the epidemiology of Lyme disease. PMID:8840676
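The spatial-autocorrelation step can be illustrated with Moran's I, a standard summary of whether high tick densities cluster near other high densities. The paper itself used variogram-based geostatistics, so this is a related, simpler stand-in computed over an assumed neighbor-weight matrix:

```python
def morans_i(values, weights):
    """Moran's I for spatial autocorrelation of a variable (e.g. tick
    density per site). weights[i][j] > 0 marks sites i and j as neighbors.
    Positive I means similar values cluster; negative I means they alternate."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    W = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / W) * (num / den)
```

On a transect where high densities sit together the statistic is positive, and on a strictly alternating pattern it is negative, the kind of contrast that separates clustered peridomestic risk from spatial noise.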