Science.gov

Sample records for information theory-based methods

  1. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

    2016-06-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
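
    For readers who want to experiment with the idea, the sketch below illustrates a greedy maximum-information, minimum-redundancy selection of cross-section locations using histogram-based entropy and mutual information. It illustrates the general concept only and is not the authors' algorithm; the water-level matrix, the bin count, and the redundancy weight beta are invented for the example.

```python
# Illustrative sketch (not the authors' exact algorithm): greedy selection of
# cross-section locations that maximizes information content while penalizing
# redundancy, using histogram-based Shannon entropy and mutual information.
import numpy as np

def entropy(x, bins=10):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def mutual_info(x, y, bins=10):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

def select_sections(levels, n_select, beta=1.0):
    """levels: array (n_locations, n_timesteps) of simulated water levels."""
    chosen = [int(np.argmax([entropy(x) for x in levels]))]   # most informative location first
    while len(chosen) < n_select:
        scores = []
        for j in range(len(levels)):
            if j in chosen:
                scores.append(-np.inf)
                continue
            redundancy = max(mutual_info(levels[j], levels[k]) for k in chosen)
            scores.append(entropy(levels[j]) - beta * redundancy)
        chosen.append(int(np.argmax(scores)))
    return chosen

# Example with synthetic data: 30 candidate sections, 200 time steps.
rng = np.random.default_rng(0)
levels = np.cumsum(rng.normal(size=(30, 200)), axis=1)
print(select_sections(levels, n_select=5))
```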

  2. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Single parameter or ensemble (multi-criteria) ranking options are also provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
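
    As a flavor of the ranking functions mentioned above, the following sketch computes information gain and symmetrical uncertainty after an equal-interval discretization. It is not IMMAN code (IMMAN is a Java application); the descriptor matrix, bin count, and class labels are synthetic.

```python
# Minimal sketch of information-theoretic feature ranking with equal-interval
# discretization (information gain and symmetrical uncertainty).
import numpy as np

def discretize(x, n_bins=10):
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.digitize(x, edges[1:-1])          # bin labels 0 .. n_bins-1

def entropy_discrete(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, y, n_bins=10):
    xd = discretize(feature, n_bins)
    h_y_given_x = 0.0
    for v in np.unique(xd):
        mask = xd == v
        h_y_given_x += mask.mean() * entropy_discrete(y[mask])
    return entropy_discrete(y) - h_y_given_x

def symmetrical_uncertainty(feature, y, n_bins=10):
    ig = information_gain(feature, y, n_bins)
    return 2 * ig / (entropy_discrete(discretize(feature, n_bins)) + entropy_discrete(y))

# Rank hypothetical molecular descriptors X (n_samples x n_features) against labels y.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = (X[:, 2] + 0.1 * rng.normal(size=100) > 0).astype(int)
ranking = sorted(range(X.shape[1]), key=lambda j: -symmetrical_uncertainty(X[:, j], y))
print(ranking)   # feature 2 should rank first
```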

  3. Correlation theory-based signal processing method for CMF signals

    NASA Astrophysics Data System (ADS)

    Shen, Yan-lin; Tu, Ya-qing

    2016-06-01

    Signal processing precision of Coriolis mass flowmeter (CMF) signals directly affects the measurement accuracy of Coriolis mass flowmeters. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, which comprises a correlation theory-based frequency estimation method and a phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral period sampling on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation. It outperforms the adaptive notch filter, discrete Fourier transform, and autocorrelation methods in terms of frequency estimation, and the data extension-based correlation, Hilbert transform, quadrature delay estimator, and discrete Fourier transform methods in terms of phase difference estimation, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
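
    The sketch below shows one simple way correlation-based frequency and phase-difference estimators can be built for two noisy sinusoids of equal frequency; it is meant only to illustrate the kind of quantities involved and is not the estimator proposed in the paper. The sampling rate, test frequency, and noise level are arbitrary.

```python
# Correlation-based estimators for a pair of equal-frequency sinusoids:
# the frequency from the autocorrelation zero crossing (a quarter period),
# the phase difference from time-averaged products of the two signals.
import numpy as np

def autocorr(x, max_lag):
    x = x - x.mean()
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag)])

def estimate_frequency(x, fs, max_lag=200):
    """Quarter period from the first zero crossing of the autocorrelation."""
    r = autocorr(x, max_lag)
    k = int(np.argmax(r < 0))                          # first lag with negative autocorrelation
    k_zero = (k - 1) + r[k - 1] / (r[k - 1] - r[k])    # interpolated zero crossing ~ T/4 samples
    return fs / (4.0 * k_zero)

def estimate_phase_difference(x1, x2, fs, freq):
    """tan(phi) = -<x1 * dx2/dt> / (omega * <x1 * x2>) for equal-frequency sinusoids."""
    w = 2 * np.pi * freq
    x1, x2 = x1 - x1.mean(), x2 - x2.mean()
    dx2 = np.gradient(x2, 1.0 / fs)
    return np.arctan2(-np.mean(x1 * dx2), w * np.mean(x1 * x2))

fs, f0, phi = 5000.0, 100.3, 0.42
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(2)
x1 = np.sin(2 * np.pi * f0 * t) + 0.02 * rng.normal(size=t.size)
x2 = np.sin(2 * np.pi * f0 * t + phi) + 0.02 * rng.normal(size=t.size)
f_hat = estimate_frequency(x1, fs)
print(f_hat, estimate_phase_difference(x1, x2, fs, f_hat))   # close to 100.3 Hz and 0.42 rad
```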

  4. Evaluation of the performance of information theory-based methods and cross-correlation to estimate the functional connectivity in cortical networks.

    PubMed

    Garofalo, Matteo; Nieus, Thierry; Massobrio, Paolo; Martinoia, Sergio

    2009-01-01

    Functional connectivity of in vitro neuronal networks was estimated by applying different statistical algorithms to data collected by Micro-Electrode Arrays (MEAs). First, we tested these "connectivity methods" on neuronal network models of increasing complexity and evaluated their performance in terms of ROC (Receiver Operating Characteristic) and PPC (Positive Precision Curve), a newly defined complementary method developed specifically for the identification of functional links. Then, the algorithms that best estimated the actual connectivity of the network models were used to extract functional connectivity from cultured cortical networks coupled to MEAs. Among the proposed approaches, Transfer Entropy and Joint-Entropy showed the best results, suggesting these methods as good candidates for extracting functional links in actual neuronal networks from multi-site recordings. PMID:19652720
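
    A toy version of one of the compared estimators, transfer entropy for binarized spike trains with one-step histories, is sketched below. The synthetic spike trains and coupling probabilities are invented; the code is not the authors' implementation.

```python
# Transfer entropy TE(X -> Y) in bits for binary spike trains, one-step histories.
import numpy as np
from collections import Counter

def _marginal(dist, idx):
    out = Counter()
    for k, v in dist.items():
        out[tuple(k[i] for i in idx)] += v
    return out

def transfer_entropy(src, dst):
    triplets = Counter(zip(dst[1:], dst[:-1], src[:-1]))     # (y_{t+1}, y_t, x_t)
    n = sum(triplets.values())
    p3 = {k: v / n for k, v in triplets.items()}
    p_yx = _marginal(p3, (1, 2))      # p(y_t, x_t)
    p_yy = _marginal(p3, (0, 1))      # p(y_{t+1}, y_t)
    p_y = _marginal(p3, (1,))         # p(y_t)
    return sum(p * np.log2(p * p_y[(y0,)] / (p_yy[(y1, y0)] * p_yx[(y0, x0)]))
               for (y1, y0, x0), p in p3.items())

rng = np.random.default_rng(3)
x = (rng.random(20000) < 0.2).astype(int)                    # presynaptic spike train
y = np.zeros_like(x)
drive = (x[:-1] == 1) & (rng.random(x.size - 1) < 0.5)       # spikes driven by x one step earlier
y[1:] = ((rng.random(x.size - 1) < 0.05) | drive).astype(int)
print(transfer_entropy(x, y), transfer_entropy(y, x))        # TE(x->y) clearly exceeds TE(y->x)
```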

  5. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.

  6. Kinetic theory based new upwind methods for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, S. M.

    1986-01-01

    Two new upwind methods, the Kinetic Numerical Method (KNM) and the Kinetic Flux Vector Splitting (KFVS) method, are presented for the solution of the Euler equations. Both of these methods can be regarded as suitable moments of an upwind scheme for the solution of the Boltzmann equation, provided the distribution function is Maxwellian. This moment-method strategy leads to a unification of the Riemann approach and the pseudo-particle approach used earlier in the development of upwind methods for the Euler equations. A very important aspect of the moment-method strategy is that the new upwind methods satisfy the entropy condition because of the Boltzmann H-Theorem, and it suggests a possible way of extending the Total Variation Diminishing (TVD) principle within the framework of the H-Theorem. The ability of these methods to obtain accurate, wiggle-free solutions is demonstrated by applying them to two test problems.
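
    The central idea, that split Euler fluxes are half-range velocity moments of a Maxwellian, can be checked numerically. The sketch below does this by quadrature for a one-dimensional gas with gamma = 3 (translational energy only), a simplifying assumption chosen here so that no closed-form split-flux expressions need to be quoted; it is not the scheme of the paper itself.

```python
# Half-range moments F+ (v>0) and F- (v<0) of a 1D Maxwellian; their sum
# should reproduce the exact Euler flux for a gamma = 3 (1D) gas.
import numpy as np

def split_fluxes(rho, u, p, nv=4001, vmax=12.0):
    beta = rho / (2.0 * p)                                   # 1/(2RT)
    c = np.sqrt(1.0 / beta)
    v = np.linspace(u - vmax * c, u + vmax * c, nv)
    dv = v[1] - v[0]
    f = rho * np.sqrt(beta / np.pi) * np.exp(-beta * (v - u) ** 2)   # Maxwellian
    psi = np.vstack([np.ones_like(v), v, 0.5 * v ** 2])              # collision invariants
    Fp = (v * psi * f * (v > 0)).sum(axis=1) * dv
    Fm = (v * psi * f * (v < 0)).sum(axis=1) * dv
    return Fp, Fm

rho, u, p = 1.0, 0.3, 1.0
Fp, Fm = split_fluxes(rho, u, p)
gamma = 3.0                                                  # 1D gas: translational energy only
E = p / (gamma - 1.0) + 0.5 * rho * u ** 2
print(Fp + Fm)                                               # matches the exact flux to quadrature accuracy
print(np.array([rho * u, rho * u ** 2 + p, u * (E + p)]))
```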

  7. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A.; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method has no fitted parameters, except for implicit parameters within the chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAM-B3LYP functional has mean unsigned errors (MUEs), with respect to coupled cluster singles, doubles, and perturbative triples results extrapolated to the complete basis set limit (CCSD(T)/CBS), of 1.7, 2.2, 2.0, and 0.5 kcal/mol for the S22, water-benzene cluster, water cluster, and n-alkane dimer benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAM-B3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.
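
    As a rough illustration of the electrostatic component only, the sketch below evaluates the interaction between two fragments represented by point charges and dipoles at expansion sites. The real EFP2 expansion also carries quadrupoles and octopoles and includes damping; the site positions and dipole values here are invented, and the code is not part of any EFP implementation.

```python
# Electrostatic energy between two rigid fragments, truncated at point charges
# and dipoles located at expansion sites (atomic units).
import numpy as np

def electrostatic_energy(sites_a, sites_b):
    """Each site: (position (3,), charge, dipole (3,))."""
    energy = 0.0
    for ra, qa, mua in sites_a:
        for rb, qb, mub in sites_b:
            R = np.asarray(rb) - np.asarray(ra)
            r = np.linalg.norm(R)
            n = R / r
            energy += qa * qb / r                                     # charge-charge
            energy += qb * np.dot(mua, n) / r**2                      # dipole(a)-charge(b)
            energy += -qa * np.dot(mub, n) / r**2                     # charge(a)-dipole(b)
            energy += (np.dot(mua, mub) - 3 * np.dot(mua, n) * np.dot(mub, n)) / r**3
    return energy

# Two hypothetical fragments, each reduced to a single expansion site.
frag1 = [(np.array([0.0, 0.0, 0.0]), 0.0, np.array([0.0, 0.0, 0.73]))]
frag2 = [(np.array([0.0, 0.0, 5.5]), 0.0, np.array([0.0, 0.0, 0.73]))]
print(electrostatic_energy(frag1, frag2))   # attractive head-to-tail dipole pair
```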

  8. Density functional theory based generalized effective fragment potential method.

    PubMed

    Nguyen, Kiet A; Pachter, Ruth; Day, Paul N

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method has no fitted parameters, except for implicit parameters within the chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAM-B3LYP functional has mean unsigned errors (MUEs), with respect to coupled cluster singles, doubles, and perturbative triples results extrapolated to the complete basis set limit (CCSD(T)/CBS), of 1.7, 2.2, 2.0, and 0.5 kcal/mol for the S22, water-benzene cluster, water cluster, and n-alkane dimer benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAM-B3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes. PMID:24985612

  9. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.
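
    A generic numerical analogue of such a through-the-thickness eigenvalue problem is sketched below: a Sturm-Liouville operator with a piecewise-constant coefficient (a two-layer laminate) is discretized by conservative finite differences and solved as a generalized eigenvalue problem, so the eigenfunctions carry the slope discontinuities at the layer interface discussed above. The layer properties and boundary conditions are illustrative choices, not the paper's formulation.

```python
# Through-thickness Sturm-Liouville sketch: d/dz( mu(z) dw/dz ) + lambda*rho(z)*w = 0
# on [0, h] with zero-flux ends, conservative finite differences, two layers.
import numpy as np
from scipy.linalg import eigh

n, h = 400, 1.0
z = np.linspace(0.0, h, n)
dz = z[1] - z[0]
mu = np.where(z < 0.5 * h, 1.0, 8.0)      # stiffer upper layer (made-up properties)
rho = np.where(z < 0.5 * h, 1.0, 1.6)

mu_half = 0.5 * (mu[:-1] + mu[1:])        # coefficient at cell interfaces
A = np.zeros((n, n))
for i in range(n):
    left = mu_half[i - 1] if i > 0 else 0.0       # zero flux at z = 0
    right = mu_half[i] if i < n - 1 else 0.0      # zero flux at z = h
    A[i, i] = (left + right) / dz**2
    if i > 0:
        A[i, i - 1] = -left / dz**2
    if i < n - 1:
        A[i, i + 1] = -right / dz**2

eigvals, eigvecs = eigh(A, np.diag(rho))          # generalized symmetric eigenproblem
print(eigvals[:4])          # first eigenvalue ~0 (uniform through-thickness mode)
# eigvecs[:, 1:4] are the first non-trivial through-thickness eigenfunctions
```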

  10. A new theory-based social classification in Japan and its validation using historically collected information.

    PubMed

    Hiyoshi, Ayako; Fukuda, Yoshiharu; Shipley, Martin J; Bartley, Mel; Brunner, Eric J

    2013-06-01

    Studies of health inequalities in Japan have increased since the millennium. However, there remains a lack of an accepted theory-based classification to measure occupation-related social position for Japan. This study attempts to derive such a classification based on the National Statistics Socio-economic Classification in the UK. Using routinely collected data from the nationally representative Comprehensive Survey of the Living Conditions of People on Health and Welfare, the Japanese Socioeconomic Classification was derived using two variables - occupational group and employment status. Validation analyses were conducted using household income, home ownership, self-rated good or poor health, and Kessler 6 psychological distress (n ≈ 36,000). After adjustment for age, marital status, and area (prefecture), one step lower social class was associated with mean 16% (p < 0.001) lower income, and a risk ratio of 0.93 (p < 0.001) for home ownership. The probability of good health showed a trend in men and women (risk ratio 0.94 and 0.93, respectively, for one step lower social class, p < 0.001). The trend for poor health was significant in women (odds ratio 1.12, p < 0.001) but not in men. Kessler 6 psychological distress showed significant trends in men (risk ratio 1.03, p = 0.044) and in women (1.05, p = 0.004). We propose the Japanese Socioeconomic Classification, derived from basic occupational and employment status information, as a meaningful, theory-based and standard classification system suitable for monitoring occupation-related health inequalities in Japan. PMID:23631782

  11. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  12. Spectral analysis comparisons of Fourier-theory-based methods and minimum variance (Capon) methods

    NASA Astrophysics Data System (ADS)

    Garbanzo-Salas, Marcial; Hocking, Wayne. K.

    2015-09-01

    In recent years, adaptive (data dependent) methods have been introduced into many areas where Fourier spectral analysis has traditionally been used. Although the data-dependent methods are often advanced as being superior to Fourier methods, they do require some finesse in choosing the order of the relevant filters. In performing comparisons, we have found some concerns about the mappings, particularly in cases involving many spectral lines or even continuous spectral signals. Using numerical simulations, several comparisons between Fourier transform procedures and the minimum variance method (MVM) have been performed. For multiple frequency signals, the MVM resolves most of the frequency content only for filters that have more degrees of freedom than the number of distinct spectral lines in the signal. In the case of Gaussian spectral approximation, MVM will always underestimate the width and can misplace the location of a spectral line in some circumstances. Large filters can be used to improve results with multiple frequency signals, but are computationally inefficient. Significant biases can occur when using MVM to study spectral information or echo power from the atmosphere. Artifacts and artificial narrowing of turbulent layers are examples of such impacts.
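
    For readers who want to reproduce the flavor of such comparisons, the sketch below computes a Fourier periodogram and a minimum variance (Capon) spectral estimate for a synthetic two-tone signal. The filter order, test frequencies, diagonal loading, and noise level are arbitrary illustration values, not those used in the study.

```python
# Periodogram versus minimum variance (Capon) spectrum for a two-tone signal.
import numpy as np

def capon_spectrum(x, order, freqs, fs):
    """Capon estimate: P(f) = 1 / (a(f)^H R^-1 a(f))."""
    n = len(x)
    X = np.array([x[i:i + order] for i in range(n - order)])
    R = X.T @ X / (n - order)                        # sample covariance, order x order
    Rinv = np.linalg.inv(R + 1e-6 * np.eye(order))   # small diagonal loading for stability
    spec = []
    for f in freqs:
        a = np.exp(-2j * np.pi * f / fs * np.arange(order))   # steering vector
        spec.append(1.0 / np.real(np.conj(a) @ Rinv @ a))
    return np.array(spec)

fs, n = 1000.0, 2048
t = np.arange(n) / fs
rng = np.random.default_rng(4)
x = np.sin(2 * np.pi * 110 * t) + 0.5 * np.sin(2 * np.pi * 160 * t) + 0.3 * rng.normal(size=n)
f_grid = np.fft.rfftfreq(n, 1.0 / fs)
periodogram = np.abs(np.fft.rfft(x)) ** 2 / n
capon = capon_spectrum(x, order=40, freqs=f_grid, fs=fs)
print(f_grid[np.argmax(periodogram)], f_grid[np.argmax(capon)])   # both peak near the stronger 110 Hz tone
```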

  13. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Understanding streamflow patterns in space and time is important to improve the flood and drought forecasting, water resources management, and predictions of ecological changes. The objectives of this work were (a) to characterize the spatial and temporal patterns of streamflow using information the...

  14. Multivariate drought index: An information theory based approach for integrated drought assessment

    NASA Astrophysics Data System (ADS)

    Rajsekhar, Deepthi; Singh, Vijay. P.; Mishra, Ashok. K.

    2015-07-01

    Most of the existing drought indices are based on a single variable (e.g., precipitation) or a combination of two variables (e.g., precipitation and streamflow). This may not be sufficient for reliable quantification of the existing drought condition. A region might be experiencing only a single type of drought at times, but it is also common for multiple drought types to affect a region simultaneously. For a comprehensive representation, it is better to consider all the variables that lead to different physical forms of drought, such as meteorological, hydrological, and agricultural droughts. Therefore, we propose a multivariate drought index (MDI) that utilizes information from hydroclimatic variables, including precipitation, runoff, evapotranspiration, and soil moisture as indicator variables, thus accounting for all the physical forms of drought. Entropy theory was used to develop the proposed index, which leads to the smallest set of features that maximally preserves the information of the input data set. MDI was then compared with the Palmer drought severity index (PDSI) for all climate regions within Texas for the period 1950-2012, with particular attention to the two major drought occurrences in Texas, namely the droughts of 1950-1957 and 2010-2011. The proposed MDI was found to represent drought conditions well, owing to its multivariate, multi-scalar, and nonlinear properties. To help the user choose the right time scale for further analysis, entropy maps of MDI at different time scales can be used as a guideline: the MDI time scale with the highest entropy value may be chosen, since higher entropy indicates higher information content.
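
    The time-scale selection idea can be mimicked with a very rough sketch: build a combined index from several hydroclimatic variables at different aggregation windows and keep the window whose index has the highest Shannon entropy. In the sketch below a plain first principal component stands in for the paper's entropy-based construction of MDI, and all monthly series are synthetic.

```python
# Entropy of a combined hydroclimatic index at several aggregation windows.
import numpy as np

def rolling_mean(x, w):
    return np.convolve(x, np.ones(w) / w, mode="valid")

def shannon_entropy(x, bins=20):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(5)
months = 12 * 60
precip, runoff, et, soil = (rng.gamma(2.0, 1.0, months) for _ in range(4))
variables = np.vstack([precip, runoff, et, soil])

for window in (1, 3, 6, 12):
    agg = np.vstack([rolling_mean(v, window) for v in variables])
    agg = (agg - agg.mean(axis=1, keepdims=True)) / agg.std(axis=1, keepdims=True)
    u, s, vt = np.linalg.svd(agg, full_matrices=False)
    index = vt[0] * s[0]                       # first principal component as the combined index
    print(window, round(shannon_entropy(index), 3))   # pick the window with the highest entropy
```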

  15. An information theory based framework for the measurement of population health.

    PubMed

    Nesson, Erik T; Robinson, Joshua J

    2015-04-01

    This paper proposes a new framework for the measurement of population health and the ranking of the health of different geographies. Since population health is a latent variable, studies which measure and rank the health of different geographies must aggregate observable health attributes into one summary measure. We show that the methods used in nearly all the literature to date implicitly assume that all attributes are infinitely substitutable. Our method, based on the measurement of multidimensional welfare and inequality, minimizes the entropic distance between the summary measure of population health and the distribution of the underlying attributes. This summary function coincides with the constant elasticity of substitution and Cobb-Douglas production functions and naturally allows different assumptions regarding attribute substitutability or complementarity. To compare methodologies, we examine a well-known ranking of the population health of U.S. states, America's Health Rankings. We find that states' rankings are somewhat sensitive to changes in the weight given to each attribute, but very sensitive to changes in aggregation methodology. Our results have broad implications for well-known health rankings such as the 2000 World Health Report, as well as other measurements of population and individual health levels and the measurement and decomposition of health inequality. PMID:25792258
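
    The role of attribute substitutability can be illustrated with a small constant-elasticity-of-substitution (CES) aggregator, shown below. The attribute scores, weights, and elasticity values are invented for the example and are not taken from America's Health Rankings or from the paper.

```python
# CES aggregation of health attributes; rankings shift as substitutability drops.
import numpy as np

def ces_index(attributes, weights, rho):
    """attributes: (n_geographies, n_attributes) of positive, normalized scores.
    rho -> 1 is the linear (perfect-substitutes) aggregate; rho -> 0 is Cobb-Douglas."""
    a = np.asarray(attributes, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)
    if abs(rho) < 1e-9:                              # Cobb-Douglas limit
        return np.exp(np.sum(w * np.log(a), axis=1))
    return np.sum(w * a ** rho, axis=1) ** (1.0 / rho)

scores = np.array([[0.9, 0.2, 0.8],    # geography A: strong on two attributes, weak on one
                   [0.6, 0.6, 0.6],    # geography B: uniformly average
                   [0.5, 0.9, 0.4]])   # geography C
weights = [1.0, 1.0, 1.0]
for rho in (1.0, 0.5, 0.0, -1.0):
    idx = ces_index(scores, weights, rho)
    print(rho, np.argsort(-idx))       # the ranking changes as substitutability decreases
```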

  16. A second-order accurate kinetic-theory-based method for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, Suresh M.

    1986-01-01

    An upwind method for the numerical solution of the Euler equations is presented. This method, called the kinetic numerical method (KNM), is based on the fact that the Euler equations are moments of the Boltzmann equation of the kinetic theory of gases when the distribution function is Maxwellian. The KNM consists of two phases, the convection phase and the collision phase. The method is unconditionally stable and explicit. It is highly vectorizable and can be easily made total variation diminishing for the distribution function by a suitable choice of the interpolation strategy. The method is applied to a one-dimensional shock-propagation problem and to a two-dimensional shock-reflection problem.

  17. A hybrid computing approach to accelerating the multiple scattering theory based ab initio methods

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Stocks, G. Malcolm

    2014-03-01

    The multiple scattering theory method, also known as the Korringa-Kohn-Rostoker (KKR) method, is considered an elegant approach to ab initio electronic structure calculations for solids. Its convenient access to the one-electron Green function has led to the development of the locally self-consistent multiple scattering (LSMS) method, a linear scaling ab initio method that allows electronic structure calculations for complex structures requiring tens of thousands of atoms in the unit cell. It is one of the few applications that have demonstrated petascale computing capability. In this presentation, we discuss our recent efforts in developing a hybrid computing approach for accelerating the full potential electronic structure calculation. Specifically, in the framework of our existing LSMS code in FORTRAN 90/95, we exploit the many-core resources on GPGPU accelerators by implementing the compute-intensive functions (for the calculation of multiple scattering matrices and the single site solutions) in CUDA, and move these computational tasks to the GPGPUs when they are available. We explain our approach to CUDA programming and the code structure in detail, and show the speed-up of the new hybrid code by comparing its performance on CPU/GPGPU with that on CPU only. The work was supported in part by the Center for Defect Physics, a DOE-BES Energy Frontier Research Center.

  18. Analytic Gradient for Density Functional Theory Based on the Fragment Molecular Orbital Method.

    PubMed

    Brorsen, Kurt R; Zahariev, Federico; Nakata, Hiroya; Fedorov, Dmitri G; Gordon, Mark S

    2014-12-01

    The equations for the response terms for the fragment molecular orbital (FMO) method interfaced with the density functional theory (DFT) gradient are derived and implemented. Compared to the previous FMO-DFT gradient, which lacks response terms, the FMO-DFT analytic gradient has improved accuracy for a variety of functionals when compared to numerical gradients. The FMO-DFT gradient agrees with the fully ab initio DFT gradient in which no fragmentation is performed, while reducing the nonlinear scaling associated with standard DFT. Solving for the response terms requires the solution of the coupled perturbed Kohn-Sham (CPKS) equations, where the CPKS equations are solved through a decoupled Z-vector procedure called the self-consistent Z-vector method. FMO-DFT is a nonvariational method, and the FMO-DFT gradient is unique compared to standard DFT gradients in that it requires terms from both DFT and time-dependent DFT (TDDFT). PMID:26583213

  19. Second order Møller-Plesset perturbation theory based upon the fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Fedorov, Dmitri G.; Kitaura, Kazuo

    2004-08-01

    The fragment molecular orbital (FMO) method was combined with second order Møller-Plesset (MP2) perturbation theory. The accuracy of the method using the 6-31G* basis set was tested on (H2O)n, n=16,32,64; α-helices and β-strands of alanine n-mers, n=10,20,40; as well as on (H2O)n, n=16,32,64 using the 6-31++G** basis set. Relative to the regular MP2 results that could be afforded, the FMO2-MP2 error in the correlation energy did not exceed 0.003 a.u., the error in the correlation energy gradient did not exceed 0.00005 a.u./bohr, and the error in the correlation contribution to the dipole moment did not exceed 0.03 debye. An approximation reducing the computational load based on fragment separation was introduced and tested. The FMO2-MP2 method demonstrated nearly linear scaling and drastically reduced the memory requirements of regular MP2, making possible calculations with several thousand basis functions using small Pentium clusters. As an example, (H2O)64 with the 6-31++G** basis set (1920 basis functions) can be run in 1 Gbyte RAM, and it took 136 s on a 40-node Pentium4 cluster.
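
    The underlying two-body FMO energy expression is simple enough to write out directly, as in the sketch below. The per-fragment energies are placeholders; in an actual FMO2-MP2 calculation each monomer and dimer energy comes from an MP2 computation in the embedding potential of the remaining fragments.

```python
# Two-body FMO energy: E(FMO2) = sum_I E_I + sum_{I<J} (E_IJ - E_I - E_J).
def fmo2_total_energy(monomer_energies, dimer_energies):
    total = sum(monomer_energies.values())
    for (i, j), e_ij in dimer_energies.items():
        total += e_ij - monomer_energies[i] - monomer_energies[j]
    return total

# Hypothetical three-fragment cluster (energies in hartree, made up for illustration).
e_monomer = {1: -76.20, 2: -76.21, 3: -76.19}
e_dimer = {(1, 2): -152.42, (1, 3): -152.40, (2, 3): -152.41}
print(fmo2_total_energy(e_monomer, e_dimer))   # sum of monomers plus pair corrections
```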

  20. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and the accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on gas-kinetic theory is presented. The flux construction strategy may shed some light on the possible modification of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for a non-strictly hyperbolic system.

  1. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes

    SciTech Connect

    Escudero, Daniel; Thiel, Walter E-mail: thiel@kofo.mpg.de

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF6 complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO4^-, Cr(CO)6, [Fe(CN)6]^4-, four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  2. Battling the challenges of training nurses to use information systems through theory-based training material design.

    PubMed

    Galani, Malatsi; Yu, Ping; Paas, Fred; Chandler, Paul

    2014-01-01

    The attempts to train nurses to effectively use information systems have had mixed results. One problem is that training materials are not adequately designed to guide trainees to gradually learn to use a system without experiencing a heavy cognitive load. This is because training design often does not take into consideration a learner's cognitive ability to absorb new information in a short training period. Given the high cost and difficulty of organising training in healthcare organisations, there is an urgent need for information system trainers to be aware of how cognitive overload or information overload affect a trainee's capability to acquire new knowledge and skills, and what instructional techniques can be used to facilitate effective learning. This paper introduces the concept of cognitive load and how it affects nurses when learning to use a new health information system. This is followed by the relevant strategies for instructional design, underpinned by the principles of cognitive load theory, which may be helpful for the development of effective instructional materials and activities for training nurses to use information systems. PMID:25087524

  3. Nonlinear gyrokinetic theory based on a new method and computation of the guiding-center orbit in tokamaks

    SciTech Connect

    Xu, Yingfeng; Dai, Zongliang; Wang, Shaojie

    2014-04-15

    The nonlinear gyrokinetic theory in the tokamak configuration based on a two-step transform is developed; in the first step, we transform the magnetic potential perturbation to the Hamiltonian part, and in the second step, we transform away the gyroangle-dependent part of the perturbed Hamiltonian. Then the I-transform method is used to decouple the perturbation part of the motion from the unperturbed motion. The application of the I-transform method to the computation of the guiding-center orbit and the guiding-center distribution function in tokamaks is presented. It is demonstrated that the I-transform method of orbit computation, which involves integrating only along the unperturbed orbit, agrees with the conventional method, which integrates along the full orbit. A numerical code based on the I-transform method is developed, and two numerical examples are given to verify the new method.

  4. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety.

    PubMed

    Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith

    2013-01-01

    Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the promoting action on research implementation of health services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The usefulness of the evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts. PMID:23498710

  5. Fuzzy theory based control method for an in-pipe robot to move in variable resistance environment

    NASA Astrophysics Data System (ADS)

    Li, Te; Ma, Shugen; Li, Bin; Wang, Minghui; Wang, Yuechao

    2015-11-01

    Most existing screw drive in-pipe robots cannot actively adjust their maximum traction capacity, which limits their adaptability to a wide range of variable environment resistance, especially in curved pipes. In order to solve this problem, a screw drive in-pipe robot based on an adaptive linkage mechanism is proposed. The differential property of the adaptive linkage mechanism allows the robot to move without motion interference in straight and varied curved pipes by self-adaptively adjusting the inclining angles of the rollers. The maximum traction capacity of the robot can be changed by actively adjusting the inclining angles of the rollers. In order to improve adaptability to variable resistance, a torque control method based on a fuzzy controller is proposed. For variable environment resistance, the proposed control method not only ensures sufficient traction force but also limits the output torque to a feasible region. In the simulations, the robot with the proposed control method is compared to the robot with fixed roller inclining angles. The results show that the combination of the torque control method and the proposed robot achieves better adaptability to variable resistance in straight and curved pipes.
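
    A minimal fuzzy-logic sketch of the idea, raise the commanded torque when the measured resistance is high and lower it when the resistance is low while clipping the output to a feasible region, is given below. The membership functions, rule table, and torque limits are invented for illustration and are not the controller of the paper.

```python
# Mamdani-style fuzzy mapping from normalized resistance to a torque command.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_torque(resistance, torque_min=0.5, torque_max=2.0):
    # membership of the normalized resistance in three fuzzy sets
    low = tri(resistance, -0.5, 0.0, 0.5)
    med = tri(resistance, 0.0, 0.5, 1.0)
    high = tri(resistance, 0.5, 1.0, 1.5)
    # rule consequents: low resistance -> small torque, medium -> mid, high -> large
    consequents = np.array([torque_min, 0.5 * (torque_min + torque_max), torque_max])
    weights = np.array([low, med, high])
    torque = np.dot(weights, consequents) / (weights.sum() + 1e-12)   # weighted-average defuzzification
    return np.clip(torque, torque_min, torque_max)                    # keep output in the feasible region

for r in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(r, round(float(fuzzy_torque(r)), 3))
```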

  6. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Zahariev, Federico; Schmidt, Michael W; Kitaura, Kazuo; Gordon, Mark S; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented. PMID:25833559

  7. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Gordon, Mark S.; Kitaura, Kazuo; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  8. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Kitaura, Kazuo; Gordon, Mark S.; Nakamura, Shinichiro

    2015-03-01

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  9. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  10. NbIT--a new information theory-based analysis of allosteric mechanisms reveals residues that underlie function in the leucine transporter LeuT.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2014-05-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005
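
    A drastically simplified relative of the quantities NbIT builds on is the mutual information between per-residue features extracted from an MD trajectory, sketched below with one dihedral angle per residue per frame. NbIT itself works with configurational entropies of multi-residue sets and co-information measures, which this toy example does not reproduce; the trajectory data are synthetic.

```python
# Mutual information between per-residue dihedral fluctuations, histogram estimator.
import numpy as np

def mutual_information(a, b, bins=24):
    pab, _, _ = np.histogram2d(a, b, bins=bins)
    pab = pab / pab.sum()
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)
    nz = pab > 0
    return np.sum(pab[nz] * np.log2(pab[nz] / np.outer(pa, pb)[nz]))

rng = np.random.default_rng(6)
frames = 20000
chi_gate = rng.uniform(-np.pi, np.pi, frames)                  # "intracellular gate" residue
chi_site = chi_gate + 0.4 * rng.normal(size=frames)            # residue coupled to the gate
chi_far = rng.uniform(-np.pi, np.pi, frames)                   # uncoupled residue
print(mutual_information(chi_gate, chi_site))   # noticeably greater than zero
print(mutual_information(chi_gate, chi_far))    # near zero (finite-sample bias aside)
```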

  11. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.

  12. Methods of Organizational Information Security

    NASA Astrophysics Data System (ADS)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the organizational level. Some of the principal problems are identified, and a first group of relevant dimensions is presented for the efficient management of information security. The study is based on a literature review that draws on some of the most relevant peer-reviewed articles on this theme, on international reports, and on the principal standards for the management of information security. From these readings, we identified methods oriented toward risk management, certification standards, and good practices of information security. Some of the standards are oriented toward the certification of a product or system, and others toward the business processes. There are also studies proposing frameworks that suggest the integration of different approaches, founded on standards focused on technologies and on processes, and taking into consideration the organizational and human environment of the organizations. In our perspective, the biggest contribution to information security is the development of an information security method for an organization operating in a conflicting environment. Such a method should provide for the security of information against the possible dimensions of attack that threats could exploit through the vulnerabilities of the organizational assets. It should also support the concepts of "Network centric warfare", "Information superiority" and "Information warfare" developed especially in this last decade, where information is seen simultaneously as a weapon and as a target.

  13. Derivation of a measure of systolic blood pressure mutability: a novel information theory-based metric from ambulatory blood pressure tests.

    PubMed

    Contreras, Danitza J; Vogel, Eugenio E; Saravia, Gonzalo; Stockins, Benjamin

    2016-03-01

    We provide ambulatory blood pressure (BP) exams with tools based on information theory to quantify fluctuations thus increasing the capture of dynamic test components. Data from 515 ambulatory 24-hour BP exams were considered. Average age was 54 years, 54% were women, and 53% were under BP treatment. The average systolic pressure (SP) was 127 ± 8 mm Hg. A data compressor (wlzip) designed to recognize meaningful information is invoked to measure mutability which is a form of dynamical variability. For patients with the same average SP, different mutability values are obtained which reflects the differences in dynamical variability. In unadjusted linear regression models, mutability had low association with the mean systolic BP (R(2) = 0.056; P < .000001) but larger association with the SP deviation (R(2) = 0.761; P < .001). Wlzip allows detecting levels of variability in SP that could be hazardous. This new indicator can be easily added to the 24-hour BP monitors improving information toward diagnosis. PMID:26965751
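
    The compression idea behind mutability can be imitated with a standard compressor, as in the sketch below, where zlib stands in for the authors' wlzip and the reading sequences are synthetic: the more erratic the sequence of SP readings, the worse it compresses.

```python
# Compression-based proxy for dynamical variability of SP readings.
import zlib
import numpy as np

def mutability_proxy(readings):
    """Compressed size relative to original size for integer mmHg readings."""
    raw = ",".join(str(int(r)) for r in readings).encode()
    return len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(7)
steady = 127 + np.round(2 * np.sin(np.linspace(0, 6 * np.pi, 96)))   # smooth 24-h profile
labile = 127 + rng.choice([-15, -5, 0, 5, 15], size=96)              # same mean, erratic swings
print(round(mutability_proxy(steady), 3), round(mutability_proxy(labile), 3))
# the erratic series compresses worse, i.e. shows the higher "mutability" proxy
```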

  14. Information storage media and method

    DOEpatents

    Miller, Steven D.; Endres, George W.

    1999-01-01

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  15. Information storage media and method

    SciTech Connect

    Miller, S.D.; Endres, G.W.

    1999-09-28

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  16. Levels of Reconstruction as Complementarity in Mixed Methods Research: A Social Theory-Based Conceptual Framework for Integrating Qualitative and Quantitative Research

    PubMed Central

    Carroll, Linda J.; Rothe, J. Peter

    2010-01-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies, and using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson’s metaphysical work on the ‘ways of knowing’. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions. PMID:20948937

  17. High-resolution wave-theory-based ultrasound reflection imaging using the split-step Fourier and globally optimized Fourier finite-difference methods

    SciTech Connect

    Huang, Lianjie

    2013-10-29

    Methods for enhancing ultrasonic reflection imaging are taught utilizing a split-step Fourier propagator in which the reconstruction is based on recursive inward continuation of ultrasonic wavefields in the frequency-space and frequency-wave number domains. The inward continuation within each extrapolation interval consists of two steps. In the first step, a phase-shift term is applied to the data in the frequency-wave number domain for propagation in a reference medium. The second step consists of applying another phase-shift term to data in the frequency-space domain to approximately compensate for ultrasonic scattering effects of heterogeneities within the tissue being imaged (e.g., breast tissue). Results from various data inputs to the method indicate significant improvements in both image quality and resolution.
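
    One inward-continuation step of such a split-step Fourier propagator is sketched below: a phase shift in the frequency-wavenumber domain for a homogeneous reference speed, followed by a phase screen in the frequency-space domain for the lateral slowness perturbation. Grid spacing, frequency, and sound speeds are arbitrary illustration values, not parameters from the patent.

```python
# One split-step Fourier extrapolation step for a single angular frequency.
import numpy as np

def split_step_extrapolate(u, omega, dx, dz, c_ref, c_x):
    """u: complex wavefield u(x) at one angular frequency at the current depth."""
    kx = 2 * np.pi * np.fft.fftfreq(u.size, d=dx)
    kz = np.sqrt(((omega / c_ref) ** 2 - kx ** 2).astype(complex))   # imaginary kz damps evanescent waves
    # step 1: phase shift in the wavenumber domain (homogeneous reference medium)
    u1 = np.fft.ifft(np.fft.fft(u) * np.exp(1j * kz * dz))
    # step 2: phase screen in the space domain for the slowness perturbation
    return u1 * np.exp(1j * omega * (1.0 / c_x - 1.0 / c_ref) * dz)

nx, dx, dz = 256, 0.5e-3, 0.5e-3               # 0.5 mm grid
omega = 2 * np.pi * 2.0e6                      # 2 MHz frequency component
c_ref = 1500.0                                 # m/s, water-like reference speed
c_x = np.full(nx, 1500.0)
c_x[100:140] = 1560.0                          # a faster inclusion across part of the aperture
x = (np.arange(nx) - nx / 2) * dx
u0 = np.exp(-(x / (8 * dx)) ** 2).astype(complex)    # smooth source distribution
u_next = split_step_extrapolate(u0, omega, dx, dz, c_ref, c_x)
print(np.abs(u_next).max())
```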

  18. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with the thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from the PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  19. Discovering the Optimal Route for Alane Synthesis on Ti doped Al Surfaces Using Density Functional Theory Based Kinetic Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Karim, Altaf; Muckerman, James T.

    2010-03-01

    Issues such as catalytic dissociation of hydrogen and the mobility of alane species on Ti-doped Al surfaces are major challenges in the synthesis of aluminum hydride. Our recently developed modeling framework (DFT-based KMC simulation) enabled us to study the steady-state conditions of dissociative adsorption of hydrogen, its diffusion, and its reaction with Al adatoms leading to the formation of alane species on Ti-doped Al surfaces. Our studies show that the doping of Ti atoms in the top layer of Al surfaces significantly reduces the mobility of alane species. On the other hand, the doping of Ti atoms beneath the top layer of Al surfaces enhances the mobility of alane species. The arrangement of dopant Ti atoms in different layers not only affects the diffusion barriers of alane species but it also affects hydrogen dissociation barriers when Ti-Ti pairs are arranged in different ways in the top layer. Using our theoretical methods, we identified a few configurations of dopant Ti atoms having lower barriers for alane diffusion and hydrogen dissociation. Further, we discovered the optimal values of Ti concentration, temperature, and pressure under which the rate of alane formation is maximized.
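
    The kinetic Monte Carlo layer of such a framework can be reduced to a very small sketch: rates follow an Arrhenius form with DFT-derived barriers, one event is selected in proportion to its rate, and time advances by an exponentially distributed increment. The event list and barrier values below are invented placeholders, not the alane/Ti-Al numbers from this work.

```python
# Minimal kinetic Monte Carlo loop driven by Arrhenius rates.
import numpy as np

KB = 8.617e-5          # Boltzmann constant in eV/K

def arrhenius(barrier_ev, temperature, prefactor=1e13):
    return prefactor * np.exp(-barrier_ev / (KB * temperature))

def kmc_run(barriers_ev, temperature, n_steps, seed=0):
    rng = np.random.default_rng(seed)
    rates = np.array([arrhenius(b, temperature) for b in barriers_ev])
    t, counts = 0.0, np.zeros(len(rates), dtype=int)
    for _ in range(n_steps):
        total = rates.sum()
        event = rng.choice(len(rates), p=rates / total)   # pick an event proportional to its rate
        counts[event] += 1
        t += rng.exponential(1.0 / total)                 # residence time before the next event
    return t, counts

# Hypothetical events: H2 dissociation, H diffusion, alane (AlH3) hop.
barriers = [0.55, 0.30, 0.45]
elapsed, visits = kmc_run(barriers, temperature=300.0, n_steps=10000)
print(elapsed, visits)   # the lowest-barrier event dominates the visit counts
```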

  20. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  1. Theory-Based Evaluation: Reflections Ten Years On. Theory-Based Evaluation: Past, Present, and Future

    ERIC Educational Resources Information Center

    Rogers, Patricia J.; Weiss, Carol H.

    2007-01-01

    This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly…

  2. Information technology equipment cooling method

    DOEpatents

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  3. Multiple outcome measures and mixed methods for evaluating the effectiveness of theory-based behaviour-change interventions: a case study targeting health professionals' adoption of a national suicide prevention guideline.

    PubMed

    Hanbury, A; Wallace, L M; Clark, M

    2011-05-01

    Interest in behaviour-change interventions targeting health professionals' adoption of clinical guidelines is growing. Recommendations have been made for interventions to have a theoretical base, explore the local context and to use mixed and multiple methods of evaluation to establish intervention effectiveness. This article presents a case study of a behaviour-change intervention delivered to community mental health professionals in one Primary Care Trust, aimed at raising adherence to a national suicide prevention guideline. A discussion of how the theory-base was selected, the local context explored, and how the intervention was developed and delivered is provided. Time series analysis, mediational analysis and qualitative process evaluation were used to evaluate and explore intervention effectiveness. The time series analysis revealed that the intervention was not effective at increasing adherence to the guideline. The mediational analysis indicates that the intervention failed to successfully target the key barrier to adoption of the guidance, and the qualitative process evaluation identified certain intervention components that were well received by the health professionals, and also identified weaknesses in the delivery of the intervention. It is recommended that future research should seek to further develop the evidence-base for linking specific intervention strategies to specific behavioural barriers, explore the potential of theories that take into account broader social and organisational factors that influence health professionals' practice and focus on the process of data synthesis for identifying key factors to target with tailored interventions. Multiple and mixed evaluation techniques are recommended not only to explore whether an intervention is effective or not but also why it is effective or not. PMID:21491337
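
    The time series component of such an evaluation is often an interrupted (segmented) regression with level-change and slope-change terms, sketched below on simulated monthly adherence data; the model form is a common choice and is not necessarily the exact specification used in the study.

```python
# Interrupted time series: segmented OLS with level- and slope-change terms.
import numpy as np

rng = np.random.default_rng(8)
months = np.arange(36)
intervention_month = 18
post = (months >= intervention_month).astype(float)
time_since = np.where(post == 1, months - intervention_month, 0.0)

# simulated monthly adherence (%) with no true intervention effect
adherence = 60 + 0.2 * months + rng.normal(0, 3, months.size)

# design matrix: intercept, pre-existing trend, level change, slope change
X = np.column_stack([np.ones(months.size), months, post, time_since])
beta, *_ = np.linalg.lstsq(X, adherence, rcond=None)
print(dict(zip(["intercept", "trend", "level_change", "slope_change"], np.round(beta, 3))))
```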

  4. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from content to behavior outcomes. This article describes the development process of a diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied the information-motivation-behavioral skills (IMB) model, and interventions were targeted to its 3 determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed in each session to achieve the behavior objectives. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol to develop a theory-based self-management program was beneficial in that it provided a systematic guide to developing theory-based and behavior outcome-focused health education programs. PMID:26062288

  5. Information theoretic methods for image processing algorithm optimization

    NASA Astrophysics Data System (ADS)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable with manual calibration; thus an automated approach is a must. We discuss an information theory-based metric for evaluating an algorithm's adaptive characteristics (an "adaptivity criterion"), using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps to reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).
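
    As a rough illustration (not the authors' adaptivity criterion), an information-style score for automatic filter tuning can be built from the mutual information between a clean reference image and the filter output; the sketch below, assuming NumPy and purely synthetic images, estimates it from a joint intensity histogram and could serve as the cost function in an automated calibration loop.

        import numpy as np

        def mutual_information(ref, out, bins=64):
            """Estimate mutual information (bits) between two images from their joint histogram."""
            hist, _, _ = np.histogram2d(ref.ravel(), out.ravel(), bins=bins)
            p_xy = hist / hist.sum()
            p_x = p_xy.sum(axis=1, keepdims=True)
            p_y = p_xy.sum(axis=0, keepdims=True)
            nz = p_xy > 0
            return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

        # Toy usage: a filter setting that preserves more information about the
        # reference scores higher and would be preferred by an automatic tuner.
        rng = np.random.default_rng(0)
        clean = rng.random((128, 128))
        noisy = clean + 0.3 * rng.standard_normal(clean.shape)
        filtered = 0.7 * noisy + 0.3 * clean      # stand-in for a denoising filter output
        print(mutual_information(clean, noisy), mutual_information(clean, filtered))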

  6. Theory-Based Programme Development and Evaluation in Physiotherapy

    PubMed Central

    Kay, Theresa; Klinck, Beth

    2008-01-01

    ABSTRACT Purpose: Programme evaluation has been defined as “the systematic process of collecting credible information for timely decision making about a particular program.” Where possible, findings are used to develop, revise, and improve programmes. Theory-based programme development and evaluation provides a comprehensive approach to programme evaluation. Summary of key points: In order to obtain meaningful information from evaluation activities, relevant programme components need to be understood. Theory-based programme development and evaluation starts with a comprehensive description of the programme. A useful tool to describe a programme is the Sidani and Braden Model of Program Theory, consisting of six programme components: problem definition, critical inputs, mediating factors, expected outcomes, extraneous factors, and implementation issues. Articulation of these key components may guide physiotherapy programme implementation and delivery and assist in the development of key evaluation questions and methodologies. Using this approach leads to a better understanding of client needs, programme processes, and programme outcomes and can help to identify barriers to and enablers of successful implementation. Two specific examples, representing public and private sectors, will illustrate the application of this approach to clinical practice. Conclusions: Theory-based programme development helps clinicians, administrators, and researchers develop an understanding of who benefits the most from which types of programmes and facilitates the implementation of processes to improve programmes. PMID:20145741

  7. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Elmore, Mark Thomas [Oak Ridge, TN; Reed, Joel Wesley [Knoxville, TN; Treadwell, Jim N; Samatova, Nagiza Faridovna [Oak Ridge, TN

    2008-01-01

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  8. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.
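
    The patent abstracts above describe displaying documents as nodes of a tree whose links indicate mutual similarity. A minimal, generic sketch of that idea (not the patented method; it assumes NumPy/SciPy and toy term-frequency vectors) builds such a tree by hierarchical clustering of cosine distances.

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import linkage, dendrogram

        # Toy term-frequency vectors for four documents (rows) over a shared vocabulary.
        docs = np.array([
            [3, 0, 1, 0],
            [2, 1, 1, 0],
            [0, 4, 0, 2],
            [0, 3, 1, 2],
        ], dtype=float)

        # Cosine distances between documents, then an average-linkage tree;
        # nearby leaves correspond to similar documents.
        distances = pdist(docs, metric="cosine")
        tree = linkage(distances, method="average")
        print(tree)          # each row: the two nodes merged, their distance, cluster size
        # dendrogram(tree)   # would draw the tree of nodes and links for display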

  9. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  10. Control Theory based Shape Design for the Incompressible Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Cowles, G.; Martinelli, L.

    2003-12-01

    A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.
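
    The cost argument in the abstract (a full gradient for the price of one extra adjoint solve, independent of the number of design variables) can be illustrated on a toy linear "state equation" instead of the Navier-Stokes equations. The sketch below, assuming NumPy and an entirely hypothetical quadratic cost, checks the adjoint gradient against finite differences, which would need one additional solve per design variable.

        import numpy as np

        rng = np.random.default_rng(1)
        n_state, n_design = 50, 20
        A = np.eye(n_state) + 0.1 * rng.standard_normal((n_state, n_state))
        B = rng.standard_normal((n_state, n_design))
        u_target = rng.standard_normal(n_state)

        def solve_state(x):
            return np.linalg.solve(A, B @ x)             # toy state equation A u = B x

        def cost(x):
            u = solve_state(x)
            return 0.5 * np.sum((u - u_target) ** 2)     # match a target field

        def adjoint_gradient(x):
            u = solve_state(x)                           # one state solve
            lam = np.linalg.solve(A.T, -(u - u_target))  # one adjoint solve
            return -B.T @ lam                            # full gradient, any number of design variables

        x0 = np.zeros(n_design)
        g_adjoint = adjoint_gradient(x0)

        # Finite differences require n_design extra solves; agreement validates the adjoint.
        eps = 1e-6
        g_fd = np.array([(cost(x0 + eps * e) - cost(x0)) / eps for e in np.eye(n_design)])
        print(np.max(np.abs(g_adjoint - g_fd)))          # near zero, up to finite-difference error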

  11. Research Investigation of Information Access Methods

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  12. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  13. Theory-based telehealth and patient empowerment.

    PubMed

    Suter, Paula; Suter, W Newton; Johnston, Donna

    2011-04-01

    Health care technology holds great potential to improve the quality of health care delivery. One effective technology is remote patient monitoring, whereby patient data, such as vital signs or symptom reports, are captured from home monitoring devices and transmitted to health care professionals for review. The use of remote patient monitoring, often referred to as telehealth, has been widely adopted by health care providers, particularly home care agencies. Most agencies have invested in telehealth to facilitate the early identification of disease exacerbation, particularly for patients with chronic diseases such as heart failure and diabetes. This technology has been successfully harnessed by agencies to reduce rehospitalization rates through remote data interpretation and the provision of timely interventions. We propose that the use of telehealth by home care agencies and other health care providers be expanded to empower patients and promote disease self-management with resultant improved health care outcomes. This article describes how remote monitoring, in combination with the application of salient adult learning and cognitive behavioral theories and applied to telehealth care delivery and practice, can promote improved patient self-efficacy with disease management. We present theories applicable for improving health-related behaviors and illustrate how theory-based practices can be implemented in the field of home care. Home care teams that deliver theory-based telehealth function as valuable partners to physicians and hospitals in an integrated health care delivery system. PMID:21241182

  14. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Ingber, Donald E. (Inventor); Huang, Sui (Inventor); Eichler, Gabriel (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.

  15. Information Work Analysis: An Approach to Research on Information Interactions and Information Behaviour in Context

    ERIC Educational Resources Information Center

    Huvila, Isto

    2008-01-01

    Introduction: A work roles and role theory-based approach to conceptualise human information activity, denoted information work analysis is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…

  16. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  17. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

    2014-09-01

    Due to the sparse multipath property of the channel, a channel estimation method based on a partially superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partially superimposed training sequence not only improves spectral efficiency but also reduces the influence on information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory-based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
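
    The abstract names a sparse signal reconstruction step without specifying the algorithm; the sketch below uses orthogonal matching pursuit as one common choice (an assumption, not the paper's implementation) to recover a sparse multipath channel from a small number of training observations, with NumPy and illustrative dimensions.

        import numpy as np

        def omp(Phi, y, n_nonzero):
            """Orthogonal matching pursuit: recover a sparse vector h with y ~ Phi @ h."""
            residual, support = y.copy(), []
            h = np.zeros(Phi.shape[1], dtype=complex)
            for _ in range(n_nonzero):
                k = int(np.argmax(np.abs(Phi.conj().T @ residual)))   # most correlated column
                support.append(k)
                coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
                residual = y - Phi[:, support] @ coeffs
            h[support] = coeffs
            return h

        rng = np.random.default_rng(2)
        n_obs, n_taps, sparsity = 32, 128, 4
        h_true = np.zeros(n_taps, dtype=complex)
        idx = rng.choice(n_taps, sparsity, replace=False)
        h_true[idx] = rng.standard_normal(sparsity) + 1j * rng.standard_normal(sparsity)

        Phi = (rng.standard_normal((n_obs, n_taps))
               + 1j * rng.standard_normal((n_obs, n_taps))) / np.sqrt(2)   # observation matrix
        y = Phi @ h_true + 0.01 * (rng.standard_normal(n_obs) + 1j * rng.standard_normal(n_obs))

        h_est = omp(Phi, y, sparsity)
        print(np.linalg.norm(h_est - h_true) / np.linalg.norm(h_true))     # typically a small relative error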

  18. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  19. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems

    SciTech Connect

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-14

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.
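
    For readers who want to reproduce the kind of spin-conserving linear-response TD-DFT calculation being assessed, a minimal sketch is shown below. It assumes the PySCF package and an approximate butadiene geometry as a small stand-in for the extended π-conjugated systems studied; it does not cover the spin-flip or ensemble-DFT ΔSCF variants discussed in the abstract.

        # Minimal sketch, assuming PySCF; geometry and basis are illustrative only.
        from pyscf import gto, dft, tddft

        mol = gto.M(
            atom="""C  0.000  0.000 0.000
                    C  1.340  0.000 0.000
                    C  2.030  1.200 0.000
                    C  3.370  1.200 0.000
                    H -0.560 -0.930 0.000
                    H -0.560  0.930 0.000
                    H  1.900 -0.930 0.000
                    H  1.470  2.130 0.000
                    H  3.930  0.270 0.000
                    H  3.930  2.130 0.000""",
            basis="6-31g",
        )

        mf = dft.RKS(mol)
        mf.xc = "b3lyp"
        mf.kernel()                   # ground-state Kohn-Sham calculation

        td = tddft.TDDFT(mf)          # spin-conserving linear-response TD-DFT
        td.nstates = 3
        td.kernel()
        print(td.e * 27.2114)         # lowest excitation energies in eV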

  20. Advanced Feedback Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1985-01-01

    In this study, automatic feedback techniques are applied to Boolean query statements in online information retrieval to generate improved query statements based on information contained in previously retrieved documents. Feedback operations are carried out using conventional Boolean logic and extended logic. Experimental output is included to…

  1. Theory-based categorization under speeded conditions

    PubMed Central

    Luhmann, Christian C.; Ahn, Woo-Kyoung; Palmeri, Thomas J.

    2009-01-01

    It is widely accepted that similarity influences rapid categorization, whereas theories can influence only more leisurely category judgments. In contrast, we argue that it is not the type of knowledge used that determines categorization speed, but rather the complexity of the categorization processes. In two experiments, participants learned four categories of items, each consisting of three causally related features. Participants gave more weight to cause features than to effect features, even under speeded response conditions. Furthermore, the time required to make judgments was equivalent, regardless of whether participants were using causal knowledge or base-rate information. We argue that both causal knowledge and base-rate information, once precompiled during learning, can be used at roughly the same speeds during categorization, thus demonstrating an important parallel between these two types of knowledge. PMID:17128608

  2. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms. PMID:24283669

  3. Fast multiple alignment of ungapped DNA sequences using information theory and a relaxation method.

    PubMed

    Schneider, Thomas D; Mastronarde, David N

    1996-12-01

    An information theory-based multiple alignment ("Malign") method was used to align the DNA binding sequences of the OxyR and Fis proteins, whose sequence conservation is so spread out that it is difficult to identify the sites. In the algorithm described here, the information content of the sequences is used as a unique global criterion for the quality of the alignment. The algorithm uses look-up tables to avoid recalculating computationally expensive functions such as the logarithm. Because there are no arbitrary constants and because the results are reported in absolute units (bits), the best alignment can be chosen without ambiguity. Starting from randomly selected alignments, a hill-climbing algorithm can track through the immense space of n^s combinations, where s is the number of sequences and n is the number of positions possible for each sequence. Instead of producing a single alignment, the algorithm is fast enough that one can afford to use many start points and to classify the solutions. Good convergence is indicated by the presence of a single well-populated solution class having higher information content than other classes. The existence of several distinct classes for the Fis protein indicates that those binding sites have self-similar features. PMID:19953199
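
    A toy re-implementation of the core idea (total information content in bits as the single global alignment criterion, optimized by hill-climbing over per-sequence offsets) is sketched below. It assumes NumPy, uses the common simplification of 2 - H bits per aligned position without small-sample corrections, and is not the published Malign code.

        import numpy as np

        BASES = "ACGT"

        def info_content(seqs, offsets, width):
            """Total information (bits) of the aligned window: sum over positions of 2 - H."""
            total = 0.0
            for j in range(width):
                column = [s[o + j] for s, o in zip(seqs, offsets)]
                freqs = np.array([column.count(b) for b in BASES], float) / len(column)
                h = -np.sum(freqs[freqs > 0] * np.log2(freqs[freqs > 0]))
                total += 2.0 - h
            return total

        def hill_climb(seqs, width, rng):
            """Start from random offsets, then shift one sequence at a time while information increases."""
            offsets = [int(rng.integers(0, len(s) - width + 1)) for s in seqs]
            best = info_content(seqs, offsets, width)
            improved = True
            while improved:
                improved = False
                for i, s in enumerate(seqs):
                    for o in range(len(s) - width + 1):
                        trial = offsets.copy()
                        trial[i] = o
                        score = info_content(seqs, trial, width)
                        if score > best:
                            best, offsets, improved = score, trial, True
            return offsets, best

        rng = np.random.default_rng(3)
        seqs = ["GGTACGTAGG", "CCTACGTACC", "ATTACGTAAT", "GTTACGTAGT"]   # share a TACGTA core
        print(hill_climb(seqs, width=6, rng=rng))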

  4. Governance Methods Used in Externalizing Information Technology

    ERIC Educational Resources Information Center

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  5. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website

    PubMed Central

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O’Riordan, Tim; White, Peter; Yardley, Lucy

    2016-01-01

    Background According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. Objective We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Methods Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative ‘think aloud’ study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. Results The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients’ stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants’ experiences of using the website. Conclusions We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials

  6. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Methods of providing information. 1640.6 Section 1640.6 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD PERIODIC PARTICIPANT STATEMENTS § 1640.6 Methods of providing information. The TSP will furnish the information described in...

  7. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 3 2012-01-01 2012-01-01 false Methods of providing information. 1640.6 Section 1640.6 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD PERIODIC PARTICIPANT STATEMENTS § 1640.6 Methods of providing information. The TSP will furnish the information described in...

  8. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  9. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multi dimensional scale to measure twelve step program affiliation. The process included interviewing fourteen addicted persons while in twelve step focused treatment about specific pros (things they like or would miss out on by not being…

  10. Discourse and Practice in Information Literacy and Information Seeking: Gaps and Opportunities

    ERIC Educational Resources Information Center

    Julien, H.; Williamson, K.

    2010-01-01

    Introduction: This paper argues for increased research consideration of the conceptual overlap between information seeking and information literacy, and for scholarly attention to theory-based empirical research that has potential value to practitioners. Method: The paper reviews information seeking and information literacy research, and…

  11. Application of geo-information science methods in ecotourism exploitation

    NASA Astrophysics Data System (ADS)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    The application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resource reconnaissance, data management, environmental monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data, and their application is helpful to sustainable development in tourism. Various tasks are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, handling of mass data, tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  12. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    ERIC Educational Resources Information Center

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  13. Axiomatic Evaluation Method and Content Structure for Information Appliances

    ERIC Educational Resources Information Center

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  14. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  15. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Methods of disseminating information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 2905.101 Methods of...

  16. Improving breast cancer control among Latinas: evaluation of a theory-based educational program.

    PubMed

    Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A

    1998-10-01

    The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of square [SS] = 17.0, F = 6.58, p < .01), have less medically recognized knowledge (SS = 128.8, F = 39.24, p < .001), greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384

  17. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.
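
    The queuing-theory ingredient of the model can be illustrated with the textbook M/M/1 relations for a waste-receiving facility; the numbers below are purely hypothetical, and the full interval-fuzzy robust two-stage formulation is beyond a short sketch.

        def mm1_metrics(arrival_rate, service_rate):
            """Textbook M/M/1 steady-state results (requires arrival_rate < service_rate)."""
            if arrival_rate >= service_rate:
                raise ValueError("unstable queue: arrival rate must be below service rate")
            rho = arrival_rate / service_rate           # utilisation
            L = rho / (1.0 - rho)                       # mean number of trucks in the system
            Wq = rho / (service_rate - arrival_rate)    # mean waiting time in the queue
            return rho, L, Wq

        # Hypothetical waste trucks arriving at 8 per hour, unloaded at 10 per hour;
        # the resulting waiting time would feed the waiting-cost term of the planning model.
        print(mm1_metrics(8.0, 10.0))   # (0.8, 4.0, 0.4 hours)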

  18. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  19. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  20. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  1. 48 CFR 5.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Methods of disseminating... ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 5.101 Methods of disseminating... various methods of satisfying the requirements of 5.207(c). For example, the contracting officer may...

  2. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronic products takes information processing as the center of knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and the features of information management, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations of product function elements, product structure elements, and the mapping relationships between function and structure are proposed. The information processing of a parallel friction roller is given as an example, demonstrating that this method is helpful for knowledge-based design systems and product innovation.

  3. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge of modern mechatronic products takes information processing as the center of knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and the features of information management, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations of product function elements, product structure elements, and the mapping relationships between function and structure are proposed. The information processing of a parallel friction roller is given as an example, demonstrating that this method is helpful for knowledge-based design systems and product innovation.
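
    A minimal sketch of such an XML representation is shown below; the element and attribute names are illustrative assumptions, not the schema used in the papers, and it relies only on Python's standard xml.etree module.

        import xml.etree.ElementTree as ET

        # Toy product-information model: function elements, structure elements,
        # and the mapping between them (names are illustrative only).
        product = ET.Element("product", name="parallel_friction_roller")

        functions = ET.SubElement(product, "functions")
        ET.SubElement(functions, "function", id="F1", description="transmit torque")
        ET.SubElement(functions, "function", id="F2", description="reduce slip")

        structures = ET.SubElement(product, "structures")
        ET.SubElement(structures, "structure", id="S1", description="drive roller")
        ET.SubElement(structures, "structure", id="S2", description="friction lining")

        mappings = ET.SubElement(product, "mappings")
        ET.SubElement(mappings, "map", function="F1", structure="S1")
        ET.SubElement(mappings, "map", function="F2", structure="S2")

        print(ET.tostring(product, encoding="unicode"))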

  4. Self-informant Agreement for Personality and Evaluative Person Descriptors: Comparing Methods for Creating Informant Measures

    PubMed Central

    Simms, Leonard J.; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F.

    2011-01-01

    Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets’ self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits. PMID:21541262

  5. The use of density functional theory-based reactivity descriptors in molecular similarity calculations

    NASA Astrophysics Data System (ADS)

    Boon, Greet; De Proft, Frank; Langenaeker, Wilfried; Geerlings, Paul

    1998-10-01

    Molecular similarity is studied via density functional theory-based similarity indices using a numerical integration method. Complementary to the existing similarity indices, we introduce a reactivity-related similarity index based on the local softness. After a study of some test systems, a series of peptide isosteres is studied in view of their importance in pharmacology. The whole of the present work illustrates the importance of the study of molecular similarity based on both shape and reactivity.
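
    A shape-type similarity index of the kind referred to above (the Carbó index) can be sketched with a simple numerical integration; the one-dimensional Gaussians below are stand-ins for electron densities or local softness values, whereas the actual study integrates three-dimensional molecular quantities.

        import numpy as np

        def carbo_index(f_a, f_b, grid):
            """Carbó-style similarity: overlap integral normalised by the self-overlaps."""
            overlap = np.trapz(f_a * f_b, grid)
            return overlap / np.sqrt(np.trapz(f_a * f_a, grid) * np.trapz(f_b * f_b, grid))

        x = np.linspace(-10.0, 10.0, 2001)
        gauss = lambda mu, sigma: np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

        rho_a = gauss(0.0, 1.0)                # stand-in for the density (or local softness) of molecule A
        rho_b = gauss(0.5, 1.2)                # stand-in for molecule B
        print(carbo_index(rho_a, rho_b, x))    # approaches 1 for very similar "shapes"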

  6. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection

    PubMed Central

    Aas, I. H. Monrad

    2014-01-01

    Introduction: Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. Methods: A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Results: Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview – unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants – as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Conclusions: Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these

  7. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Methods of disseminating information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION... dissemination of information concerning procurement actions. The Division of Acquisition Management...

  8. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  9. Information theory in living systems, methods, applications, and challenges.

    PubMed

    Gatenby, Robert A; Frieden, B Roy

    2007-02-01

    Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium. This is despite constant buffeting by thermodynamic forces that, if unopposed, will inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information that permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow and utilization is critical for understanding first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatic and dynamical systems approaches to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher, Shannon, and Kullback-Leibler information measures are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analysis of the fitness contributions from specific segments of genetic information, as well as of the optimization process in which fitness is constrained by the substrate cost of its storage and utilization. Recent IT applications have recognized the possible role of nontraditional information storage structures, including lipids and ion gradients, as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease. PMID:17083004
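
    The Shannon measure mentioned above for strings of monomers is easy to reproduce; the sketch below (plain Python, toy sequence) computes the entropy in bits per monomer, which is bounded by 2 bits for the four-letter DNA alphabet.

        from collections import Counter
        from math import log2

        def shannon_entropy(sequence):
            """Shannon entropy (bits per monomer) of a string of monomers."""
            counts = Counter(sequence)
            n = len(sequence)
            return -sum((c / n) * log2(c / n) for c in counts.values())

        gene = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"   # toy DNA string
        print(shannon_entropy(gene))                       # <= 2 bits per base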

  10. An Extraction Method of an Informative DOM Node from a Web Page by Using Layout Information

    NASA Astrophysics Data System (ADS)

    Tsuruta, Masanobu; Masuyama, Shigeru

    We propose a method for extracting the informative DOM node from a Web page as preprocessing for Web content mining. Our proposed method, LM, uses layout data of DOM nodes generated by a generic Web browser, together with a learning set consisting of hundreds of Web pages with annotations of their informative DOM nodes. Our method does not require large-scale crawling of the whole Web site to which the target Web page belongs. We design LM so that it uses the information in the learning set more efficiently than the existing method that uses the same learning set. In experiments, we evaluate methods obtained by combining an informative-DOM-node extraction method (the proposed method or an existing one) with existing noise elimination methods: Heur removes advertisements and link-lists by heuristics, and CE removes DOM nodes that also appear in other Web pages of the Web site to which the target page belongs. Experimental results show that 1) LM outperforms the other methods for extracting the informative DOM node, and 2) the combination method (LM, {CE(10), Heur}) based on LM (precision: 0.755, recall: 0.826, F-measure: 0.746) outperforms the other combination methods.

  11. A Method of Integrated Description of Design Information for Reusability

    NASA Astrophysics Data System (ADS)

    Tsumaya, Akira; Nagae, Masao; Wakamatsu, Hidefumi; Shirase, Keiichi; Arai, Eiji

    Much of product design is executed concurrently these days. Such concurrent design needs a method that can share and reuse various kinds of design information among designers. However, complete understanding of design information among designers has been a difficult issue. In this paper, a design process model that makes use of designers' intentions is proposed, together with a method to combine design process information and design object information. We introduce how to describe designers' intentions by providing several databases. The Keyword Database consists of ontological data related to design objects and activities. Designers select suitable keyword(s) from the Keyword Database and explain the reasons and ideas behind their design activities in descriptions that use those keyword(s). We also developed an integrated design information management system architecture using this method of integrated description with designers' intentions. The system realizes connections between information related to the design process and information related to the design object through designers' intentions. Through it, designers can communicate with each other to understand how others make decisions in design. Designers can also reuse both design process information and design object information through the database management sub-system.

  12. A Method to Separate Stochastic and Deterministic Information from Electrocardiograms

    NASA Astrophysics Data System (ADS)

    Gutiérrez, R. M.; Sandoval, L. A.

    2005-01-01

    In this work we present a new method to separate the stochastic and deterministic information contained in an electrocardiogram (ECG), which may provide new sources of information for diagnostic purposes. We assume that the ECG contains information corresponding to many different processes related to cardiac activity, as well as contamination from different sources related to the measurement procedure and the nature of the observed system itself. The method starts with the application of an improved archetypal analysis to separate the stochastic and deterministic information. From the stochastic point of view we analyze Renyi entropies, and from the deterministic perspective we calculate the autocorrelation function and the corresponding correlation time. We show that healthy and pathologic information may be stochastic and/or deterministic, can be identified by different measures, and may be located in different parts of the ECG.
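
    The deterministic-side quantities named above (autocorrelation function and correlation time) can be sketched for any sampled signal; the example below uses NumPy and a synthetic ECG-like trace rather than real recordings, and defines the correlation time as the lag at which the autocorrelation first falls below 1/e.

        import numpy as np

        def autocorrelation(x):
            """Normalised autocorrelation function of a (detrended) signal."""
            x = x - x.mean()
            acf = np.correlate(x, x, mode="full")[len(x) - 1:]
            return acf / acf[0]

        def correlation_time(acf, dt):
            """Lag at which the autocorrelation first drops below 1/e."""
            below = np.where(acf < 1.0 / np.e)[0]
            return below[0] * dt if below.size else np.inf

        dt = 0.004                                        # 250 Hz sampling
        t = np.arange(0.0, 10.0, dt)
        rng = np.random.default_rng(4)
        signal = np.sin(2 * np.pi * 1.2 * t) + 0.5 * rng.standard_normal(t.size)   # synthetic trace

        acf = autocorrelation(signal)
        print(correlation_time(acf, dt))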

  13. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  14. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  15. Adaptive windowed range-constrained Otsu method using local information

    NASA Astrophysics Data System (ADS)

    Zheng, Jia; Zhang, Dinghua; Huang, Kuidong; Sun, Yuanxi; Tang, Shaojie

    2016-01-01

    An adaptive windowed range-constrained Otsu method using local information is proposed for improving the performance of image segmentation. First, the reason why traditional thresholding methods do not perform well in the segmentation of complicated images is analyzed, and the influences of global and local thresholding on image segmentation are compared. Second, two methods that adaptively change the size of the local window according to local information are proposed, and their characteristics are analyzed; the number of edge pixels in the local window of the binarized variance image is used to adaptively change the local window size. Finally, the superiority of the proposed method over other methods such as the range-constrained Otsu, the active contour model, the double Otsu, Bradley's method, and distance-regularized level set evolution is demonstrated. Experiments validate that the proposed method retains more detail and achieves a much more satisfactory area overlap measure than the other conventional methods.
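
    For reference, the global Otsu baseline that the adaptive windowed range-constrained variant builds on can be sketched in a few lines (NumPy, synthetic two-mode image; not the authors' code).

        import numpy as np

        def otsu_threshold(image, bins=256):
            """Global Otsu threshold: maximise the between-class variance of the histogram."""
            hist, edges = np.histogram(image.ravel(), bins=bins)
            p = hist.astype(float) / hist.sum()
            centers = 0.5 * (edges[:-1] + edges[1:])
            w0 = np.cumsum(p)                             # class probabilities up to each candidate
            w1 = 1.0 - w0
            cum_mean = np.cumsum(p * centers)
            mu0 = cum_mean / np.where(w0 > 0, w0, 1.0)
            mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 > 0, w1, 1.0)
            between = w0 * w1 * (mu0 - mu1) ** 2          # between-class variance per candidate
            return centers[np.argmax(between)]

        rng = np.random.default_rng(5)
        dark = rng.normal(60.0, 10.0, (64, 64))
        bright = rng.normal(180.0, 10.0, (64, 64))
        image = np.concatenate([dark, bright], axis=1)
        print(otsu_threshold(image))                      # lands between the two intensity modes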

  16. A theory-based approach to thermal field-flow fractionation of polyacrylates.

    PubMed

    Runyon, J Ray; Williams, S Kim Ratanathanawongs

    2011-09-28

    A theory-based approach is presented for the development of thermal field-flow fractionation (ThFFF) of polyacrylates. The use of ThFFF for polymer analysis has been limited by an incomplete understanding of the thermal diffusion which plays an important role in retention and separation. Hence, a tedious trial-and-error approach to method development has been the normal practice when analyzing new materials. In this work, thermal diffusion theories based on temperature dependent osmotic pressure gradient and polymer-solvent interaction parameters were used to estimate thermal diffusion coefficients (D(T)) and retention times (t(r)) for different polymer-solvent pairs. These calculations identified methyl ethyl ketone as a solvent that would cause significant retention of poly(n-butyl acrylate) (PBA) and poly(methyl acrylate) (PMA). Experiments confirmed retention of these two polymers that have not been previously analyzed by ThFFF. Theoretical and experimental D(T)s and t(r)s for PBA, PMA, and polystyrene in different solvents agreed to within 20% and demonstrate the feasibility of this theory-based approach. PMID:21872869
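
    The link between an estimated thermal diffusion coefficient and a predicted retention time can be written down with the textbook FFF retention relations (standard expressions, not equations quoted from the paper, and the numbers are illustrative only): lambda = D / (D_T * delta_T) and R = 6*lambda*(coth(1/(2*lambda)) - 2*lambda), with t_r = t0 / R.

        import numpy as np

        def retention_time(D, D_T, delta_T, t0):
            """Predict a ThFFF retention time from the textbook FFF retention equation."""
            lam = D / (D_T * delta_T)                           # retention parameter
            R = 6.0 * lam * (1.0 / np.tanh(1.0 / (2.0 * lam)) - 2.0 * lam)
            return t0 / R

        # Illustrative values: D in cm^2/s, D_T in cm^2/(s K), delta_T in K, t0 in minutes.
        print(retention_time(D=3e-7, D_T=1e-7, delta_T=40.0, t0=1.0))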

  17. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them, and we apply it to quantify the similarity of different stock markets. We report the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the markets differs across time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain the similarity of 10 stock indices in three areas, showing that the method can distinguish markets from different areas in the resulting phylogenetic trees. The results show that satisfactory information can be obtained from financial markets by this method, which can be used not only for physiologic time series but also for financial time series.

  18. Formative research to develop theory-based messages for a Western Australian child drowning prevention television campaign: study protocol

    PubMed Central

    Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine

    2016-01-01

    Introduction Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621

  19. Foreign Language Methods and an Information Processing Model of Memory.

    ERIC Educational Resources Information Center

    Willebrand, Julia

    The major approaches to language teaching (audiolingual method, generative grammar, Community Language Learning and Silent Way) are investigated to discover whether or not they are compatible in structure with an information-processing model of memory (IPM). The model of memory used was described by Roberta Klatzky in "Human Memory: Structures and…

  20. Determination of nuclear level densities from experimental information

    SciTech Connect

    Cole, B.J.; Davidson, N.J.; Miller, H.G.

    1994-10-01

    A novel information theory based method for determining the density of states from prior information is presented. The energy dependence of the density of states is determined from the observed number of states per energy interval, and model calculations suggest that the method is sufficiently reliable to calculate the thermal properties of nuclei over a reasonable temperature range.

  1. Game theory based band selection for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; He, Zhenyu; Huang, Fengchen

    2015-12-01

    This paper proposes a new evaluation criterion for band selection in hyperspectral imagery. A combination of information content and class separability serves as the evaluation criterion, while the correlation between bands is used as a constraint. In addition, game theory is introduced into band selection to reconcile the potential conflict between the two criteria, information content and class separability, when searching for the optimal band combination. The experimental results show that the proposed method is effective on AVIRIS hyperspectral data.

  2. Evaluation methods for retrieving information from interferograms of biomedical objects

    NASA Astrophysics Data System (ADS)

    Podbielska, Halina; Rottenkolber, Matthias

    1996-04-01

    Interferograms in the form of fringe patterns can be produced in two-beam interferometers, holographic or speckle interferometers, in setups realizing moiré techniques, or in deflectometers. Optical metrology based on the principle of interference can be applied as a testing tool in biomedical research. By analyzing the fringe pattern images, information about the shape or mechanical behavior of the object under study can be retrieved. Here, some of the techniques for creating fringe pattern images are presented along with methods of analysis. Intensity-based analysis as well as methods of phase measurement are described. Applications of interferometric methods, especially in the fields of experimental orthopedics, endoscopy, and ophthalmology, are pointed out.

  3. Entropy theory based multi-criteria resampling of rain gauge networks for hydrological modelling - A case study of humid area in southern China

    NASA Astrophysics Data System (ADS)

    Xu, Hongliang; Xu, Chong-Yu; Sælthun, Nils Roar; Xu, Youpeng; Zhou, Bin; Chen, Hua

    2015-06-01

    Rain gauge networks provide estimates of areal average, spatial variability and point rainfall at the catchment scale and supply the most important input for hydrological models. It is therefore desirable to design optimal rain gauge networks that use a minimal number of gauges while still providing reliable areal mean values and capturing spatio-temporal variability. Based on a dense network of 185 rain gauges in the Xiangjiang River Basin, southern China, this study used an entropy theory based multi-criteria method that simultaneously maximizes the information derived from the rainfall series, minimizes the bias of the areal mean rainfall and minimizes the information overlap between gauges in order to resample rain gauge networks of different densities. The optimal networks were examined using two hydrological models: the lumped Xinanjiang model and the distributed SWAT model. The results indicate that the performance of the lumped model using different optimal networks is stable, while the performance of the distributed model keeps improving as the number of rain gauges increases. The results reveal that the entropy theory based multi-criteria strategy provides an optimal design of rain gauge networks, which is of vital importance in regional hydrological studies and water resources management.
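
    The abstract does not spell out the selection algorithm in detail; a minimal greedy sketch, assuming discretized rainfall series and a pure joint-entropy criterion (it ignores the areal-mean-bias term the authors also weigh, and the rainfall data below are purely synthetic), might look like this:

```python
import numpy as np

def discretize(series, bins=10):
    """Bin a rainfall series into integer classes for entropy estimation."""
    edges = np.histogram_bin_edges(series, bins=bins)
    return np.digitize(series, edges[1:-1])

def joint_entropy(columns):
    """Shannon entropy (bits) of the joint distribution of discretized columns."""
    stacked = np.stack(columns, axis=1)
    _, counts = np.unique(stacked, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def greedy_select(rain, n_select, bins=10):
    """Greedily add the gauge that raises joint entropy the most, i.e. the one
    whose record is least redundant with the gauges already selected."""
    disc = [discretize(rain[:, j], bins) for j in range(rain.shape[1])]
    selected = []
    for _ in range(n_select):
        best_j, best_h = None, -np.inf
        for j in range(rain.shape[1]):
            if j in selected:
                continue
            h = joint_entropy([disc[k] for k in selected] + [disc[j]])
            if h > best_h:
                best_j, best_h = j, h
        selected.append(best_j)
    return selected

# Hypothetical daily rainfall for 20 gauges over 1000 days; pick the best 5.
rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.5, scale=5.0, size=(1000, 20))
print(greedy_select(rain, n_select=5))
```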

  4. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, Management Information Systems (MIS) have been developed without formal methods. With informal methods, an MIS is developed over its lifecycle without any models, which causes problems such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can flexibly accommodate changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. An experiment showed that the method reduces development effort by more than 30%.

  5. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
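
    FOSM propagates input covariance through the model's sensitivities (Jacobian). A minimal sketch of that step, with purely illustrative numbers that are not taken from the study, might be:

```python
import numpy as np

def fosm_output_covariance(jacobian, input_cov):
    """First-Order Second Moment propagation: Cov_y ~= J @ Cov_x @ J.T."""
    return jacobian @ input_cov @ jacobian.T

# Hypothetical head sensitivities to two hydraulic-conductivity zones and an
# input covariance obtained from conditional simulation.
J = np.array([[0.8, 0.1],
              [0.2, 0.5]])
cov_x = np.array([[0.04, 0.01],
                  [0.01, 0.09]])
cov_y = fosm_output_covariance(J, cov_x)
print(np.diag(cov_y))   # piezometric-head variance at each observation point
```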

  6. An Organizational Model to Distinguish between and Integrate Research and Evaluation Activities in a Theory Based Evaluation

    ERIC Educational Resources Information Center

    Sample McMeeking, Laura B.; Basile, Carole; Cobb, R. Brian

    2012-01-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its…

  7. Hybrid methods to represent incomplete and uncertain information

    SciTech Connect

    Joslyn, C.

    1996-12-31

    Decision making is cast in the semiotic context of perception, decision, and action loops. Towards the goal of properly grounding hybrid representations of information and uncertainty from this semiotic perspective, we consider the roles of and relations among the mathematical components of General Information Theory (GIT), particularly among fuzzy sets, possibility theory, probability theory, and random sets. We do so by using a clear distinction between the syntactic, mathematical formalism and the semantic domains of application of each of these fields, placing the emphasis on available measurement and action methods appropriate for each formalism, to which and from which the decision-making process flows.

  8. Enhancing subsurface information from the fusion of multiple geophysical methods

    NASA Astrophysics Data System (ADS)

    Jafargandomi, A.; Binley, A.

    2011-12-01

    Characterization of hydrologic systems is a key element in understanding and predicting their behaviour. Geophysical methods, especially electrical methods (e.g., electrical resistivity tomography (ERT), induced polarization (IP) and electromagnetic (EM)), are becoming popular for such purposes due to their non-invasive nature, high sensitivity to hydrological parameters and the speed of measurements. However, interrogation of each geophysical method provides only limited information about some of the subsurface parameters. Therefore, in order to achieve a comprehensive picture of the hydrologic system, fusion of multiple geophysical data sets can be beneficial. Although a number of fusion approaches have been proposed in the literature, an aspect that has been generally overlooked is the assessment of information content from each measurement approach. Such an assessment provides useful insight for the design of future surveys. We develop a fusion strategy based on the capability of multiple geophysical methods to provide enough resolution to identify subsurface material parameters and structure. We apply a Bayesian framework to analyse the information in multiple geophysical data sets. In this approach, multiple geophysical data sets are fed into a Markov chain Monte Carlo (McMC) inversion algorithm and the information content of the post-inversion result (posterior probability distribution) is quantified. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical data sets. In this strategy, information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. We apply the fusion tool to one of the target sites of the EU FP7 project ModelProbe, which aims to develop technologies and tools for soil contamination assessment and site characterization. The target site is located close to Trecate (Novara - NW Italy). At this

  9. Identifying informative subsets of the Gene Ontology with information bottleneck methods

    PubMed Central

    Jin, Bo; Lu, Xinghua

    2010-01-01

    Motivation: The Gene Ontology (GO) is a controlled vocabulary designed to represent the biological concepts pertaining to gene products. This study investigates the methods for identifying informative subsets of GO terms in an automatic and objective fashion. This task in turn requires addressing the following issues: how to represent the semantic context of GO terms, what metrics are suitable for measuring the semantic differences between terms, how to identify an informative subset that retains as much as possible of the original semantic information of GO. Results: We represented the semantic context of a GO term using the word-usage-profile associated with the term, which enables one to measure the semantic differences between terms based on the differences in their semantic contexts. We further employed the information bottleneck methods to automatically identify subsets of GO terms that retain as much as possible of the semantic information in an annotation database. The automatically retrieved informative subsets align well with an expert-picked GO slim subset, cover important concepts and proteins, and enhance literature-based GO annotation. Availability: http://carcweb.musc.edu/TextminingProjects/ Contact: xinghua@pitt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20702400

  10. Hybrid methods for multisource information fusion and decision support

    NASA Astrophysics Data System (ADS)

    Braun, Jerome J.; Glina, Yan

    2006-04-01

    This paper presents the progress of an ongoing research effort in multisource information fusion for biodefense decision support. The effort concentrates on a novel machine-intelligence hybrid-of-hybrids decision support architecture termed FLASH (Fusion, Learning, Adaptive Super-Hybrid) we proposed. The highlights of FLASH discussed in the paper include its cognitive-processing orientation and the hybrid nature involving heterogeneous multiclassifier machine learning and approximate reasoning paradigms. Selected specifics of the FLASH internals, such as its feature selection techniques, supervised learning, clustering, recognition and reasoning methods, and their integration, are discussed. The results to date are presented, including the background type determination and bioattack detection computational experiments using data obtained with a multisensor fusion testbed we have also developed. The processing of imprecise information originating from sources other than sensors is considered. Finally, the paper discusses applicability of FLASH and its methods to complex battlespace management problems such as course-of-action decision support.

  11. Methods of obtaining meaningful information from disperse media holograms

    NASA Astrophysics Data System (ADS)

    Dyomin, Victor V.

    1997-05-01

    The problem of nondestructive testing of microstructure parameters, for both aerosols and water suspensions, is relevant to biology, medicine, and environmental control. Among the methods of optical investigation and diagnostics of light-scattering media, the holographic method plays a special role. A hologram of a scattering volume allows the optical wave field to be reproduced in order to obtain information on the parameters of microparticles: size, shape, and spatial position. Usually this is done by analysis of the particle images reconstructed from the hologram. On the basis of calculated and experimental results, characteristics of holographic methods are analyzed in this paper. These estimates demonstrate the possibility of using such methods for investigation of media in biomedical science and clinical practice. Many micro-organisms and other living particles are transparent or semitransparent. In this case the reconstructed image of the particle shows, in addition to its cross section, a spot formed by light focusing within the particle. This circumstance allowed us to propose a method for determining the refractive index of transparent and semitransparent microparticles, which, in turn, can provide identification of the particle type. The development of this method is presented. Measurement of the particle size distribution can be performed simultaneously with the reconstruction of the scattered optical field from the hologram; in this case a small-angle optical meter (for example, a focusing lens) can be placed just behind the illuminated hologram. The reconstructed field is composed of the initial field and its conjugate. Each of these components, as well as the interference between them, can carry additional information on the medium. The possibility of extracting this information is also discussed.

  12. Application of information theory methods to food web reconstruction

    USGS Publications Warehouse

    Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.

    2007-01-01

    In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, or whether this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other, and the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be of some use beyond model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points. This approaches the lengths of ecological time series from real systems.
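
    A histogram-based mutual information estimate between two abundance series, of the kind such analyses typically rely on, might look like the sketch below; the series are synthetic stand-ins, not the authors' model output:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information (in bits) between two series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic "resource" and "consumer" abundances, 400 points long (roughly the
# series length the abstract reports as sufficient).
rng = np.random.default_rng(0)
resource = np.sin(np.linspace(0.0, 20.0, 400)) + 0.3 * rng.standard_normal(400)
consumer = np.roll(resource, 5) + 0.3 * rng.standard_normal(400)
print(mutual_information(resource, consumer))
```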

  13. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.

  14. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms reflect a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation. So the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions to meet the accuracies within the specified function evaluation times. PMID:23483853

  15. Extending the Li&Ma method to include PSF information

    NASA Astrophysics Data System (ADS)

    Nievas-Rosillo, M.; Contreras, J. L.

    2016-02-01

    The so-called Li & Ma formula is still the most frequently used method for estimating the significance of observations carried out by Imaging Atmospheric Cherenkov Telescopes. In this work a straightforward extension of the method for point sources that profits from the good imaging capabilities of current instruments is proposed. It is based on a likelihood ratio under the assumption of a well-known PSF and a smooth background. Its performance is tested with Monte Carlo simulations based on real observations, and its sensitivity is compared to standard methods which do not incorporate PSF information. The gain in significance that can be attributed to the inclusion of the PSF is around 10% and can be boosted if a background model is assumed or a finer binning is used.
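
    For reference, the baseline that the paper extends is the classical Li & Ma (1983, Eq. 17) significance, which uses only the on/off counts and no PSF information. A minimal sketch with purely hypothetical counts might be:

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Classical Li & Ma (1983, Eq. 17) significance, without PSF information."""
    n_tot = n_on + n_off
    term_on = n_on * np.log((1.0 + alpha) / alpha * (n_on / n_tot))
    term_off = n_off * np.log((1.0 + alpha) * (n_off / n_tot))
    return np.sqrt(2.0 * (term_on + term_off))

# Hypothetical IACT counts: 130 on-source events, 1100 off-source events,
# on/off exposure ratio alpha = 0.1 (about 20 excess events).
print(li_ma_significance(130, 1100, 0.1))
```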

  16. Method to find community structures based on information centrality

    NASA Astrophysics Data System (ADS)

    Fortunato, Santo; Latora, Vito; Marchiori, Massimo

    2004-11-01

    Community structures are an important feature of many social, biological, and technological networks. Here we study a variation on the method for detecting such communities proposed by Girvan and Newman and based on the idea of using centrality measures to define the community boundaries [M. Girvan and M. E. J. Newman, Proc. Natl. Acad. Sci. U.S.A. 99, 7821 (2002)]. We develop an algorithm of hierarchical clustering that consists in finding and removing iteratively the edge with the highest information centrality. We test the algorithm on computer-generated and real-world networks whose community structure is already known or has been studied by means of other methods. We show that our algorithm, although it runs to completion in a time O(n^4), is very effective especially when the communities are very mixed and hardly detectable by the other methods.
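
    Edge information centrality is commonly defined as the relative drop in global network efficiency caused by removing that edge. A minimal sketch of one divisive step, assuming that definition and using networkx's global_efficiency helper (the karate-club graph is only an illustrative stand-in), might be:

```python
import networkx as nx

def edge_information_centrality(graph):
    """Relative drop in global efficiency caused by removing each edge."""
    base = nx.global_efficiency(graph)
    scores = {}
    for edge in graph.edges():
        trimmed = graph.copy()
        trimmed.remove_edge(*edge)
        scores[edge] = (base - nx.global_efficiency(trimmed)) / base
    return scores

# One divisive step on a standard test network (Zachary's karate club):
g = nx.karate_club_graph()
scores = edge_information_centrality(g)
print(max(scores, key=scores.get))   # the edge the algorithm would remove first
```

    The full algorithm repeats this scoring and removal until no edges remain, recording the community partition at each level of the resulting hierarchy.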

  17. Emotion identification method using RGB information of human face

    NASA Astrophysics Data System (ADS)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single-person households has increased drastically due to the aging society and the diversification of lifestyles. Therefore, the evolution of building spaces is demanded. The Biofied Building we propose can help to address this situation. It supports interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior. One important piece of information is thermal comfort, which we assume can be estimated from the human face. There is much research on face color analysis, but little of it has been conducted in real situations; in other words, the existing methods were not tested under disturbances such as room lamps. In this study, Kinect was used with face tracking, and room lamps and task lamps were used to verify that our method is applicable to real situations. Two rooms at 22 and 28 degrees C were prepared. We showed that the transition of thermal comfort caused by changing temperature can be observed from the human face. Thus, distinguishing between the 22 and 28 degrees C conditions from face color was shown to be possible.

  18. Acoustic emission source location and damage detection in a metallic structure using a graph-theory-based geodesic approach

    NASA Astrophysics Data System (ADS)

    Gangadharan, R.; Prasanna, G.; Bhat, M. R.; Murthy, C. R. L.; Gopalakrishnan, S.

    2009-11-01

    A geodesic-based approach using Lamb waves is proposed to locate the acoustic emission (AE) source and damage in an isotropic metallic structure. In the case of the AE (passive) technique, the elastic waves take the shortest path from the source to the sensor array distributed in the structure. The geodesics are computed on the meshed surface of the structure using graph theory based on Dijkstra's algorithm. By propagating the waves in reverse virtually from these sensors along the geodesic path and by locating the first intersection point of these waves, one can get the AE source location. The same approach is extended for detection of damage in a structure. The wave response matrix of the given sensor configuration for the healthy and the damaged structure is obtained experimentally. The healthy and damaged response matrices are compared, and their difference gives information about the reflection of waves from the damage. These waves are backpropagated from the sensors, and the above method is used to locate the damage by finding the point where intersection of geodesics occurs. In this work, the geodesic approach is shown to be suitable to obtain a practicable source location solution in a more general set-up on any arbitrary surface containing finite discontinuities. Experiments were conducted on aluminum specimens of simple and complex geometry to validate this new method.
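
    A simplified sketch of the geodesic idea is given below: Dijkstra shortest-path lengths on a mesh graph are combined with a search over candidate source nodes whose predicted arrival-time differences best match the observed ones. This is a related but simplified formulation, not the paper's exact backpropagation-and-intersection procedure, and the mesh graph, its "length" edge attribute, the sensor nodes and the timing values are all assumptions for illustration:

```python
import itertools
import networkx as nx

def locate_source(mesh_graph, sensor_nodes, arrival_times, wave_speed):
    """Pick the mesh node whose geodesic distances to the sensors best explain
    the observed arrival-time differences (least squares over sensor pairs)."""
    dist = {s: nx.single_source_dijkstra_path_length(mesh_graph, s, weight="length")
            for s in sensor_nodes}
    best_node, best_err = None, float("inf")
    for node in mesh_graph.nodes():
        err = 0.0
        for i, j in itertools.combinations(range(len(sensor_nodes)), 2):
            si, sj = sensor_nodes[i], sensor_nodes[j]
            predicted = (dist[si][node] - dist[sj][node]) / wave_speed
            observed = arrival_times[i] - arrival_times[j]
            err += (predicted - observed) ** 2
        if err < best_err:
            best_node, best_err = node, err
    return best_node

# Toy usage on a 20 x 20 grid "mesh" with unit edge lengths and four corner sensors:
g = nx.grid_2d_graph(20, 20)
nx.set_edge_attributes(g, 1.0, "length")
sensors = [(0, 0), (0, 19), (19, 0), (19, 19)]
true_source = (5, 7)
times = [nx.shortest_path_length(g, s, true_source, weight="length") for s in sensors]
print(locate_source(g, sensors, times, wave_speed=1.0))   # recovers (5, 7)
```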

  19. A method to stabilize linear systems using eigenvalue gradient information

    NASA Technical Reports Server (NTRS)

    Wieseman, C. D.

    1985-01-01

    Formal optimization methods and eigenvalue gradient information are used to develop a stabilizing control law for a closed-loop linear system that is initially unstable. The method was originally formulated by using direct, constrained optimization methods with the constraints being the real parts of the eigenvalues. However, because of problems in trying to achieve stabilizing control laws, the problem was reformulated to be solved differently. The method described uses the Davidon-Fletcher-Powell minimization technique to solve an indirect, constrained minimization problem in which the performance index is the Kreisselmeier-Steinhauser function of the real parts of all the eigenvalues. The method is applied successfully to solve two different problems: the determination of a fourth-order control law that stabilizes a single-input single-output active flutter suppression system, and the determination of a second-order control law for a multi-input multi-output lateral-directional flight control system. Various sets of design variables and initial starting points were chosen to show the robustness of the method.
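
    The Kreisselmeier-Steinhauser function is a smooth envelope of its arguments, so aggregating the real parts of the closed-loop eigenvalues with it yields a single differentiable performance index. A minimal sketch, with a purely illustrative closed-loop matrix and the Davidon-Fletcher-Powell minimization itself omitted, might be:

```python
import numpy as np

def ks_function(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth, differentiable upper
    bound on max(values) that tightens as rho grows."""
    values = np.asarray(values, dtype=float)
    vmax = np.max(values)
    return vmax + np.log(np.sum(np.exp(rho * (values - vmax)))) / rho

def spectral_abscissa_ks(a_closed, rho=50.0):
    """KS aggregate of the real parts of the closed-loop eigenvalues;
    driving this index below zero stabilizes the linear system."""
    eigenvalues = np.linalg.eigvals(a_closed)
    return ks_function(eigenvalues.real, rho)

# Hypothetical closed-loop matrix (e.g. A + B @ K for some trial gain K):
a_cl = np.array([[0.1, 1.0],
                 [-2.0, -0.5]])
print(spectral_abscissa_ks(a_cl))   # negative once the trial gain stabilizes the system
```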

  20. Improved prediction of tacrolimus concentrations early after kidney transplantation using theory-based pharmacokinetic modelling

    PubMed Central

    Størset, Elisabet; Holford, Nick; Hennig, Stefanie; Bergmann, Troels K; Bergan, Stein; Bremer, Sara; Åsberg, Anders; Midtvedt, Karsten; Staatz, Christine E

    2014-01-01

    Aims The aim was to develop a theory-based population pharmacokinetic model of tacrolimus in adult kidney transplant recipients and to externally evaluate this model and two previous empirical models. Methods Data were obtained from 242 patients with 3100 tacrolimus whole blood concentrations. External evaluation was performed by examining model predictive performance using Bayesian forecasting. Results Pharmacokinetic disposition parameters were estimated based on tacrolimus plasma concentrations, predicted from whole blood concentrations, haematocrit and literature values for tacrolimus binding to red blood cells. Disposition parameters were allometrically scaled to fat free mass. Tacrolimus whole blood clearance/bioavailability standardized to haematocrit of 45% and fat free mass of 60 kg was estimated to be 16.1 l h⁻¹ [95% CI 12.6, 18.0 l h⁻¹]. Tacrolimus clearance was 30% higher (95% CI 13, 46%) and bioavailability 18% lower (95% CI 2, 29%) in CYP3A5 expressers compared with non-expressers. An Emax model described decreasing tacrolimus bioavailability with increasing prednisolone dose. The theory-based model was superior to the empirical models during external evaluation displaying a median prediction error of −1.2% (95% CI −3.0, 0.1%). Based on simulation, Bayesian forecasting led to 65% (95% CI 62, 68%) of patients achieving a tacrolimus average steady-state concentration within a suggested acceptable range. Conclusion A theory-based population pharmacokinetic model was superior to two empirical models for prediction of tacrolimus concentrations and seemed suitable for Bayesian prediction of tacrolimus doses early after kidney transplantation. PMID:25279405

  1. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  2. Dissemination of a theory-based online bone health program: Two intervention approaches.

    PubMed

    Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa

    2015-06-01

    With the increasing nationwide emphasis on eHealth, there has been a rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding the methodologies to develop and implement effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during the implementation processes. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer term (12-month) Bone Power Plus program, including the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). This study used a small-group approach (32 intervention groups), and to effectively manage those groups, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study showed a promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668

  3. Dissolved oxygen prediction using a possibility theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, Usman T.; Valeo, Caterina

    2016-06-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model performance is compared with a fuzzy neural network with crisp inputs, as well as with a traditional neural network. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.

  4. Dissolved oxygen prediction using a possibility-theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, U. T.; Valeo, C.

    2015-11-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility-theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.

  5. Information bias in health research: definition, pitfalls, and adjustment methods

    PubMed Central

    Althubaiti, Alaa

    2016-01-01

    As with other fields, medical sciences are subject to different sources of bias. While understanding sources of bias is a key element for drawing valid conclusions, bias in health research continues to be a very sensitive issue that can affect the focus and outcome of investigations. Information bias, otherwise known as misclassification, is one of the most common sources of bias that affects the validity of health research. It originates from the approach that is utilized to obtain or confirm study measurements. This paper seeks to raise awareness of information bias in observational and experimental research study designs as well as to enrich discussions concerning bias problems. Specifying the types of bias can be essential to limiting its effects, and the use of adjustment methods might serve to improve clinical evaluation and health care practice. PMID:27217764

  6. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  7. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  8. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  9. a Task-Oriented Disaster Information Correlation Method

    NASA Astrophysics Data System (ADS)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, efficient data management and service have become increasingly difficult for current systems due to the variety of tasks and the heterogeneity of the data. For emergency task-oriented applications, data searches primarily rely on human experience over simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service with the objectives of 1) putting forward disaster task ontology and data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.

  10. A rooftop extraction method using color feature, height map information and road information

    NASA Astrophysics Data System (ADS)

    Xiang, Yongzhou; Sun, Ying; Li, Chao

    2012-11-01

    This paper presents a new method for rooftop extraction that integrates color features, height map, and road information in a level set based segmentation framework. The proposed method consists of two steps: rooftop detection and rooftop segmentation. The first step requires the user to provide a few example rooftops from which the color distribution of rooftop pixels is estimated. For better robustness, we obtain superpixels of the input satellite image, and then classify each superpixel as rooftop or non-rooftop based on its color features. Using the height map, we can remove those detected rooftop candidates with small height values. Level set based segmentation of each detected rooftop is then performed based on color and height information, by incorporating a shape-prior term that allows the evolving contour to take on the desired rectangle shape. This requires performing rectangle fitting to the evolving contour, which can be guided by the road information to improve the fitting accuracy. The performance of the proposed method has been evaluated on a satellite image of 1 km×1 km in area, with a resolution of one meter per pixel. The method achieves detection rate of 88.0% and false alarm rate of 9.5%. The average Dice's coefficient over 433 detected rooftops is 73.4%. These results demonstrate that by integrating the height map in rooftop detection and by incorporating road information and rectangle fitting in a level set based segmentation framework, the proposed method provides an effective and useful tool for rooftop extraction from satellite images.

  11. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2012-01-01

    On the basis of the User Datagram Protocol (UDP), a welding robot network communication protocol (WRNCP) is designed with improvements at the transport and application layers of the TCP/IP stack. According to the characteristics of video data, a broadcast push-type (Broadcast Push Model, BPM) transmission method is designed to improve the efficiency and stability of video transmission, and a network information transmission system is designed for real-time networked control of the welding robot.

  12. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2011-12-01

    On the basis of the User Datagram Protocol (UDP), a welding robot network communication protocol (WRNCP) is designed with improvements at the transport and application layers of the TCP/IP stack. According to the characteristics of video data, a broadcast push-type (Broadcast Push Model, BPM) transmission method is designed to improve the efficiency and stability of video transmission, and a network information transmission system is designed for real-time networked control of the welding robot.

  13. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  14. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  15. Urban drainage control applying rational method and geographic information technologies

    NASA Astrophysics Data System (ADS)

    Aldalur, Beatriz; Campo, Alicia; Fernández, Sandra

    2013-09-01

    The objective of this study is to develop a method for controlling urban drainage in the town of Ingeniero White, motivated by problems arising from floods, waterlogging and the combination of southeasterly winds and high tides. The Rational Method was applied to the urban watersheds using tools of Geographic Information Technology (GIT). A Geographic Information System was developed on the basis of 28 panchromatic aerial photographs from 2005, georeferenced with control points measured with Global Positioning Systems (basin: 6 km2). Flow rates of basins and sub-basins were calculated, and it was verified that the existing open channels have a low slope with permanent standing water and generate stagnation favored by the presence of trash. The use of an existing channel is proposed to evacuate the flow from the storm drain outlets. The proposed solution is complemented by the placement of three pumping stations: one on a channel to drain rainwater, which will remove the excess water from the lower area where the town of Ingeniero White is located, and two others that will drain the excess liquid from the port area.
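
    The Rational Method estimates the peak discharge as Q = C·i·A. A minimal sketch in SI-friendly units, with purely hypothetical runoff coefficient, intensity and sub-basin area (not values from the study), might be:

```python
def rational_peak_flow(c_runoff, intensity_mm_per_h, area_km2):
    """Rational Method peak discharge Q = C * i * A, returned in m^3/s.

    The factor 3.6 converts mm/h * km^2 into m^3/s.
    """
    return c_runoff * intensity_mm_per_h * area_km2 / 3.6

# Hypothetical sub-basin of the roughly 6 km^2 study catchment:
print(rational_peak_flow(c_runoff=0.7, intensity_mm_per_h=40.0, area_km2=1.5))  # ~11.7 m^3/s
```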

  16. [Spectral discrimination method information divergence combined with gradient angle].

    PubMed

    Zhang, Xiu-bao; Yuan, Yan; Jing, Juan-juan; Sun, Cheng-ming; Wang, Qian

    2011-03-01

    The present paper proposes a spectral discrimination method combining spectral information divergence with spectral gradient angle, SID x tan(SGA(pi/2)), which overcomes the shortcoming of existing methods that cannot take the whole spectral shape and local characteristics into account simultaneously. Using simulated spectra as input data, and following the interferogram acquisition principle and spectrum recovery algorithm of the temporally and spatially modulated Fourier transform imaging spectrometer (TSMFTIS), we simulated the distorted-spectrum recovery process of the TSMFTIS at different maximum mixing ratios and quantified the difference between the recovered spectra and the true spectrum with different spectral discrimination methods. The experimental results show that SID x tan(SGA(pi/2)) can not only identify the similarity of the whole spectral shapes, but also distinguish local differences in the spectral characteristics. A comparative study was conducted among the different discrimination methods; the results validate that SID x tan(SGA(pi/2)) yields a significant improvement in discriminatory ability. PMID:21595255
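
    A minimal sketch of the combined measure is given below. It assumes unit-sum normalization of the spectra for SID and first-difference vectors for SGA, and it omits the paper's "(pi/2)" scaling, whose exact form is not stated in the abstract; the reference and recovered spectra are synthetic:

```python
import numpy as np

def sid(s1, s2, eps=1e-12):
    """Spectral information divergence between two non-negative spectra."""
    p = s1 / (s1.sum() + eps)
    q = s2 / (s2.sum() + eps)
    return float(np.sum(p * np.log((p + eps) / (q + eps)))
                 + np.sum(q * np.log((q + eps) / (p + eps))))

def sga(s1, s2, eps=1e-12):
    """Spectral gradient angle: angle between the first-difference vectors."""
    g1, g2 = np.diff(s1), np.diff(s2)
    cos_ang = np.dot(g1, g2) / (np.linalg.norm(g1) * np.linalg.norm(g2) + eps)
    return float(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

def sid_tan_sga(s1, s2):
    """Combined measure SID x tan(SGA), sensitive to overall shape and local slope."""
    return sid(s1, s2) * np.tan(sga(s1, s2))

# Hypothetical reference spectrum and a slightly distorted recovered spectrum:
wavelengths = np.linspace(400.0, 900.0, 128)
reference = np.exp(-((wavelengths - 650.0) / 80.0) ** 2)
recovered = reference + 0.01 * (1.0 + np.sin(wavelengths / 15.0))
print(sid_tan_sga(reference, recovered))
```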

  17. A diffusive information preservation method for small Knudsen number flows

    NASA Astrophysics Data System (ADS)

    Fei, Fei; Fan, Jing

    2013-06-01

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint, and obtain the flow velocity and temperature through sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10⁻³-10⁻⁴ have been investigated. It is shown that the IP calculations are not only accurate, but also efficient because they make it possible to use a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  18. A diffusive information preservation method for small Knudsen number flows

    SciTech Connect

    Fei, Fei; Fan, Jing

    2013-06-15

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker–Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint, and obtain the flow velocity and temperature through sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ∼ 10⁻³–10⁻⁴ have been investigated. It is shown that the IP calculations are not only accurate, but also efficient because they make it possible to use a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  19. Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

    PubMed Central

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.

    2014-01-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159

  20. Towards a theory-based positive youth development programme.

    PubMed

    Brink, Andrea Jw; Wissing, Marié P

    2013-01-01

    The aim of this study was to develop and describe an intervention programme for young adolescents, guided by the Positive Youth Development Intervention (PYDI) model, which provides a perspective on the facilitation of development in a more positive trajectory. The key concepts and processes suggested by the PYDI model were further analysed and broadened using existing literature for operationalisation and application purposes. Self-regulation is the central process effectuating developmental change, within the contexts of: a) the navigation of stressors; and b) the formulation and effective pursuit of relevant personal goals. Self-regulation, together with a developmental perspective, provided guidelines regarding the relevant skills and knowledge. These are facilitating: a) identity development; b) formulation of goals congruent with the latter; c) decision-making skills; d) coping skills; e) regulation of affect and cognition; and f) socialisation skills. The relevant content areas and the manner of the facilitation of these are indicated. The theory-based programme can be implemented and its effect empirically evaluated. Levels of hope, problem-solving efficacy and social efficacy may serve as, inter alia, indicators of developmental change. PMID:25860303

  1. System and Method for RFID-Enabled Information Collection

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Kennedy, Timothy F. (Inventor); Ngo, Phong H. (Inventor); Byerly, Diane (Inventor)

    2016-01-01

    Methods, apparatuses and systems for radio frequency identification (RFID)-enabled information collection are disclosed, including an enclosure, a collector coupled to the enclosure, an interrogator, a processor, and one or more RFID field sensors, each having an individual identification, disposed within the enclosure. In operation, the interrogator transmits an incident signal to the collector, causing the collector to generate an electromagnetic field within the enclosure. The electromagnetic field is affected by one or more influences. RFID sensors respond to the electromagnetic field by transmitting reflected signals containing the individual identifications of the responding RFID sensors to the interrogator. The interrogator receives the reflected signals, measures one or more returned signal strength indications ("RSSI") of the reflected signals and sends the RSSI measurements and identification of the responding RFID sensors to the processor to determine one or more facts about the influences. Other embodiments are also described.

  2. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2009-09-01

    In the construction of DEMs, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have seen many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modeling method based on the theory of surfaces. Presently, HASM is only used for scattered-point interpolation, so the work in this paper attempts to construct a DEM from characteristic terrain information, namely stream lines and scattered points, by the HASM method. The procedure consists of the following steps. Firstly, a TIN (Triangulated Irregular Network) is generated from the scattered points. Secondly, each segment of the stream lines is oriented to represent the flow direction, and a tree data structure (with parent, children and siblings) is used to represent all stream-line segments. A segment is a curve which does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlap or repetition, from the uppermost reaches to the lowermost reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start and end points of each WCF line are predicted by searching the TIN. Thirdly, given a cell size, a 2-D matrix for the study region is built, and the values of the cells traversed by the stream lines are computed by linear interpolation along each WCF line. Fourthly, all the valued cells passed through by the stream lines and those derived from the scattered points are gathered as known scattered sampling points, and then HASM is used to construct the final DEM. A case study on a typical plateau landform of China, the KongTong gully of the Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data are manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines

  3. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2010-11-01

    In the construction of DEMs, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modeling method based on the theory of surfaces. At present, HASM is used only for the interpolation of scattered points, so the work in this paper attempts to construct a DEM from characteristic terrain information, namely stream lines and scattered points, using the HASM method. The procedure consists of the following steps. Firstly, a TIN (Triangulated Irregular Network) is generated from the scattered points. Secondly, each segment of the stream lines is oriented to represent the flow direction, and a tree data structure (with parent, children and sibling nodes) is used to represent all of the stream-line segments. A segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlap or repetition, from the most upper reaches to the most lower reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start and end points of each WCF line are estimated by searching the TIN. Thirdly, given a cell size, a 2-D matrix for the study region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line. Fourthly, all valued cells, those traversed by the stream lines and those derived from the scattered points, are gathered as known sampling points, and HASM is then used to construct the final DEM. A case study on a typical plateau landform of China, the KongTong gully of the Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data were manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines

  4. Caveats: numerical requirements in graph theory based quantitation of tissue architecture.

    PubMed

    Sudbø, J; Marcelpoil, R; Reith, A

    2000-01-01

    Graph theory based methods represent one approach to an objective and reproducible structural analysis of tissue architecture. In these methods, neighborhood relations between a number of objects (e.g., cells) are explored, and inherent to the methods are therefore certain requirements as to the number of objects to be included in the analysis. However, the question of how many objects are required to achieve reproducible values in repeated computations of proposed structural features has previously not been addressed specifically. After digitising HE stained slides and storing them as grey level images, cell nuclei were segmented and their geometrical centres of gravity were computed, serving as the basis for construction of the Voronoi diagram (VD) and its subgraphs. Variations in repeated computations of structural features derived from these graphs were related to the number of cell nuclei included in the analysis. We demonstrate a large variation in the values of the structural features from one computation to another in one and the same section when only a limited number of cells (100-500) are included in the analysis. This variation decreased with an increasing number of cells analyzed. The exact number of cells required to achieve reproducible values differs significantly between tissues, but not between separate cases of similar lesions. There are no significant differences between normal and malignantly changed tissues of the oral mucosa with respect to how many cells must be included. For graph theory based analysis of tissue architecture, care must be taken to include an adequate number of objects; for some of the structural features we have tested, more than 3000 cells. PMID:11310642
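
    The reproducibility question can be illustrated numerically: repeatedly draw subsets of object centroids of a given size, compute a Voronoi-derived feature, and observe how its spread shrinks as the subset grows. A rough Python sketch with synthetic points follows; the use of scipy and of the mean bounded Voronoi-cell area as the feature are assumptions, not the structural features used in the paper.

        import numpy as np
        from scipy.spatial import Voronoi, ConvexHull

        def mean_voronoi_cell_area(points):
            """Mean area of the bounded (finite) Voronoi cells of a 2-D point set."""
            vor = Voronoi(points)
            areas = []
            for region_idx in vor.point_region:
                region = vor.regions[region_idx]
                if -1 in region or len(region) < 3:                    # skip unbounded/degenerate cells
                    continue
                areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D "volume" is the area
            return float(np.mean(areas))

        rng = np.random.default_rng(0)
        centroids = rng.uniform(0, 1000, size=(5000, 2))               # stand-in for nuclei centres of gravity

        for n in (100, 500, 3000):
            values = [mean_voronoi_cell_area(centroids[rng.choice(len(centroids), n, replace=False)])
                      for _ in range(20)]
            print(n, round(np.std(values) / np.mean(values), 3))       # relative spread decreases with n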

  5. Indigenous Knowledge and Culturally Responsive Methods in Information Research

    ERIC Educational Resources Information Center

    Becvar, Katherine; Srinivasan, Ramesh

    2009-01-01

    Research and professional practice in librarianship has increasingly turned to community-focused information services (CIS), which allow people to participate in creating and sharing information about themselves and their communities. These information services have a great potential to empower and engage marginalized communities; however, in this…

  6. A cloud theory-based particle swarm optimization for multiple decision maker vehicle routing problems with fuzzy random time windows

    NASA Astrophysics Data System (ADS)

    Ma, Yanfang; Xu, Jiuping

    2015-06-01

    This article puts forward a cloud theory-based particle swarm optimization (CTPSO) algorithm for solving a variant of the vehicle routing problem, namely a multiple decision maker vehicle routing problem with fuzzy random time windows (MDVRPFRTW). A new mathematical model is developed for the proposed problem, in which fuzzy random theory is used to describe the time windows and bi-level programming is applied to describe the relationship between the multiple decision makers. To solve the problem, the CTPSO makes improvements in initialization, inertia weight and particle updates to overcome the shortcomings of basic particle swarm optimization (PSO). Parameter tests and results analyses are presented to highlight the performance of the optimization method, and comparison of the algorithm with the basic PSO and a genetic algorithm demonstrates its efficiency.
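
    The abstract does not spell out the operators, but a common way a normal cloud model enters PSO is through the inertia weight: each particle draws a weight from a cloud generator with expectation Ex, entropy En and hyper-entropy He. A rough Python sketch of that idea follows; all parameter values and the per-particle use of the weight are assumptions rather than details of the CTPSO in the article.

        import numpy as np

        rng = np.random.default_rng(1)

        def cloud_inertia(ex=0.7, en=0.15, he=0.02):
            """One drop from a normal cloud generator: certainty = exp(-(x - Ex)^2 / (2 En'^2))."""
            en_prime = rng.normal(en, he)
            x = rng.normal(ex, abs(en_prime))
            return float(np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2 + 1e-12)))

        def pso_step(pos, vel, pbest, gbest, c1=2.0, c2=2.0):
            """One velocity/position update with a cloud-generated inertia weight per particle."""
            n, d = pos.shape
            w = np.array([cloud_inertia() for _ in range(n)])[:, None]
            r1, r2 = rng.random((n, d)), rng.random((n, d))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            return pos + vel, vel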

  7. Agent-based method for distributed clustering of textual information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Reed, Joel W [Knoxville, TN; Elmore, Mark T [Oak Ridge, TN; Treadwell, Jim N [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches stored documents according to a search query having at least one term, identifies the documents found in the search, and displays them in a clustering display (80) that indicates the similarity of the documents to each other.
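
    The routing logic described above, a multiplexing agent broadcasting a new document vector, cluster agents scoring it against the documents they manage, and the best-scoring cluster taking ownership or a new cluster agent being created, can be sketched roughly as follows in Python. The cosine similarity measure and the new-cluster threshold are assumptions for illustration, not details taken from the patent.

        import numpy as np

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        class ClusterAgent:
            def __init__(self, centroid):
                self.centroid = np.asarray(centroid, dtype=float)
                self.documents = []

            def evaluate(self, doc_vector):                  # value returned upstream to the multiplexer
                return cosine(self.centroid, doc_vector)

            def accept(self, doc, doc_vector):
                self.documents.append(doc)
                # running mean keeps the centroid representative of the managed documents
                self.centroid += (doc_vector - self.centroid) / len(self.documents)

        def route_document(doc, doc_vector, cluster_agents, new_cluster_threshold=0.3):
            """Multiplexing-agent logic: pick the most similar cluster agent or spawn a new one."""
            doc_vector = np.asarray(doc_vector, dtype=float)
            scores = [agent.evaluate(doc_vector) for agent in cluster_agents]
            if scores and max(scores) >= new_cluster_threshold:
                best = cluster_agents[int(np.argmax(scores))]
            else:
                best = ClusterAgent(doc_vector)
                cluster_agents.append(best)
            best.accept(doc, doc_vector)
            return best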

  8. Models for Theory-Based M.A. and Ph.D. Programs.

    ERIC Educational Resources Information Center

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  9. Informative Parameters of Dynamic Geo-electricity Methods

    NASA Astrophysics Data System (ADS)

    Tursunmetov, R.

    With the growing complexity of geological tasks and the need to reveal anomalous zones connected with ore, oil, gas and water availability, methods of dynamic geo-electricity have come into use. In these methods the geological environment is considered as an interphase, irregular one. The main dynamic element of this environment is the double electric layer, which develops on the boundary between the solid and liquid phases. In ore- or water-saturated environments, double electric layers become electrochemically or electrokinetically active elements of the geo-electric environment, which in turn form a natural electric field. This field influences the distribution of an artificially created field, and their interaction has a complicated superposition or non-linear character. Therefore, the geological environment is considered an active one, able to accumulate and transform artificially superposed fields. The main dynamic property of this environment is the non-linear behaviour of specific electric resistance and soil polarization depending on current density and measurement frequency, which serve as informative parameters for dynamic geo-electricity methods. The study of disperse soil electric properties in the impulse-frequency regime, together with the temporal and frequency characteristics of the electric field, is of main interest for the definition of geo-electric anomalies. The study of volt-ampere characteristics of the electromagnetic field has great practical significance. These characteristics are determined by electrochemically active, ore- and water-saturated fields. The mentioned parameters depend on the polarity of the initiated field, in particular on the character, composition and mineralization of the ore-saturated zone and on the availability of a natural electric field under cathode and anode mineralization. The non-linear behaviour of the environment's dynamic properties affects the structure of the initiated field, which allows the location of anomalous zones to be defined. Finally, the study of the spatial dynamics of soil anisotropy properties will allow filtration flows to be identified.

  10. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  11. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1989-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  12. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1986-12-02

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  13. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  14. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1989-01-24

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  15. Item Characteristic Curve Estimation of Signal Detection Theory-Based Personality Data: A Two-Stage Approach to Item Response Modeling.

    ERIC Educational Resources Information Center

    Williams, Kevin M.; Zumbo, Bruno D.

    2003-01-01

    Developed an item characteristic curve estimation of signal detection theory based personality data. Results for 266 college students taking the Overclaiming Questionnaire (D. Paulhus and N. Bruce, 1990) suggest that this method is a reasonable approach to describing item functioning and that there are advantages to this method over traditional…

  16. "Method Is Education:" Making Informal Education Social and Substantive

    ERIC Educational Resources Information Center

    Stern, Miriam Heller

    2007-01-01

    This article presents the author's response to Joseph Reimer's essay titled, "Beyond More Jews Doing Jewish: Clarifying the Goals of Informal Jewish Education." Joseph Reimer states that the challenge for informal education is to move beyond socialization to clarify and achieve "deeper" educational goals. Distinguishing between socialization and…

  17. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads the electronic information, analyzes the electronic information according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry of the hash table contains a concept corresponding to a word or phrase which the system has previously encountered. The system creates an object model based on the user-defined logical associations; this model is used for reviewing each concept contained in the electronic information in order to determine whether the electronic information is classified. The data analysis and classification system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system cannot find the electronic information token in the hash table, that token is added to a missing terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.
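
    A rough Python sketch of the flow described above: tokens are looked up in the concept hash table, unknown tokens collect in a missing-terms list, and the recovered concepts are propagated through user-defined rules until one is satisfied. Representing a rule as a required set of concepts is an assumption made purely for illustration.

        def classify(text, concept_table, rules):
            """concept_table: token -> concept; rules: classification label -> set of required concepts."""
            concepts, missing_terms = set(), []
            for token in text.lower().split():
                concept = concept_table.get(token)
                if concept is None:
                    missing_terms.append(token)          # unknown token goes to the missing-terms list
                else:
                    concepts.add(concept)
            for label, required in rules.items():        # propagate concepts through the rule model
                if required <= concepts:                 # a rule is satisfied: the information is classified
                    return label, missing_terms
            return None, missing_terms

        label, missing = classify(
            "quarterly reactor coolant pressure report",
            {"reactor": "NUCLEAR", "coolant": "COOLING", "pressure": "PRESSURE"},
            {"classified-nuclear": {"NUCLEAR", "COOLING"}},
        )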

  18. Impact of NDE reliability developments on risk-informed methods

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk-informed inspection procedures are being developed to more effectively and economically manage degradation in plant piping systems. A key element of this process is applying nondestructive examination (NDE) procedures capable of detecting specific damage mechanisms that may be operative in particular locations. Thus, the needs of risk-informed analysis are closely coupled with a firm understanding of the capability of NDE.

  19. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  20. Informal Learning of Social Workers: A Method of Narrative Inquiry

    ERIC Educational Resources Information Center

    Gola, Giancarlo

    2009-01-01

    Purpose: The purpose of this paper is to investigate social workers' processes of informal learning, through their narration of their professional experience, in order to understand how social workers learn. Informal learning is any individual practice or activity that is able to produce continuous learning; it is often non-intentional and…

  1. Graph theory-based measures as predictors of gene morbidity.

    PubMed

    Massanet-Vila, Raimon; Caminal, Pere; Perera, Alexandre

    2010-01-01

    Previous studies have suggested that some graph properties of protein interaction networks might be related to gene morbidity. In particular, it has been suggested that when a polymorphism affects a gene, it is more likely to produce a disease if the node degree in the interaction network is higher than for other genes. However, these results do not take into account the possible bias introduced by the variance in the amount of information available for different genes. This work models the relationship between the morbidity associated with a gene and the degrees of the nodes in the protein interaction network while controlling for the amount of information available in the literature. A set of 7461 genes and 3665 disease identifiers reported in the Online Mendelian Inheritance in Man (OMIM) was mined jointly with 9630 nodes and 38756 interactions of the Human Protein Reference Database (HPRD). The information available for a gene was measured through PubMed mining. Results suggest that the correlation between the degree of a node in the protein interaction network and its morbidity is largely accounted for by the information available for the gene. Even though the results suggest a positive correlation between the degree of a node and its morbidity while controlling for the information factor, we believe this correlation has to be taken with caution, as it can be affected by other factors not taken into account in this study. PMID:21096114
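
    One simple way to control for the amount of available information is a partial correlation: regress both node degree and morbidity on the per-gene literature count and correlate the residuals. A minimal Python sketch with synthetic data follows; the study's actual statistical model is not given in the abstract, so this only illustrates the confounding effect.

        import numpy as np

        def partial_corr(x, y, z):
            """Correlation of x and y after removing the linear effect of z from both."""
            design = np.column_stack([np.ones_like(z), z])
            rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
            ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
            return float(np.corrcoef(rx, ry)[0, 1])

        rng = np.random.default_rng(2)
        papers = rng.poisson(50, 1000).astype(float)          # literature volume per gene (synthetic)
        degree = 0.2 * papers + rng.normal(0, 3, 1000)        # node degree partly driven by literature volume
        morbidity = 0.1 * papers + rng.normal(0, 3, 1000)     # morbidity likewise

        print(np.corrcoef(degree, morbidity)[0, 1])           # raw correlation, inflated by the shared driver
        print(partial_corr(degree, morbidity, papers))        # much weaker once literature volume is controlled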

  2. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  3. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  4. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  5. Study protocol: a randomised controlled trial of a theory-based online intervention to improve sun safety among Australian adults

    PubMed Central

    2014-01-01

    Background The effects of exposure to ultraviolet radiation are a significant concern in Australia which has one of the highest incidences of skin cancer in the world. Despite most skin cancers being preventable by encouraging consistent adoption of sun-protective behaviours, incidence rates are not decreasing. There is a dearth of research examining the factors involved in engaging in sun-protective behaviours. Further, online multi-behavioural theory-based interventions have yet to be explored fully as a medium for improving sun-protective behaviour in adults. This paper presents the study protocol of a randomised controlled trial of an online intervention based on the Theory of Planned Behaviour (TPB) that aims to improve sun safety among Australian adults. Methods/Design Approximately 420 adults aged 18 and over and predominantly from Queensland, Australia, will be recruited and randomised to the intervention (n = 200), information only (n = 200) or the control group (n = 20). The intervention focuses on encouraging supportive attitudes and beliefs toward sun-protective behaviour, fostering perceptions of normative support for sun protection, and increasing perceptions of control/self-efficacy over sun protection. The intervention will be delivered online over a single session. Data will be collected immediately prior to the intervention (Time 1), immediately following the intervention (Time 1b), and one week (Time 2) and one month (Time 3) post-intervention. Primary outcomes are intentions to sun protect and sun-protective behaviour. Secondary outcomes are the participants’ attitudes toward sun protection, perceptions of normative support for sun protection (i.e. subjective norms, group norms, personal norms and image norms) and perceptions of control/self-efficacy toward sun protection. Discussion The study will contribute to an understanding of the effectiveness of a TPB-based online intervention to improve Australian adults’ sun

  6. Institutionalizing Retention Activity: Toward a Theory-Based Model.

    ERIC Educational Resources Information Center

    Saunders, Martha Dunagin

    2003-01-01

    Examines Appreciative Inquiry, a relatively new approach to organizational change and growth, as a method for institutionalizing retention activity. Results of a case study in a college of arts and sciences suggest the method to be effective in creating a shared vision for the organization, energized participants, improved morale, and increased…

  7. Informativeness Improvement of Hardness Test Methods for Metal Product Assessment

    NASA Astrophysics Data System (ADS)

    Osipov, S.; Podshivalov, I.; Osipov, O.; Zhantybaev, A.

    2016-06-01

    The paper presents a combination of theoretical suggestions, results, and observations that allow the informativeness of the hardness testing process to be improved in solving problems of metal product assessment while in operation. The hardness value of a metal surface obtained by a single measurement is considered to be random. Various measures of location and scattering of this random variable were experimentally estimated for a number of test samples using correlation analysis, and their close interrelation was studied. It was stated that in metal assessment the main informative characteristics of the hardness testing process are its average value and its mean-square deviation, as measures of location and scattering, respectively.

  8. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold-start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of cloud users' applications to cloud service providers for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics, such as execution time, accuracy, error identification, and undecidability of the resources, were considered. PMID:26380365

  9. Statistical methods of combining information: Applications to sensor data fusion

    SciTech Connect

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.

  10. Paper Trail: One Method of Information Literacy Assessment

    ERIC Educational Resources Information Center

    Nutefall, Jennifer

    2004-01-01

    Assessing students' information literacy skills can be difficult depending on the involvement of the librarian in a course. To overcome this, librarians created an assignment called the Paper Trail, where students wrote a short essay about their research process and reflected on what they would do differently. Through reviewing and grading these…

  11. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  12. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  13. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  14. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  15. Game theory-based mode cooperative selection mechanism for device-to-device visible light communication

    NASA Astrophysics Data System (ADS)

    Liu, Yuxin; Huang, Zhitong; Li, Wei; Ji, Yuefeng

    2016-03-01

    Various patterns of device-to-device (D2D) communication, from Bluetooth to Wi-Fi Direct, are emerging due to the increasing requirements of information sharing between mobile terminals. This paper presents an innovative pattern named device-to-device visible light communication (D2D-VLC) to alleviate the growing traffic problem. However, the occlusion problem is a difficulty in D2D-VLC. This paper proposes a game theory-based solution in which the best-response dynamics and best-response strategies are used to realize a mode-cooperative selection mechanism. This mechanism uses system capacity as the utility function to optimize system performance and selects the optimal communication mode for each active user from three candidate modes. Moreover, the simulation and experimental results show that the mechanism can attain a significant improvement in terms of effectiveness and energy saving compared with the cases where the users communicate via only the fixed transceivers (light-emitting diode and photo diode) or via only D2D.

  16. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

    PubMed

    Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

    2012-09-01

    Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change. PMID:24073123

  17. Development and validation of a theory-based multimedia application for educating Persian patients on hemodialysis.

    PubMed

    Feizalahzadeh, Hossein; Tafreshi, Mansoureh Zagheri; Moghaddasi, Hamid; Farahani, Mansoureh A; Khosrovshahi, Hamid Tayebi; Zareh, Zahra; Mortazavi, Fakhrsadat

    2014-05-01

    Although patients on hemodialysis require effective education for self-care, several issues associated with the process raise barriers that make learning difficult. Computer-based education can reduce these problems and improve the quality of education. This study aims to develop and validate a theory-based multimedia application to educate Persian patients on hemodialysis. The study consisted of five phases: (1) content development, (2) prototype development 1, (3) evaluation by users, (4) evaluation by a multidisciplinary group of experts, and (5) prototype development 2. Data were collected through interviews and literature review with open-ended questions and two survey forms that consisted of a five-level scale. In the Results section, patient needs on hemodialysis self-care and related content were categorized into seven sections, including kidney function and failure, hemodialysis, vascular access, nutrition, medication, physical activity, and living with hemodialysis. The application designed includes seven modules consisting of user-controlled small multimedia units. During navigation through this application, the users were provided step-by-step information on self-care. Favorable scores were obtained from evaluations by users and experts. The researchers concluded that this application can facilitate hemodialysis education and learning process for the patients by focusing on their self-care needs using the multimedia design principles. PMID:24642877

  18. An efficient steganography method for hiding patient confidential information.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed; Nguyen, Hung

    2014-01-01

    This paper deals with the important issue of security and confidentiality of patient information when exchanging or storing medical images. Steganography has recently been viewed as an alternative or complement to cryptography, as existing cryptographic systems are not perfect due to their vulnerability to certain types of attack. We propose in this paper a new steganography algorithm for hiding patient confidential information. It utilizes Pixel Value Differencing (PVD) to identify contrast regions in the image and a Hamming code that embeds 3 secret message bits into 4 bits of the cover image. In order to preserve the content of the region of interest (ROI), the embedding is only performed using the Region of Non-Interest (RONI). PMID:25569937
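
    The abstract gives the embedding ratio (3 message bits per 4 cover bits) but not the code itself, so the following Python sketch only illustrates generic syndrome (matrix) embedding with an assumed rank-3, 3x4 parity-check matrix: the secret bits are read back as the syndrome of the 4 cover LSBs, and at most two cover bits need to change per group. The PVD contrast analysis and the ROI/RONI handling of the actual method are not shown.

        import numpy as np
        from itertools import product

        # Assumed parity-check matrix; the paper's actual code is not specified in the abstract.
        H = np.array([[1, 0, 0, 1],
                      [0, 1, 0, 1],
                      [0, 0, 1, 1]], dtype=np.uint8)

        def embed_group(cover_bits, message_bits):
            """Flip as few of the 4 cover bits as possible so that H @ cover (mod 2) equals the message."""
            cover = np.array(cover_bits, dtype=np.uint8)
            target = np.array(message_bits, dtype=np.uint8)
            diff = (H @ cover + target) % 2                   # syndrome change that must be realised
            best = None
            for pattern in product((0, 1), repeat=4):         # brute-force the 16 flip patterns
                e = np.array(pattern, dtype=np.uint8)
                if np.array_equal((H @ e) % 2, diff) and (best is None or e.sum() < best.sum()):
                    best = e
            return (cover ^ best).tolist()

        def extract_group(stego_bits):
            return ((H @ np.array(stego_bits, dtype=np.uint8)) % 2).tolist()

        stego = embed_group([1, 0, 1, 1], [0, 1, 1])
        assert extract_group(stego) == [0, 1, 1]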

  19. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408

  20. Imaging systems and methods for obtaining and using biometric information

    DOEpatents

    McMakin, Douglas L [Richland, WA; Kennedy, Mike O [Richland, WA

    2010-11-30

    Disclosed herein are exemplary embodiments of imaging systems and methods of using such systems. In one exemplary embodiment, one or more direct images of the body of a clothed subject are received, and a motion signature is determined from the one or more images. In this embodiment, the one or more images show movement of the body of the subject over time, and the motion signature is associated with the movement of the subject's body. In certain implementations, the subject can be identified based at least in part on the motion signature. Imaging systems for performing any of the disclosed methods are also disclosed herein. Furthermore, the disclosed imaging, rendering, and analysis methods can be implemented, at least in part, as one or more computer-readable media comprising computer-executable instructions for causing a computer to perform the respective methods.

  1. Information storage medium and method of recording and retrieving information thereon

    DOEpatents

    Marchant, D. D.; Begej, Stefan

    1986-01-01

    Information storage medium comprising a semiconductor doped with first and second impurities or dopants. Preferably, one of the impurities is introduced by ion implantation. Conductive electrodes are photolithographically formed on the surface of the medium. Information is recorded on the medium by selectively applying a focused laser beam to discrete regions of the medium surface so as to anneal discrete regions of the medium containing lattice defects introduced by the ion-implanted impurity. Information is retrieved from the storage medium by applying a focused laser beam to annealed and non-annealed regions so as to produce a photovoltaic signal at each region.

  2. A High Accuracy Method for Semi-supervised Information Extraction

    SciTech Connect

    Tratz, Stephen C.; Sanfilippo, Antonio P.

    2007-04-22

    Customization to specific domains of discourse and/or user requirements is one of the greatest challenges for today’s Information Extraction (IE) systems. While demonstrably effective, both rule-based and supervised machine learning approaches to IE customization pose too high a burden on the user. Semi-supervised learning approaches may in principle offer a more resource effective solution but are still insufficiently accurate to grant realistic application. We demonstrate that this limitation can be overcome by integrating fully-supervised learning techniques within a semi-supervised IE approach, without increasing resource requirements.

  3. Extracting ocean surface information from altimeter returns - The deconvolution method

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ernesto; Chapman, Bruce

    1989-01-01

    An evaluation of the deconvolution method for estimating ocean surface parameters from ocean altimeter waveforms is presented. It is shown that this method presents a fast, accurate way of determining the ocean surface parameters from noisy altimeter data. Three parameters may be estimated by using this method, including the altimeter-height error, the ocean-surface standard deviation, and the ocean-surface skewness. By means of a Monte Carlo experiment, an 'optimum' deconvolution algorithm and the accuracies with which the above parameters may be estimated using this algorithm are determined. Then the influence of instrument effects, such as errors in calibration and pointing-angle estimation, on the estimated parameters is examined. Finally, the deconvolution algorithm is used to estimate height and ocean-surface parameters from Seasat data.

  4. Treatment of adolescent sexual offenders: theory-based practice.

    PubMed

    Sermabeikian, P; Martinez, D

    1994-11-01

    The treatment of adolescent sexual offenders (ASO) has its theoretical underpinnings in social learning theory. Although social learning theory has been frequently cited in the literature, a comprehensive application of this theory to practice has not been mapped out. The social learning and social cognitive theories of Bandura appear to be particularly relevant to the group treatment of this population. The application of these theories to practice, as demonstrated in a program model, is discussed as a means of showing how theory-driven practice methods can be developed. PMID:7850605

  5. Methods to include foreign information in national evaluations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genomic evaluations (GEBV) with higher reliability often result from including genotypes and phenotypes from foreign bulls in the reference population. Multi-step methods evaluate domestic phenotypes first using only pedigree relationships (EBV), then add foreign data available from multi-trait acro...

  6. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

    PubMed Central

    Voth, Elizabeth C; Oelke, Nelly D

    2016-01-01

    Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps

  7. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during fiscal year 2014 within the Risk Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. Firstly, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Secondly, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new code RELAP-7 for a simplified Pressurized Water Reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.

  8. Looking inside the black box: a theory-based process evaluation alongside a randomised controlled trial of printed educational materials (the Ontario printed educational message, OPEM) to improve referral and prescribing practices in primary care in Ontario, Canada

    PubMed Central

    Grimshaw, Jeremy M; Zwarenstein, Merrick; Tetroe, Jacqueline M; Godin, Gaston; Graham, Ian D; Lemyre, Louise; Eccles, Martin P; Johnston, Marie; Francis, Jillian J; Hux, Jan; O'Rourke, Keith; Légaré, France; Presseau, Justin

    2007-01-01

    Background Randomised controlled trials of implementation strategies tell us whether (or not) an intervention results in changes in professional behaviour but little about the causal mechanisms that produce any change. Theory-based process evaluations collect data on theoretical constructs alongside randomised trials to explore possible causal mechanisms and effect modifiers. This is similar to measuring intermediate endpoints in clinical trials to further understand the biological basis of any observed effects (for example, measuring lipid profiles alongside trials of lipid lowering drugs where the primary endpoint could be reduction in vascular related deaths). This study protocol describes a theory-based process evaluation alongside the Ontario Printed Educational Message (OPEM) trial. We hypothesize that the OPEM interventions are most likely to operate through changes in physicians' behavioural intentions due to improved attitudes or subjective norms with little or no change in perceived behavioural control. We will test this hypothesis using a well-validated social cognition model, the theory of planned behaviour (TPB) that incorporates these constructs. Methods/design We will develop theory-based surveys using standard methods based upon the TPB for the second and third replications, and survey a subsample of Ontario family physicians from each arm of the trial two months before and six months after the dissemination of the index edition of informed, the evidence based newsletter used for the interventions. In the third replication, our study will converge with the "TRY-ME" protocol (a second study conducted alongside the OPEM trial), in which the content of educational messages was constructed using both standard methods and methods informed by psychological theory. We will modify Dillman's total design method to maximise response rates. Preliminary analyses will initially assess the internal reliability of the measures and use regression to explore the

  9. ROOM: A recursive object oriented method for information systems development

    SciTech Connect

    Thelliez, T.; Donahue, S.

    1994-02-09

    Although complementary for the development of complex systems, top-down structured design and the object-oriented approach are still opposed rather than integrated. As the complexity of systems continues to grow, and the so-called software crisis remains unsolved, it is urgent to provide a framework mixing the two paradigms. This paper presents an elegant attempt in this direction through our Recursive Object-Oriented Method (ROOM), in which a top-down approach divides the complexity of the system and an object-oriented method studies a given level of abstraction. Illustrating this recursive schema with a simple example, we demonstrate that we achieve the goal of creating loosely coupled and reusable components.

  10. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

    PubMed

    Cowin, Stephen C; Cardoso, Luis

    2012-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481
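
    As a concrete illustration of the "supplies of mass ... to each constituent and to the mixture as a whole" mentioned above, the constituent mass balance in a mixture theory of growth is usually written with a mass supply term. The notation below is a standard textbook form given here for orientation, not a formula quoted from the paper:

        \frac{\partial \rho_\alpha}{\partial t} + \nabla \cdot (\rho_\alpha \mathbf{v}_\alpha) = c_\alpha ,
        \qquad
        \sum_\alpha c_\alpha = c ,

    where \rho_\alpha and \mathbf{v}_\alpha are the apparent density and velocity of constituent \alpha, c_\alpha is the mass supplied to that constituent, and c vanishes for a closed mixture but may be nonzero when, as here, the mixture itself is treated as an open system.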

  11. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  12. Parallel implementation of multireference coupled-cluster theories based on the reference-level parallelism

    SciTech Connect

    Brabec, Jiri; Pittner, Jiri; van Dam, Hubertus JJ; Apra, Edoardo; Kowalski, Karol

    2012-02-01

    A novel algorithm for implementing a general type of multireference coupled-cluster (MRCC) theory based on the Jeziorski-Monkhorst exponential Ansatz [B. Jeziorski, H.J. Monkhorst, Phys. Rev. A 24, 1668 (1981)] is introduced. The proposed algorithm utilizes processor groups to calculate the equations for the MRCC amplitudes. In the basic formulation, each processor group constructs the equations related to a specific subset of references. By a flexible choice of processor groups and of the subset of reference-specific sufficiency conditions designated to a given group, one can assure optimum utilization of available computing resources. The performance of this algorithm is illustrated on the examples of the Brillouin-Wigner and Mukherjee MRCC methods with singles and doubles (BW-MRCCSD and Mk-MRCCSD). A significant improvement in scalability and in reduction of time to solution is reported with respect to a recently reported parallel implementation of the BW-MRCCSD formalism [J. Brabec, H.J.J. van Dam, K. Kowalski, J. Pittner, Chem. Phys. Lett. 514, 347 (2011)].

  13. a Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    NASA Astrophysics Data System (ADS)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from some TM images because true information is lost owing to blocking clouds and missing data stripes. Since water is continuously distributed under natural conditions, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different disturbances caused by clouds and missing data stripes are simulated. Water information is extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller Areal Error and a higher Boundary Recall can be obtained using this method compared with the conventional methods.
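
    Of the three approaches compared above, global histogram matching is the simplest to illustrate: the cumulative distribution of the damaged band is mapped onto that of a reference band before water is thresholded. A minimal numpy sketch follows; the choice of reference image and any subsequent water threshold are assumptions, and the paper's probability-based statistical method itself is not reproduced here.

        import numpy as np

        def histogram_match(source, reference):
            """Remap source pixel values so their global histogram matches that of the reference image."""
            src_values, src_idx, src_counts = np.unique(source.ravel(),
                                                        return_inverse=True, return_counts=True)
            ref_values, ref_counts = np.unique(reference.ravel(), return_counts=True)
            src_cdf = np.cumsum(src_counts) / source.size         # empirical CDF of the damaged band
            ref_cdf = np.cumsum(ref_counts) / reference.size      # empirical CDF of the reference band
            matched = np.interp(src_cdf, ref_cdf, ref_values)     # quantile (CDF-to-CDF) mapping
            return matched[src_idx].reshape(source.shape)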

  14. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

    PubMed Central

    Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

    Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior, using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated the population attributable fraction for each of 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief “I could quit smoking if my husband or significant other recommended it” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
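
    The population attributable fraction used above compares the observed prevalence of the outcome with its counterfactual prevalence at a reference level of the targeted belief. One standard counterfactual form, with Y the intention to quit and Y(a*) the potential outcome at the reference level a* of the belief, is given below; the exact marginal-structural-model estimator used in the study is not given in the abstract:

        \mathrm{PAF} \;=\; \frac{P(Y = 1) - P\{\,Y(a^{*}) = 1\,\}}{P(Y = 1)}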

  15. Theory based design and optimization of materials for spintronics applications

    NASA Astrophysics Data System (ADS)

    Xu, Tianyi

    The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we will focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We will present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate the electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of interest. We will propose a rational way to find half-metals based on the gap theorem. Then we will focus on the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems will be covered. Then we will calculate the transport properties of certain junctions with the help of Green's functions under the Landauer-Buttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device; we can calculate it using a method based on the extended Huckel tight-binding (EHTB) theory. The symmetry filtering effect is very helpful for finding materials for TMR junctions, and based on it we find a good candidate material, MnAl, for TMR applications.

  16. An Information Theory-Based Approach to Assessing the Sustainability and Stability of an Island System

    EPA Science Inventory

    It is well-documented that a sustainable system is based on environmental stewardship, economic viability and social equity. What is often overlooked is the need for continuity such that desirable system behavior is maintained with mechanisms in place that facilitate the ability ...

  17. Successful Aging with Sickle Cell Disease: Using Qualitative Methods to Inform Theory

    PubMed Central

    Jenerette, Coretta M.; Lauderdale, Gloria

    2009-01-01

    Little is known about the lives of adults with sickle cell disease (SCD). This article reports findings from a qualitative pilot study, which used life review as a method to explore influences on health outcomes among middle-aged and older adults with SCD. Six females with SCD, recruited from two urban sickle cell clinics in the U.S., engaged in semi-structured, in-depth life review interviews. MaxQDA2 software was used for qualitative data coding and analysis. Three major themes were identified: vulnerability factors, self-care management resources, and health outcomes. These themes are consistent with the Theory of Self-Care Management for Sickle Cell Disease. Identifying vulnerability factors, self-care management resources, and health outcomes in adults with SCD may aid in developing theory-based interventions to meet the health care needs of younger individuals with SCD. The life review process is a useful means to gain insight into successful aging with SCD and other chronic illnesses. PMID:19838320

  18. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under... production, processing, refining, transportation by pipeline, or distribution (at other than the retail...

  19. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under... production, processing, refining, transportation by pipeline, or distribution (at other than the retail...

  20. Improving Diabetes care through Examining, Advising, and prescribing (IDEA): protocol for a theory-based cluster randomised controlled trial of a multiple behaviour change intervention aimed at primary healthcare professionals

    PubMed Central

    2014-01-01

    Background New clinical research findings may require clinicians to change their behaviour to provide high-quality care to people with type 2 diabetes, likely requiring them to change multiple different clinical behaviours. The present study builds on findings from a UK-wide study of theory-based behavioural and organisational factors associated with prescribing, advising, and examining consistent with high-quality diabetes care. Aim To develop and evaluate the effectiveness and cost of an intervention to improve multiple behaviours in clinicians involved in delivering high-quality care for type 2 diabetes. Design/methods We will conduct a two-armed cluster randomised controlled trial in 44 general practices in the North East of England to evaluate a theory-based behaviour change intervention. We will target improvement in six underperformed clinical behaviours highlighted in quality standards for type 2 diabetes: prescribing for hypertension; prescribing for glycaemic control; providing physical activity advice; providing nutrition advice; providing on-going education; and ensuring that feet have been examined. The primary outcome will be the proportion of patients appropriately prescribed and examined (using anonymised computer records), and advised (using anonymous patient surveys) at 12 months. We will use behaviour change techniques targeting motivational, volitional, and impulsive factors that we have previously demonstrated to be predictive of multiple health professional behaviours involved in high-quality type 2 diabetes care. We will also investigate whether the intervention was delivered as designed (fidelity) by coding audiotaped workshops and interventionist delivery reports, and operated as hypothesised (process evaluation) by analysing responses to theory-based postal questionnaires. In addition, we will conduct post-trial qualitative interviews with practice teams to further inform the process evaluation, and a post-trial economic analysis to

  1. Explanation of Second-Order Asymptotic Theory Via Information Spectrum Method

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito

    We explain second-order asymptotic theory via the information spectrum method. From a unified viewpoint based on the generality of the information spectrum method, we consider second-order asymptotic theory for use in fixed-length data compression, uniform random number generation, and channel coding. Additionally, we discuss its application to quantum cryptography, folklore in source coding, and security analysis.
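
    For orientation, the second-order expansion for fixed-length source coding, one of the settings listed above, is sketched below; the notation (varentropy V, standard normal CDF Φ) follows the standard information-spectrum literature and is not taken from the article itself.

```latex
% Minimal code size M*(n, epsilon) for compressing n i.i.d. symbols of X with
% error probability epsilon, expanded to second order:
\frac{\log M^{*}(n,\epsilon)}{n}
  = H(X) \;+\; \sqrt{\frac{V(X)}{n}}\,\Phi^{-1}(\epsilon) \;+\; O\!\left(\frac{\log n}{n}\right)
```

    Analogous expansions, with the channel dispersion in place of the varentropy, hold for channel coding and uniform random number generation.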

  2. A Review of Web Information Seeking Research: Considerations of Method and Foci of Interest

    ERIC Educational Resources Information Center

    Martzoukou, Konstantina

    2005-01-01

    Introduction: This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background: Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of…

  3. “Please Don’t Send Us Spam!” A Participative, Theory-Based Methodology for Developing an mHealth Intervention

    PubMed Central

    2016-01-01

    Background Mobile health solutions have the potential of reducing burdens on health systems and empowering patients with important information. However, there is a lack of theory-based mHealth interventions. Objective The purpose of our study was to develop a participative, theory-based, mobile phone, audio messaging intervention attractive to recently circumcised men at voluntary medical male circumcision (VMMC) clinics in the Cape Town area in South Africa. We aimed to shift some of the tasks related to postoperative counselling on wound management and goal setting on safe sex. We place an emphasis on describing the full method of message generation to allow for replication. Methods We developed an mHealth intervention using a staggered qualitative methodology: (1) focus group discussions with 52 recently circumcised men and their partners to develop initial voice messages they felt were relevant and appropriate, (2) thematic analysis and expert consultation to select the final messages for pilot testing, and (3) cognitive interviews with 12 recent VMMC patients to judge message comprehension and rank the messages. Message content and phasing were guided by the theory of planned behavior and the health action process approach. Results Patients and their partners came up with 245 messages they thought would help men during the wound-healing period. Thematic analysis revealed 42 different themes. Expert review and cognitive interviews with more patients resulted in 42 messages with a clear division in terms of needs and expectations between the initial wound-healing recovery phase (weeks 1–3) and the adjustment phase (weeks 4–6). Discussions with patients also revealed potential barriers to voice messaging, such as lack of technical knowledge of mobile phones and concerns about the invasive nature of the intervention. Patients’ own suggested messages confirmed Ajzen’s theory of planned behavior that if a health promotion intervention can build trust and be

  4. Dynamic stepping information process method in mobile bio-sensing computing environments.

    PubMed

    Lee, Tae-Gyu; Lee, Seong-Hoon

    2014-01-01

    Recently, interest in human longevity free from disease has been converging into a single system framework, along with the development of mobile computing environments, the diversification of remote medical systems, and the aging of society. Such a converged system enables the implementation of a bioinformatics system that provides various supplementary information services by sensing and gathering the health conditions and various bio-information of mobile users to establish medical information. The existing bio-information system performs a static, unchanging process once the bio-information process defined at the initial system configuration has been executed. However, such a static process is ineffective for mobile bio-information systems that perform mobile computing. In particular, the inconvenient duty of redefining and re-executing the process accompanies every change in the process configuration or method of the bio-information system. This study proposes a dynamic process design and execution method to overcome such ineffectiveness. PMID:24704651

  5. A method for extracting task-oriented information from biological text sources.

    PubMed

    Kuttiyapillai, Dhanasekaran; Rajeswari, R

    2015-01-01

    A method for information extraction that processes unstructured data from a document collection is introduced. A dynamic programming technique, adopted to find the longest and most accurate matches among relevant gene sequences, is used to identify matching sequences and the effects of various factors. The proposed method can handle complex information sequences that carry different meanings in different situations, eliminating irrelevant information. The text contents were pre-processed using a general-purpose method and passed through an entity-tagging component. Bottom-up scanning of key-value pairs improves content finding and generates sequences relevant to the testing task. This paper highlights a context-based extraction method for extracting food safety information identified from articles, guideline documents, and laboratory results. A graphical disease model verifies weak components through utilisation of a development data set. This improves the accuracy of information retrieval in biological text analysis and reporting applications. PMID:26510293
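
    As a generic illustration of the kind of "longest and accurate" dynamic-programming sequence matching mentioned above, a standard longest-common-subsequence routine is sketched below; it is a familiar stand-in, not the authors' algorithm.

```python
# Generic dynamic-programming longest-common-subsequence routine, illustrating
# longest-match sequence comparison. Illustrative sketch only.

def longest_common_subsequence(a: str, b: str) -> str:
    """Return one longest common subsequence of a and b."""
    n, m = len(a), len(b)
    # dp[i][j] = LCS length of a[:i] and b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])

    # Backtrack to recover the subsequence itself.
    out = []
    i, j = n, m
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i -= 1
            j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))


if __name__ == "__main__":
    print(longest_common_subsequence("ATGGCCTA", "ATGCTTA"))  # -> "ATGCTA"
```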

  6. Preventing Postpartum Smoking Relapse Among Inner City Women: Development of a Theory-Based and Evidence-Guided Text Messaging Intervention

    PubMed Central

    Wen, Kuang-Yi; Kilby, Linda; Fleisher, Linda; Belton, Tanisha D; Roy, Gem; Hernandez, Enrique

    2014-01-01

    Background Underserved women are at high risk for smoking relapse after childbirth due to their unique socioeconomic and postpartum stressors and barriers. Mobile text messaging technology allows delivery of relapse prevention programs targeted to their personal needs over time. Objective To describe the development of a social-cognitive theory-based and evidence-guided text messaging intervention for preventing postpartum smoking relapse among inner city women. Methods Guided by the cognitive-social health information processing framework, user-centered design, and health communication best practices, the intervention was developed through a systematic process that included needs assessment, followed by an iterative cycling through message drafting, health literacy evaluation and rewriting, review by target community members and a scientific advisory panel, and message revision, concluding with usability testing. Results All message content was theory-grounded, derived by needs assessment analysis and evidence-based materials, reviewed and revised by the target population, health literacy experts, and scientific advisors. The final program, “Txt2Commit,” was developed as a fully automated system, designed to deliver 3 proactive messages per day for a 1-month postpartum smoking relapse intervention, with crave and lapse user-initiated message functions available when needed. Conclusions The developmental process suggests that the application of theory and best practices in the design of text messaging smoking cessation interventions is not only feasible but necessary for ensuring that the interventions are evidence based and user-centered. PMID:24698804

  7. A research on scenic information prediction method based on RBF neural network

    NASA Astrophysics Data System (ADS)

    Li, Jingwen; Yin, Shouqiang; Wang, Ke

    2015-12-01

    Given the rapid development of smart ("wisdom") tourism, predicting scenic-area information through scientific methods conforms to this trend. Using the strong nonlinear fitting ability of the RBF neural network [1-2], the article builds a prediction and inference method for the comprehensive information of the complex geographic time, space, and attributes of a scenic area through hyper-surface data organization of the scenic geographic entity information [3]. The Guilin scenic area is used as an example to illustrate the process of forecasting the whole information.
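
    A minimal Gaussian RBF regression sketch is given below to illustrate the kind of RBF-network fitting the record describes; the basis-function placement, width, and least-squares output layer are illustrative assumptions rather than the paper's exact model or data organization.

```python
# Minimal Gaussian RBF regression sketch for prediction tasks. Illustrative only.

import numpy as np


def rbf_design_matrix(x: np.ndarray, centers: np.ndarray, width: float) -> np.ndarray:
    """Gaussian basis activations, shape (n_samples, n_centers)."""
    d = x[:, None] - centers[None, :]
    return np.exp(-(d ** 2) / (2.0 * width ** 2))


def fit_rbf(x: np.ndarray, y: np.ndarray, n_centers: int = 10, width: float = 0.5):
    centers = np.linspace(x.min(), x.max(), n_centers)
    phi = rbf_design_matrix(x, centers, width)
    # Linear output layer solved by least squares.
    weights, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return centers, width, weights


def predict_rbf(x: np.ndarray, centers, width, weights) -> np.ndarray:
    return rbf_design_matrix(x, centers, width) @ weights


if __name__ == "__main__":
    x = np.linspace(0, 2 * np.pi, 200)
    y = np.sin(x) + 0.05 * np.random.default_rng(1).standard_normal(x.size)
    params = fit_rbf(x, y)
    print(np.abs(predict_rbf(x, *params) - np.sin(x)).max())  # residual of the fit
```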

  8. A fuzzy clustering vessel segmentation method incorporating line-direction information

    NASA Astrophysics Data System (ADS)

    Wang, Zhimin; Xiong, Wei; Huang, Weimin; Zhou, Jiayin; Venkatesh, Sudhakar K.

    2012-02-01

    A data clustering based vessel segmentation method is proposed for automatic liver vasculature segmentation in CT images. It consists of a novel similarity measure which incorporates the spatial context, vesselness information and line-direction information in a unique way. By combining the line-direction information and spatial information into the data clustering process, the proposed method is able to take care of the fine details of the vessel tree and suppress the image noise and artifacts at the same time. The proposed algorithm has been evaluated on the real clinical contrast-enhanced CT images, and achieved excellent segmentation accuracy without any experimentally set parameters.

  9. A theory-based online health behavior intervention for new university students: study protocol

    PubMed Central

    2013-01-01

    Background Too few young people engage in behaviors that reduce the risk of morbidity and premature mortality, such as eating healthily, being physically active, drinking sensibly and not smoking. The present research developed an online intervention to target these health behaviors during the significant life transition from school to university when health beliefs and behaviors may be more open to change. This paper describes the intervention and the proposed approach to its evaluation. Methods/design Potential participants (all undergraduates about to enter the University of Sheffield) will be emailed an online questionnaire two weeks before starting university. On completion of the questionnaire, respondents will be randomly assigned to receive either an online health behavior intervention (U@Uni) or a control condition. The intervention employs three behavior change techniques (self-affirmation, theory-based messages, and implementation intentions) to target four heath behaviors (alcohol consumption, physical activity, fruit and vegetable intake, and smoking). Subsequently, all participants will be emailed follow-up questionnaires approximately one and six months after starting university. The questionnaires will assess the four targeted behaviors and associated cognitions (e.g., intentions, self-efficacy) as well as socio-demographic variables, health status, Body Mass Index (BMI), health service use and recreational drug use. A sub-sample of participants will provide a sample of hair to assess changes in biochemical markers of health behavior. A health economic evaluation of the cost effectiveness of the intervention will also be conducted. Discussion The findings will provide evidence on the effectiveness of online interventions as well as the potential for intervening during significant life transitions, such as the move from school to university. If successful, the intervention could be employed at other universities to promote healthy behaviors among new

  10. Increasing smoke alarm operability through theory-based health education: a randomised trial

    PubMed Central

    Miller, Ted R; Bergen, Gwen; Ballesteros, Michael F; Bhattacharya, Soma; Gielen, Andrea Carlson; Sheppard, Monique S

    2015-01-01

    Background Although working smoke alarms halve deaths in residential fires, many households do not keep alarms operational. We tested whether theory-based education increases alarm operability. Methods Randomised multiarm trial, with a single arm randomly selected for use each day, in low-income neighbourhoods in Maryland, USA. Intervention arms: (1) Full Education, combining a health belief module with a social-cognitive theory module that provided hands-on practice installing alarm batteries and using the alarm's hush button; (2) Hands-on Practice, a social-cognitive module supplemented by typical fire department education; (3) Current Norm, receiving typical fire department education only. Four hundred and thirty-six homes were recruited through churches or by knocking on doors in 2005–2008. Follow-up visits checked alarm operability in 370 homes (85%) 1–3.5 years after installation. Main outcome measures: number of homes with working alarms, defined as alarms with working batteries or hard-wired alarms, and number of working alarms per home. Regressions controlled for preintervention alarm status, demographics, and beliefs about fire risks and alarm effectiveness. Results Homes in the Full Education and Practice arms were more likely to have a functioning smoke alarm at follow-up (OR=2.77, 95% CI 1.09 to 7.03) and had an average of 0.32 more working alarms per home (95% CI 0.09 to 0.56). Working alarms per home rose 16%. Full Education and Practice had similar effectiveness (p=0.97 on both outcome measures). Conclusions Without exceeding typical fire department installation time, installers can achieve greater smoke alarm operability. Hands-on practice is key. Two years after installation, for every three homes that received hands-on practice, one had an additional working alarm. Trial registration number http://www.clinicaltrials.gov number NCT00139126. PMID:25165090

  11. Inter-instrumental method transfer of chiral capillary electrophoretic methods using robustness test information.

    PubMed

    De Cock, Bart; Borsuk, Agnieszka; Dejaegher, Bieke; Stiens, Johan; Mangelings, Debby; Vander Heyden, Yvan

    2014-08-01

    Capillary electrophoresis (CE) is an electrodriven separation technique that is often used for the separation of chiral molecules. Advantages of CE are its flexibility, low cost and efficiency. On the other hand, the precision and transfer of CE methods are well-known problems of the technique. Reasons for the more complicated method transfer are the more diverse instrumental differences, such as total capillary lengths and capillary cooling systems; and the higher response variability of CE methods compared to other techniques, such as liquid chromatography (HPLC). Therefore, a larger systematic change in peak resolutions, migration times and peak areas, with a loss of separation and efficiency may be seen when a CE method is transferred to another laboratory or another type of instrument. A swift and successful method transfer is required because development and routine use of analytical methods are usually not performed in the same laboratory and/or on the same type of equipment. The aim of our study was to develop transfer rules to facilitate CE method transfers between different laboratories and instruments. In our case study, three β-blockers were chirally separated and inter-instrumental transfers were performed. The first step of our study was to optimise the precision of the chiral CE method. Next, a robustness test was performed to identify the instrumental and experimental parameters that were most influencing the considered responses. The precision- and the robustness study results were used to adapt instrumental and/or method settings to improve the transfer between different instruments. Finally, the comparison of adapted and non-adapted transfers allowed deriving some rules to facilitate CE method transfers. PMID:24931445

  12. A formal, mathematics oriented method for identifying security risks in information systems.

    PubMed

    van Piggelen, H U

    1997-01-01

    IT security presently lacks the benefits of physics, where certain unifying grand principles can be applied. The aim of the method is to provide a technology-independent way of identifying the components of a system in general, and of information systems in particular. The need for the proposed method derives from the ad hoc character of the theories used in present formal security textbooks, none of which can give the user any guarantee of completeness. The new method is scientifically derived, presented, explained, and applied to several interesting topics in the field of health care information systems. Some simple mathematical formulae can be introduced. PMID:10179535

  13. A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.

    ERIC Educational Resources Information Center

    Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.

    Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…

  14. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  15. Theory-Based Interactive Mathematics Instruction: Development and Validation of Computer-Video Modules.

    ERIC Educational Resources Information Center

    Henderson, Ronald W.; And Others

    Theory-based prototype computer-video instructional modules were developed to serve as an instructional supplement for students experiencing difficulty in learning mathematics, with special consideration given to students underrepresented in mathematics (particularly women and minorities). Modules focused on concepts and operations for factors,…

  16. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  17. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  18. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  19. Liminality in cultural transition: applying ID-EA to advance a concept into theory-based practice.

    PubMed

    Baird, Martha B; Reed, Pamela G

    2015-01-01

    As global migration increases worldwide, nursing interventions are needed to address the effects of migration on health. The concept of liminality emerged as a pivotal concept in the situation-specific theory of well-being in refugee women experiencing cultural transition. As a relatively new concept in the discipline of nursing, liminality is explored using a method, called ID-EA, which we developed to advance a theoretical concept for application to nursing practice. Liminality in the context of cultural transition is further developed using the five steps of inquiry of the ID-EA method. The five steps are as follows: (1) inductive inquiry: qualitative research, (2) deductive inquiry: literature review, (3) synthesis of inductive and deductive inquiry, (4) evaluation inquiry, and (5) application-to-practice inquiry. The overall goal of this particular work was to develop situation-specific, theory-based interventions that facilitate cultural transitions for immigrants and refugees. PMID:25799694

  20. 49 CFR 1135.2 - Revenue Shortfall Allocation Method: Annual State tax information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Revenue Shortfall Allocation Method: Annual State... RECOVERY PROCEDURES § 1135.2 Revenue Shortfall Allocation Method: Annual State tax information. (a) To enable the Board to calculate the revenue shortfall allocation method (RSAM), which is one of the...

  1. 78 FR 34427 - 2012 Tax Information for Use In The Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... Surface Transportation Board 2012 Tax Information for Use In The Revenue Shortfall Allocation Method... Shortfall Allocation Method (RSAM). DATES: Comments are due by July 9, 2013. If any comment opposing AAR's... Revenue Shortfall Allocation Method, EP 646 (Sub-No. 2) (STB served Nov. 21, 2008). RSAM is intended...

  2. 76 FR 40448 - 2010 Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-08

    ... Surface Transportation Board 2010 Tax Information for Use in the Revenue Shortfall Allocation Method... Allocation Method (RSAM). DATES: Comments are due by August 8, 2011. If any comment opposing AAR's... Shortfall Allocation Method, EP 646 (Sub-No. 2) (STB served Nov. 21, 2008). RSAM is intended to measure...

  3. A method for fast selecting feature wavelengths from the spectral information of crop nitrogen

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Research on a method for quickly selecting feature wavelengths from nitrogen spectral information, which can determine the nitrogen content of crops, is necessary. Based on the uniformity of uniform design, this paper proposed an improved particle swarm optimization (PSO) method. The method can ch...

  4. 78 FR 68076 - Request for Information on Alternative Skin Sensitization Test Methods and Testing Strategies and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ...The Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) is developing a U.S. plan for the evaluation of alternative skin sensitization test methods and testing strategies. The National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) requests information that ICCVAM might use to develop this plan and......

  5. Genetic Algorithm and Graph Theory Based Matrix Factorization Method for Online Friend Recommendation

    PubMed Central

    Li, Qu; Yang, Jianhua; Xu, Ning

    2014-01-01

    Online friend recommendation is a fast-developing topic in web mining. In this paper, we used SVD matrix factorization to model user and item feature vectors and used stochastic gradient descent to update the parameters and improve accuracy. To tackle the cold-start problem and data sparsity, we used a KNN model to influence the user feature vectors. At the same time, we used graph theory to partition communities with fairly low time and space complexity. Moreover, matrix factorization can combine online and offline recommendation. Experiments showed that the hybrid recommendation algorithm is able to recommend online friends with good accuracy. PMID:24757410
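
    A minimal sketch of rating-matrix factorization trained with stochastic gradient descent, the core component described above, is given below; the hyperparameters and function names are illustrative, and the KNN and graph-partitioning components are omitted.

```python
# Sketch of matrix factorization trained with stochastic gradient descent.
# Hyperparameters are illustrative only.

import numpy as np


def factorize(ratings, n_factors=8, lr=0.05, reg=0.02, n_epochs=200, seed=0):
    """ratings: list of (user, item, value) triples."""
    rng = np.random.default_rng(seed)
    n_users = max(u for u, _, _ in ratings) + 1
    n_items = max(i for _, i, _ in ratings) + 1
    P = 0.1 * rng.standard_normal((n_users, n_factors))  # user factors
    Q = 0.1 * rng.standard_normal((n_items, n_factors))  # item factors

    for _ in range(n_epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            # Gradient step with L2 regularization on both factor vectors.
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q


if __name__ == "__main__":
    data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0)]
    P, Q = factorize(data)
    print(round(float(P[0] @ Q[0]), 2))  # reconstructed rating, close to 5.0
```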

  6. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, C.E.

    1990-07-31

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field. 8 figs.

  7. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  8. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-03-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing.

  9. Theories and Methods for Research on Informal Learning and Work: Towards Cross-Fertilization

    ERIC Educational Resources Information Center

    Sawchuk, Peter H.

    2008-01-01

    The topic of informal learning and work has quickly become a staple in contemporary work and adult learning research internationally. The narrow conceptualization of work is briefly challenged before the article turns to a review of the historical origins as well as contemporary theories and methods involved in researching informal learning and…

  10. Mathematical, Logical, and Formal Methods in Information Retrieval: An Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost

    2003-01-01

    Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)

  11. Evaluation of Semantic-Based Information Retrieval Methods in the Autism Phenotype Domain

    PubMed Central

    Hassanpour, Saeed; O’Connor, Martin J.; Das, Amar K.

    2011-01-01

    Biomedical ontologies are increasingly being used to improve information retrieval methods. In this paper, we present a novel information retrieval approach that exploits knowledge specified by the Semantic Web ontology and rule languages OWL and SWRL. We evaluate our approach using an autism ontology that has 156 SWRL rules defining 145 autism phenotypes. Our approach uses a vector space model to correlate how well these phenotypes relate to the publications used to define them. We compare a vector space phenotype representation using class hierarchies with one that extends this method to incorporate additional semantics encoded in SWRL rules. From a PubMed-extracted corpus of 75 articles, we show that average rank of a related paper using the class hierarchy method is 4.6 whereas the average rank using the extended rule-based method is 3.3. Our results indicate that incorporating rule-based definitions in information retrieval methods can improve search for relevant publications. PMID:22195112
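
    The vector space correlation step described above can be illustrated with a minimal tf-idf and cosine-similarity sketch; this is a generic stand-in, and the OWL class hierarchies and SWRL rules that the paper additionally exploits are not modeled here.

```python
# Minimal tf-idf + cosine-similarity ranking sketch for relating a query
# (e.g., a phenotype description) to publications. Illustrative only.

import math
from collections import Counter


def tfidf_vectors(docs):
    """Return a list of {term: tf-idf} dictionaries, one per document."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc.split()))
    vectors = []
    for doc in docs:
        tf = Counter(doc.split())
        vectors.append({t: c * math.log(n / df[t]) for t, c in tf.items()})
    return vectors


def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0


if __name__ == "__main__":
    corpus = [
        "repetitive behavior restricted interests",
        "language delay social communication deficits",
        "motor stereotypy repetitive movements",
    ]
    query = "repetitive behavior"
    vecs = tfidf_vectors(corpus + [query])
    scores = [cosine(vecs[-1], v) for v in vecs[:-1]]
    print(sorted(range(len(scores)), key=lambda i: -scores[i]))  # ranked document indices
```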

  12. Evaluation of semantic-based information retrieval methods in the autism phenotype domain.

    PubMed

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2011-01-01

    Biomedical ontologies are increasingly being used to improve information retrieval methods. In this paper, we present a novel information retrieval approach that exploits knowledge specified by the Semantic Web ontology and rule languages OWL and SWRL. We evaluate our approach using an autism ontology that has 156 SWRL rules defining 145 autism phenotypes. Our approach uses a vector space model to correlate how well these phenotypes relate to the publications used to define them. We compare a vector space phenotype representation using class hierarchies with one that extends this method to incorporate additional semantics encoded in SWRL rules. From a PubMed-extracted corpus of 75 articles, we show that average rank of a related paper using the class hierarchy method is 4.6 whereas the average rank using the extended rule-based method is 3.3. Our results indicate that incorporating rule-based definitions in information retrieval methods can improve search for relevant publications. PMID:22195112

  13. An automatic abrupt information extraction method based on singular value decomposition and higher-order statistics

    NASA Astrophysics Data System (ADS)

    He, Tian; Ye, Wu; Pan, Qiang; Liu, Xiandong

    2016-02-01

    One key aspect of local fault diagnosis is how to effectively extract abrupt features from the vibration signals. This paper proposes a method to automatically extract abrupt information based on singular value decomposition and higher-order statistics. In order to observe the distribution law of singular values, a numerical analysis to simulate the noise, periodic signal, abrupt signal and singular value distribution is conducted. Based on higher-order statistics and spectrum analysis, a method to automatically choose the upper and lower borders of the singular value interval reflecting the abrupt information is built. And the selected singular values derived from this method are used to reconstruct abrupt signals. It is proven that the method is able to obtain accurate results by processing the rub-impact fault signal measured from the experiments. The analytical and experimental results indicate that the proposed method is feasible for automatically extracting abrupt information caused by faults like the rotor-stator rub-impact.
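
    A minimal sketch of the general idea, Hankel-matrix SVD combined with a higher-order-statistics (kurtosis) criterion for selecting singular components, is shown below; the threshold and selection rule are illustrative assumptions, not the authors' automatic border-selection method.

```python
# Sketch of abrupt-feature extraction: build a Hankel matrix, take its SVD,
# keep components whose kurtosis (a higher-order statistic) is large, and
# reconstruct the signal from them. Illustrative only.

import numpy as np
from scipy.stats import kurtosis


def hankel(signal: np.ndarray, rows: int) -> np.ndarray:
    cols = signal.size - rows + 1
    return np.array([signal[i:i + cols] for i in range(rows)])


def dehankel(matrix: np.ndarray, length: int) -> np.ndarray:
    """Average anti-diagonals to map a Hankel-like matrix back to a signal."""
    out = np.zeros(length)
    count = np.zeros(length)
    rows, cols = matrix.shape
    for i in range(rows):
        for j in range(cols):
            out[i + j] += matrix[i, j]
            count[i + j] += 1
    return out / count


def extract_abrupt(signal: np.ndarray, rows: int = 40, kurt_thresh: float = 1.0):
    H = hankel(signal, rows)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    keep = []
    for k in range(s.size):
        component = dehankel(s[k] * np.outer(U[:, k], Vt[k]), signal.size)
        if kurtosis(component) > kurt_thresh:  # impulsive content is leptokurtic
            keep.append(k)
    if not keep:
        return np.zeros_like(signal)
    Hk = sum(s[k] * np.outer(U[:, k], Vt[k]) for k in keep)
    return dehankel(Hk, signal.size)


if __name__ == "__main__":
    t = np.linspace(0, 1, 500)
    x = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(2).standard_normal(t.size)
    x[250:253] += 3.0  # simulated rub-impact style impulse
    print(np.argmax(np.abs(extract_abrupt(x))))  # peak expected near the impulse at 250
```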

  14. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of the altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high frequency noises in the sensor output. Furthermore, to improve altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993
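
    A minimal one-dimensional Kalman filter with an innovation-based adaptive estimate of the measurement-noise variance is sketched below to illustrate the adaptive fusion idea; the wavelet pre-filter, the full extended Kalman formulation, and the maximum a posteriori estimator of the paper are not reproduced here.

```python
# Minimal 1-D Kalman filter with a simple sliding-window adaptation of the
# measurement-noise variance R from the innovation sequence. Illustrative only.

import numpy as np


def adaptive_kf(measurements, q=1e-2, r0=1.0, window=20):
    x, p = float(measurements[0]), 1.0   # state (altitude) and its variance
    r = r0                               # measurement-noise variance, adapted online
    innovations = []
    estimates = []
    for z in measurements:
        # Predict (constant-altitude model for simplicity).
        p += q
        # Update.
        innov = z - x
        k = p / (p + r)
        x += k * innov
        p *= (1.0 - k)
        # Adapt R from a sliding window of innovations.
        innovations.append(innov)
        if len(innovations) > window:
            innovations.pop(0)
        r = max(np.var(innovations) - p, 1e-6)
        estimates.append(x)
    return np.array(estimates)


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    true_alt = np.linspace(10.0, 0.0, 300)            # descending landing profile
    noisy = true_alt + 0.5 * rng.standard_normal(300)
    est = adaptive_kf(noisy)
    print(float(np.mean(np.abs(est - true_alt))))      # mean abs error of the fused estimate
```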

  15. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  16. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, C.E.; McKinney, I.D.

    1988-05-31

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk. 10 figs.

  17. Novel copyright information hiding method based on random phase matrix of Fresnel diffraction transforms

    NASA Astrophysics Data System (ADS)

    Cao, Chao; Chen, Ru-jun

    2009-10-01

    In this paper, we present a new copyright information hiding method for digital images in Moiré fringe formats. The copyright information is embedded into the protected image and the detecting image based on a Fresnel phase matrix. First, using the Fresnel diffraction transform, the random phase matrix of the copyright information is generated. Then, according to the Moiré fringe principle, the protected image and the detecting image are modulated respectively based on the random phase matrix, and the copyright information is embedded into them. When the protected image and the detecting image are overlapped, the copyright information reappears. Experimental results show that our method has good concealment performance and offers a new way to protect image copyright.

  18. Assessing Bayesian model averaging uncertainty of groundwater modeling based on information entropy method

    NASA Astrophysics Data System (ADS)

    Zeng, Xiankui; Wu, Jichun; Wang, Dong; Zhu, Xiaobin; Long, Yuqiao

    2016-07-01

    Because of groundwater conceptualization uncertainty, multi-model methods are usually used and the corresponding uncertainties are estimated by integrating Markov chain Monte Carlo (MCMC) and Bayesian model averaging (BMA) methods. Generally, the variance method is used to measure the uncertainties of BMA prediction. The total variance of the ensemble prediction is decomposed into within-model and between-model variances, which represent the uncertainties derived from parameters and conceptual models, respectively. However, the uncertainty of a probability distribution cannot be comprehensively quantified by variance alone. A new measuring method based on information entropy theory is proposed in this study. Because the actual BMA process can hardly meet the ideal mutually exclusive, collectively exhaustive condition, BMA predictive uncertainty is decomposed into parameter, conceptual model, and overlapped uncertainties. Overlapped uncertainty is induced by the combination of predictions from correlated model structures. In this paper, five simple analytical functions are first used to illustrate the feasibility of the variance and information entropy methods. A discrete distribution example shows that information entropy can be more appropriate than variance for describing between-model uncertainty. Two continuous distribution examples show that the two methods are consistent in measuring a normal distribution, and that information entropy is more appropriate than variance for describing a bimodal distribution. The two examples of BMA uncertainty decomposition demonstrate that the two methods are relatively consistent in assessing the uncertainty of unimodal BMA predictions, while information entropy is more informative in describing the uncertainty decomposition of bimodal BMA predictions. Then, based on a synthetic groundwater model, the variance and information entropy methods are used to assess the BMA uncertainty of groundwater modeling. The uncertainty assessments of
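
    The contrast between the variance decomposition and the entropy view can be illustrated with a small sketch for a two-model Gaussian-mixture BMA predictive density; the numbers are illustrative and unrelated to the groundwater case study.

```python
# Sketch contrasting the variance decomposition and the entropy of a BMA
# predictive density built as a Gaussian mixture. Illustrative only.

import numpy as np


def bma_variance_decomposition(means, variances, weights):
    """Total BMA variance = within-model + between-model parts."""
    means, variances, weights = map(np.asarray, (means, variances, weights))
    mixture_mean = np.sum(weights * means)
    within = np.sum(weights * variances)
    between = np.sum(weights * (means - mixture_mean) ** 2)
    return within, between, within + between


def bma_mixture_entropy(means, stds, weights, n_grid=20001):
    """Differential entropy (nats) of a Gaussian-mixture BMA predictive density."""
    grid = np.linspace(min(means) - 6 * max(stds), max(means) + 6 * max(stds), n_grid)
    pdf = np.zeros_like(grid)
    for m, s, w in zip(means, stds, weights):
        pdf += w * np.exp(-0.5 * ((grid - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    dx = grid[1] - grid[0]
    return float(-np.sum(pdf * np.log(pdf + 1e-300)) * dx)


if __name__ == "__main__":
    w = [0.5, 0.5]
    # Bimodal mixture: within = 1, between = 4, total variance = 5.
    print(bma_variance_decomposition([0.0, 4.0], [1.0, 1.0], w))
    print(bma_mixture_entropy([0.0, 4.0], [1.0, 1.0], w))
    # Single Gaussian with the same total variance of 5 has larger entropy.
    print(bma_mixture_entropy([2.0, 2.0], [np.sqrt(5.0), np.sqrt(5.0)], w))
```

    In the example, the bimodal mixture and the single Gaussian share the same total variance (5.0), but the Gaussian has the larger entropy; this is the kind of shape distinction the variance measure alone cannot make.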

  19. Development and Content Validation of the Information Assessment Method for Patients and Consumers

    PubMed Central

    Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan LM; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-01-01

    Background Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. Objective We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Methods Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. Results The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded

  20. Comparison of high and low intensity contact between secondary and primary care to detect people at ultra-high risk for psychosis: study protocol for a theory-based, cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background The early detection and referral to specialized services of young people at ultra-high risk (UHR) for psychosis may reduce the duration of untreated psychosis and, therefore, improve prognosis. General practitioners (GPs) are usually the healthcare professionals contacted first on the help-seeking pathway of these individuals. Methods/Design This is a cluster randomized controlled trial (cRCT) of primary care practices in Cambridgeshire and Peterborough, UK. Practices are randomly allocated into two groups in order to establish which is the most effective and cost-effective way to identify people at UHR for psychosis. One group will receive postal information about the local early intervention in psychosis service, including how to identify young people who may be in the early stages of a psychotic illness. The second group will receive the same information plus an additional, ongoing theory-based educational intervention with dedicated liaison practitioners to train clinical staff at each site. The primary outcome of this trial is count data over a 2-year period: the yield - number of UHR for psychosis referrals to a specialist early intervention in psychosis service - per primary care practice. Discussion There is little guidance on the essential components of effective and cost-effective educational interventions in primary mental health care. Furthermore, no study has demonstrated an effect of a theory-based intervention to help GPs identify young people at UHR for psychosis. This study protocol is underpinned by a robust scientific rationale that intends to address these limitations. Trial registration Current Controlled Trials ISRCTN70185866 PMID:23866815

  1. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    ERIC Educational Resources Information Center

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  2. Mixed Methods Research of Adult Family Care Home Residents and Informal Caregivers

    ERIC Educational Resources Information Center

    Jeanty, Guy C.; Hibel, James

    2011-01-01

    This article describes a mixed methods approach used to explore the experiences of adult family care home (AFCH) residents and informal caregivers (IC). A rationale is presented for using a mixed methods approach employing the sequential exploratory design with this poorly researched population. The unique challenges attendant to the sampling…

  3. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  4. A Qualitative Study about Performance Based Assesment Methods Used in Information Technologies Lesson

    ERIC Educational Resources Information Center

    Daghan, Gökhan; Akkoyunlu, Buket

    2014-01-01

    In this study, Information Technologies teachers' views and usage cases on performance based assesment methods (PBAMs) are examined. It is aimed to find out which of the PBAMs are used frequently or not used, preference reasons of these methods and opinions about the applicability of them. Study is designed with the phenomenological design…

  5. 77 FR 23674 - Proposed Information Collection Requests: Measures and Methods for the National Reporting System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-20

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Proposed Information Collection Requests: Measures and Methods for the National Reporting System for Adult... records. Title of Collection: Measures and Methods for the National Reporting System for Adult...

  6. A Method for the Analysis of Information Use in Source-Based Writing

    ERIC Educational Resources Information Center

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  7. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

    PubMed

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  8. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks

    PubMed Central

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  9. A theory-based logic model for innovation policy and evaluation.

    SciTech Connect

    Jordan, Gretchen B.

    2010-04-01

    Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.

  10. Interconnected but underprotected? Parents' methods and motivations for information seeking on digital safety issues.

    PubMed

    Davis, Vauna

    2012-12-01

    Parents need information and skills to meet the demands of mediating connected technology in their homes. Parents' methods and motivations for learning to protect children from digital risks were reported through a survey. This study explores relationships between information seeking, parents' concerns, risks children have experienced, and access to connected devices, in addition to the use and satisfaction of various digital safety resources. Three types of information-seeking behavior were identified: (a) protective information seeking, to protect children from being confronted with harmful content; (b) problem-solving information seeking, to help children who have been negatively affected by connected technology; and (c) attentive learning, by attending to media resources passively encountered on this topic. Friends and family are the dominant source of digital safety information, followed by presentations and the Internet. Parents' top concerns for their children using connected technology were accidental exposure to pornography, and sexual content in Internet-based entertainment. Higher numbers of risks experienced by children were positively associated with parents' problem-solving information seeking and level of attentive learning. Parents who were more concerned exhibited more problem-solving information seeking; but despite the high level of concern for children's safety online, 65 percent of parents seek information on this subject less than twice per year. Children have access to a mean of five connected devices at home; a higher number of devices was correlated with increased risks experienced by children, but was not associated with increased concern or information seeking from parents. PMID:23098226

  11. Mechanisms of behavioural maintenance: Long-term effects of theory-based interventions to promote safe water consumption.

    PubMed

    Inauen, Jennifer; Mosler, Hans-Joachim

    2016-01-01

    Theory-based interventions can enhance people's safe water consumption, but the sustainability of these interventions and the mechanisms of maintenance remain unclear. We investigated these questions based on an extended theory of planned behaviour. Seven hundred and ten (445 analysed) randomly selected households participated in two cluster-randomised controlled trials in Bangladesh. Study 1 promoted switching to neighbours' arsenic-safe wells, and Study 2 promoted switching to arsenic-safe deep wells. Both studies included two intervention phases. Structured interviews were conducted at baseline (T1), and at 1-month (T2), 2-month (T3) and 9-month (T4) follow-ups. In intervention phase 1 (between T1 and T2), commitment-based behaviour change techniques--reminders, implementation intentions and public commitment--were combined with information and compared to an information-only control group. In phase 2 (between T2 and T3), half of each phase 1 intervention group was randomly assigned to receive either commitment-based techniques once more or coping planning with reminders and information. Initial well-switching rates of up to 60% significantly declined by T4: 38.3% of T2 safe water users stopped consuming arsenic-safe water. The decline depended on the intervention. Perceived behavioural control, intentions, commitment strength and coping planning were associated with maintenance. In line with previous studies, the results indicate that commitment and reminders engender long-term behavioural change. PMID:26304476

  12. A simplified orthotropic formulation of the viscoplasticity theory based on overstress

    NASA Technical Reports Server (NTRS)

    Sutcu, M.; Krempl, E.

    1988-01-01

    An orthotropic, small strain viscoplasticity theory based on overstress is presented. In each preferred direction the stress is composed of time (rate) independent (or plastic) and viscous (or rate dependent) contributions. Tension-compression asymmetry can depend on direction and is included in the model. Upon a proper choice of a material constant one preferred direction can exhibit linear elastic response while the other two deform in a viscoplastic manner.

  13. The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction

    PubMed Central

    Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.

    2015-01-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448
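
    The equivalence above can be illustrated numerically: for stimuli projected onto a candidate filter, the empirical single-spike information is the KL divergence between the spike-triggered and raw distributions of the projection. The following sketch is illustrative only; the simulated LNP neuron, filter, and bin count are assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an LNP neuron: 1-D projection of Gaussian stimuli through a known filter,
# exponential nonlinearity, Poisson spike counts per bin.
n = 50_000
stim = rng.normal(size=(n, 10))
true_filter = rng.normal(size=10)
proj = stim @ true_filter / np.linalg.norm(true_filter)
rate = 0.1 * np.exp(proj)                 # expected spikes per bin
spikes = rng.poisson(rate)

def single_spike_info(proj, spikes, bins=30):
    """Empirical single-spike information (bits/spike):
    KL divergence between P(x|spike) and P(x) over the filter projection x."""
    edges = np.histogram_bin_edges(proj, bins=bins)
    p_raw, _ = np.histogram(proj, bins=edges)
    p_spk, _ = np.histogram(proj, bins=edges, weights=spikes)
    p_raw = p_raw / p_raw.sum()
    p_spk = p_spk / p_spk.sum()
    mask = (p_spk > 0) & (p_raw > 0)
    return np.sum(p_spk[mask] * np.log2(p_spk[mask] / p_raw[mask]))

print("single-spike information (bits/spike):", single_spike_info(proj, spikes))
```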

  14. Binary encoding method to encrypt Fourier-transformed information of digital images

    NASA Astrophysics Data System (ADS)

    Lin, Kuang Tsan

    2009-02-01

    An encoding method is used to encrypt the Fourier-transformed information of a hidden (covert) digital image in an overt image; the Fourier-transformed information is encoded with binary codes. All of the pixels in an overt image are classified into five groups that are called identification, type, tracing, dimension, and information codes. Identification codes are used to judge whether the overt image contains codes belonging to the proposed encoding method; type codes are used to judge the encoding type; tracing codes are used to judge the encoding trace; dimension codes are used to judge the size of the hidden information; and information codes are used to decode the hidden information. Applying the proposed encoding method is rather easy, and host images corresponding to overt images are not needed for decoding. Experiments demonstrated four types of encoding for the proposed method, reconstructing covert images with no distortion or with only slight distortion.
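
    The exact code layout (identification, type, tracing, dimension, and information codes) is specific to the paper; as a generic illustration of the underlying idea, the sketch below quantizes the FFT magnitude of a covert image to 8-bit binary codes and hides the bits in the least-significant bits of an overt image. The quantization and LSB embedding scheme are assumptions, not the paper's method:

```python
import numpy as np

def embed_fft_lsb(overt, covert, levels=256):
    """Quantize the FFT magnitude of a covert image to 8-bit codes and hide the
    bits in the least-significant bits of the overt image (illustrative only)."""
    spec = np.abs(np.fft.fft2(covert))
    spec = spec / spec.max()
    codes = np.round(spec * (levels - 1)).astype(np.uint8)   # 8-bit codes
    bits = np.unpackbits(codes.ravel())                      # binary code stream
    flat = overt.astype(np.uint8).ravel().copy()
    if bits.size > flat.size:
        raise ValueError("overt image too small to hold the covert spectrum")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits      # overwrite LSBs
    return flat.reshape(overt.shape)

overt = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
covert = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
stego = embed_fft_lsb(overt, covert)
print("pixels changed:", np.count_nonzero(stego != overt))
```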

  15. Study of information extraction method of water body based on Mapping satellite-1 imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Zhaoxi; Hu, Zhuowei; Du, Hongyue

    2014-11-01

    To find a suitable method of extracting water bodies from Mapping satellite-1 imagery in urban areas, this paper compared the results of different water extraction methods and studied the effect of building shadows on water extraction. Taking parts of Xinjiang (Beitun, in the Irtysh River Basin) as the study area and Mapping satellite-1 imagery as the data source, the single-band threshold method, the Normalized Difference Water Index (NDWI), the Normalized Difference Vegetation Index (NDVI), and the spectral relationship method based on spectral area were applied to extract water body information. The extraction results of the four methods were compared to determine the optimal method for water body extraction. The results showed that, for Mapping satellite-1 imagery, the single-band threshold method and the spectral relationship method based on spectral area were effective in eliminating classification errors caused by building shadows, and were fast, simple, and accurate for extracting water body information in urban areas. In addition, the spectral relationship method based on spectral area was also able to extract small tributary rivers.
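
    A minimal sketch of one of the compared indices, the NDWI, thresholded to produce a water mask (the band arrays, threshold value, and Mapping satellite-1 band handling are assumptions):

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """NDWI = (Green - NIR) / (Green + NIR); pixels above the threshold are water."""
    green = green.astype(float)
    nir = nir.astype(float)
    ndwi = (green - nir) / (green + nir + 1e-9)  # small epsilon avoids division by zero
    return ndwi > threshold

green = np.random.randint(0, 1024, (100, 100))
nir = np.random.randint(0, 1024, (100, 100))
mask = ndwi_water_mask(green, nir)
print("water pixels:", mask.sum())
```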

  16. A Privacy-Preserved Analytical Method for eHealth Database with Minimized Information Loss

    PubMed Central

    Chen, Ya-Ling; Cheng, Bo-Chao; Chen, Hsueh-Lin; Lin, Chia-I; Liao, Guo-Tan; Hou, Bo-Yu; Hsu, Shih-Chun

    2012-01-01

    Digitizing medical information is an emerging trend that employs information and communication technology (ICT) to manage health records, diagnostic reports, and other medical data more effectively, in order to improve the overall quality of medical services. However, medical information is highly confidential and involves private information, so even legitimate access to data raises privacy concerns. Medical records provide health information on an as-needed basis for diagnosis and treatment, and the information is also important for medical research and other health management applications. Traditional privacy risk management systems have focused on reducing re-identification risk, and they do not consider information loss. In addition, such systems cannot identify and isolate data that carries a high risk of privacy violations. This paper proposes the Hiatus Tailor (HT) system, which ensures low re-identification risk for medical records, while providing more authenticated information to database users and identifying high-risk data in the database for better system management. The experimental results demonstrate that the HT system achieves much lower information loss than traditional risk management methods, with the same risk of re-identification. PMID:22969273

  17. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    NASA Astrophysics Data System (ADS)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new and blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator that performs better at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. Subsequently, we propose an SNR-compliant method based on subspace analysis and a training genetic algorithm to obtain the performance of both. Moreover, our method uses only a single antenna, in contrast to previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
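
    The AIC and MDL enumerators referred to above are, in their classical Wax-Kailath form, functions of the eigenvalues of the sample covariance matrix. The sketch below implements that standard form, not the paper's subspace/genetic-algorithm extension; the array size, snapshot count, and noise level are assumptions:

```python
import numpy as np

def aic_mdl_num_sources(eigvals, n_snapshots):
    """Wax-Kailath AIC/MDL estimates of the number of sources from the
    eigenvalues of a p x p sample covariance matrix."""
    lam = np.sort(eigvals)[::-1]
    p = lam.size
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]
        m = p - k
        log_ratio = np.sum(np.log(tail)) / m - np.log(np.mean(tail))  # log(geo mean / arith mean)
        ll = n_snapshots * m * log_ratio
        aic.append(-2 * ll + 2 * k * (2 * p - k))
        mdl.append(-ll + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
    return int(np.argmin(aic)), int(np.argmin(mdl))

# Toy example: 3 sources in white noise observed on an 8-element array.
rng = np.random.default_rng(0)
p, n, k_true = 8, 500, 3
A = rng.normal(size=(p, k_true))
X = A @ rng.normal(size=(k_true, n)) + 0.1 * rng.normal(size=(p, n))
R = X @ X.T / n
print("AIC, MDL estimates:", aic_mdl_num_sources(np.linalg.eigvalsh(R), n))
```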

  18. A method of building information extraction based on mathematical morphology and multiscale

    NASA Astrophysics Data System (ADS)

    Li, Jing-wen; Wang, Ke; Zhang, Zi-ping; Xue, Long-li; Yin, Shou-qiang; Zhou, Song

    2015-12-01

    To monitor changes in buildings on the Earth's surface, this paper analyzes the distribution characteristics of buildings in remote sensing imagery and, drawing on the advantages of multi-scale image segmentation and mathematical morphology, proposes a segmentation method for high-resolution remote sensing images that combines the two. Building information is then extracted using a multiple fuzzy classification method together with a shadow-based auxiliary method. Compared with k-means classification and the traditional maximum likelihood classification method, the experimental results show that the proposed multi-scale morphological segmentation and extraction approach extracts building structure more accurately and yields clearer classification data, providing a basis and theoretical support for intelligent Earth observation monitoring.
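
    Purely as a schematic illustration of the morphological component, binary opening with structuring elements of several sizes can suppress small bright clutter before candidate building regions are extracted; the structuring-element sizes, threshold, and toy image below are assumptions:

```python
import numpy as np
from scipy import ndimage

def multiscale_opening(binary, sizes=(3, 5, 9)):
    """Apply binary opening with square structuring elements of several sizes
    and return one mask per scale (larger sizes keep only larger structures)."""
    return {s: ndimage.binary_opening(binary, structure=np.ones((s, s))) for s in sizes}

rng = np.random.default_rng(0)
image = rng.random((200, 200))
image[50:120, 60:140] += 1.0            # a bright "building" block
binary = image > 0.9                    # crude brightness threshold (assumed)
masks = multiscale_opening(binary)
for s, m in masks.items():
    n_regions = ndimage.label(m)[1]
    print(f"scale {s}: {m.sum()} candidate building pixels in {n_regions} regions")
```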

  19. Measuring information flow in cellular networks by the systems biology method through microarray data

    PubMed Central

    Chen, Bor-Sen; Li, Cheng-Wei

    2015-01-01

    In general, it is very difficult to measure the information flow in a cellular network directly. In this study, based on an information flow model and microarray data, we measured the information flow in cellular networks indirectly by using a systems biology method. First, we used a recursive least square parameter estimation algorithm to identify the system parameters of coupling signal transduction pathways and the cellular gene regulatory network (GRN). Then, based on the identified parameters and systems theory, we estimated the signal transductivities of the coupling signal transduction pathways from the extracellular signals to each downstream protein and the information transductivities of the GRN between transcription factors in response to environmental events. According to the proposed method, the information flow, which is characterized by signal transductivity in coupling signaling pathways and information transductivity in the GRN, can be estimated by microarray temporal data or microarray sample data. It can also be estimated by other high-throughput data such as next-generation sequencing or proteomic data. Finally, the information flows of the signal transduction pathways and the GRN in leukemia cancer cells and non-leukemia normal cells were also measured to analyze the systematic dysfunction in this cancer from microarray sample data. The results show that the signal transductivities of signal transduction pathways change substantially from normal cells to leukemia cancer cells. PMID:26082788
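
    The identification step above relies on recursive least squares (RLS); a generic, minimal RLS sketch for a linear regression model is shown below. The regressors, forgetting factor, and noise level are assumptions, and the actual pathway/GRN model structure of the paper is not reproduced:

```python
import numpy as np

def rls_identify(X, y, lam=0.99, delta=1e3):
    """Recursive least squares for y_t = x_t . theta + noise.
    lam: forgetting factor; delta: initial covariance scale."""
    n, d = X.shape
    theta = np.zeros(d)
    P = delta * np.eye(d)
    for t in range(n):
        x = X[t]
        Px = P @ x
        k = Px / (lam + x @ Px)          # gain vector
        theta = theta + k * (y[t] - x @ theta)
        P = (P - np.outer(k, Px)) / lam  # covariance update
    return theta

rng = np.random.default_rng(0)
true_theta = np.array([0.8, -0.3, 1.2])
X = rng.normal(size=(2000, 3))
y = X @ true_theta + 0.05 * rng.normal(size=2000)
print("estimated parameters:", rls_identify(X, y))
```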

  20. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers

    PubMed Central

    Branscum, Paul; Lora, Karina R.

    2016-01-01

    Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study’s purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of child’s consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine nutrition-related behaviors that mothers found as most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetable and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis), and internal consistency reliability (Cronbach’s alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers’ consumption of fruits and vegetables, and SSB. PMID:27271643

  1. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers.

    PubMed

    Branscum, Paul; Lora, Karina R

    2016-01-01

    Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study's purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of child's consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine nutrition-related behaviors that mothers found as most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetable and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis), and internal consistency reliability (Cronbach's alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers' consumption of fruits and vegetables, and SSB. PMID:27271643

  2. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    PubMed Central

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021
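
    The construction of the information wave is not specified in this abstract; as a point of reference, the conventional Hilbert-transform envelope spectrum that the method is compared against can be sketched as follows (the simulated bearing-fault signal and its parameters are assumptions):

```python
import numpy as np
from scipy.signal import hilbert

fs = 12_000                      # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)
fault_freq = 107.0               # assumed outer-race fault frequency (Hz)
carrier = 3_000.0                # assumed resonance excited by the impacts (Hz)

# Toy bearing signal: impacts at the fault frequency amplitude-modulating a resonance.
impacts = (np.sin(2 * np.pi * fault_freq * t) > 0.99).astype(float)
bursts = np.convolve(impacts, np.exp(-np.arange(200) / 20.0), mode="same")
signal = bursts * np.sin(2 * np.pi * carrier * t) \
         + 0.1 * np.random.default_rng(0).normal(size=t.size)

envelope = np.abs(hilbert(signal))                 # Hilbert-transform envelope
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope frequency (Hz):", freqs[np.argmax(spectrum)])
```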

  3. A New Method for Geometric Quality Evaluation of Remote Sensing Image Based on Information Entropy

    NASA Astrophysics Data System (ADS)

    Jiao, W.; Long, T.; Yang, G.; He, G.

    2014-11-01

    Geometric accuracy of a rectified remote sensing image is usually evaluated by the root-mean-square errors (RMSEs) of the ground control points (GCPs) and check points (CPs). These discrete accuracy indices represent only the local quality of the image in a statistical sense. In addition, the traditional methods only evaluate the difference between the rectified image and the reference image, ignoring the degree of distortion in the original image. A new method for geometric quality evaluation of remote sensing images based on information entropy is proposed in this paper. The information entropy, the amount of information, and the uncertainty interval of the image before and after rectification are derived according to information theory. Four kinds of rectification models and seven GCP distribution scenarios are applied to the remotely sensed imagery in the experiments. The factors affecting geometric accuracy are analysed and the geometric quality of the image is evaluated in various situations. Results show that the proposed method can be used to evaluate the rectification model, the distribution of GCPs and the uncertainty of the remotely sensed imagery, and is an effective and objective assessment method.
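
    A minimal sketch of the basic quantity used above, the Shannon entropy of an image's grey-level histogram, computed before and after a crude stand-in for rectification resampling (the resampling step and the toy image are assumptions):

```python
import numpy as np

def image_entropy(img, levels=256):
    """Shannon entropy (bits/pixel) of the grey-level histogram of an 8-bit image."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
original = rng.integers(0, 256, (512, 512), dtype=np.uint8)
# A crude stand-in for rectification resampling: nearest-neighbour zoom by repetition.
rectified = np.repeat(np.repeat(original[::2, ::2], 2, axis=0), 2, axis=1)
print("entropy before:", image_entropy(original))
print("entropy after :", image_entropy(rectified))
```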

  4. A feature extraction method based on information theory for fault diagnosis of reciprocating machinery.

    PubMed

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  5. Spectral-spatial classification combined with diffusion theory based inverse modeling of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paluchowski, Lukasz A.; Bjorgan, Asgeir; Nordgaard, Hâvard B.; Randeberg, Lise L.

    2016-02-01

    Hyperspectral imagery opens a new perspective for biomedical diagnostics and tissue characterization. High spectral resolution can give insight into the optical properties of skin tissue. At the same time, however, the amount of collected data poses a challenge for decomposition into clusters and extraction of useful diagnostic information. In this study, spectral-spatial classification and inverse diffusion modeling were applied to hyperspectral images obtained from a porcine burn model using a hyperspectral push-broom camera. The implemented method takes advantage of spatial and spectral information simultaneously, and provides information about the average optical properties within each cluster. The implemented algorithm allows mapping of the spectral and spatial heterogeneity of the burn injury as well as dynamic changes of spectral properties within the burn area. The combination of statistical and physics-informed tools allowed initial separation of different burn wounds and further detailed characterization of the injuries at short post-injury times.

  6. Investigation on Coding Method of Dental X-ray Image for Integrated Hospital Information System

    NASA Astrophysics Data System (ADS)

    Seki, Takashi; Hamamoto, Kazuhiko

    Recently, medical information systems in the dental field have become digital. In such systems, X-ray images can be acquired with digital modalities and input to the system directly. Consequently, it is easy to combine the image data with the alphanumerical data stored in the conventional medical information system, and it is useful to manipulate alphanumerical data and image data simultaneously. The purpose of this research is to develop a new coding method for dental X-ray images. The method reduces the disk space needed to store the images and allows the images to be transmitted efficiently over the Internet or a LAN. Multi-resolution analysis (the wavelet transform) is applied to accomplish this purpose. The proposed method achieves a lower bit rate than the conventional method.
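
    A minimal sketch of the multi-resolution (wavelet) idea using PyWavelets; the wavelet family, decomposition depth, and threshold are assumptions, and the paper's actual coding scheme is not reproduced:

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.random((256, 256))                       # stand-in for a dental X-ray image

coeffs = pywt.wavedec2(image, wavelet="db2", level=3)          # multi-resolution analysis
threshold = 0.1
compressed = [coeffs[0]] + [
    tuple(pywt.threshold(band, threshold, mode="soft") for band in detail)
    for detail in coeffs[1:]
]
reconstructed = pywt.waverec2(compressed, wavelet="db2")[:256, :256]

kept = sum(np.count_nonzero(band) for detail in compressed[1:] for band in detail)
total = sum(band.size for detail in coeffs[1:] for band in detail)
print("detail coefficients kept:", kept, "of", total)
print("reconstruction RMSE:", np.sqrt(np.mean((reconstructed - image) ** 2)))
```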

  7. Non-intrusive hybrid interval method for uncertain nonlinear systems using derivative information

    NASA Astrophysics Data System (ADS)

    Liu, Zhuang-Zhuang; Wang, Tian-Shu; Li, Jun-Feng

    2016-02-01

    This paper proposes a new non-intrusive hybrid interval method using derivative information for the dynamic response analysis of nonlinear systems with uncertain-but-bounded parameters and/or initial conditions. This method provides tighter solution ranges compared to the existing polynomial approximation interval methods. Interval arithmetic using the Chebyshev basis and interval arithmetic using the general form modified affine basis for polynomials are developed to obtain tighter bounds for interval computation. To further reduce the overestimation caused by the "wrapping effect" of interval arithmetic, the derivative information of dynamic responses is used to achieve exact solutions when the dynamic responses are monotonic with respect to all the uncertain variables. Finally, two typical numerical examples with nonlinearity are applied to demonstrate the effectiveness of the proposed hybrid interval method, in particular, its ability to effectively control the overestimation for specific timepoints.

  8. The Effect of Health Information Technology on Health Care Provider Communication: A Mixed-Method Protocol

    PubMed Central

    Adler-Milstein, Julia; Harrod, Molly; Sales, Anne; Hofer, Timothy P; Saint, Sanjay; Krein, Sarah L

    2015-01-01

    Background Communication failures between physicians and nurses are one of the most common causes of adverse events for hospitalized patients, as well as a major root cause of all sentinel events. Communication technology (ie, the electronic medical record, computerized provider order entry, email, and pagers), which is a component of health information technology (HIT), may help reduce some communication failures but increase others because of an inadequate understanding of how communication technology is used. Increasing use of health information and communication technologies is likely to affect communication between nurses and physicians. Objective The purpose of this study is to describe, in detail, how health information and communication technologies facilitate or hinder communication between nurses and physicians with the ultimate goal of identifying how we can optimize the use of these technologies to support effective communication. Effective communication is the process of developing shared understanding between communicators by establishing, testing, and maintaining relationships. Our theoretical model, based in communication and sociology theories, describes how health information and communication technologies affect communication through communication practices (ie, use of rich media; the location and availability of computers) and work relationships (ie, hierarchies and team stability). Therefore we seek to (1) identify the range of health information and communication technologies used in a national sample of medical-surgical acute care units, (2) describe communication practices and work relationships that may be influenced by health information and communication technologies in these same settings, and (3) explore how differences in health information and communication technologies, communication practices, and work relationships between physicians and nurses influence communication. Methods This 4-year study uses a sequential mixed-methods

  9. Multidimensional mutual information methods for the analysis of covariation in multiple sequence alignments

    PubMed Central

    2014-01-01

    Background Several methods are available for the detection of covarying positions from a multiple sequence alignment (MSA). If the MSA contains a large number of sequences, information about the proximities between residues derived from covariation maps can be sufficient to predict a protein fold. However, in many cases the structure is already known, and information on the covarying positions can be valuable to understand the protein mechanism and dynamic properties. Results In this study we have sought to determine whether a multivariate (multidimensional) extension of traditional mutual information (MI) can be an additional tool to study covariation. The performance of two multidimensional MI (mdMI) methods, designed to remove the effect of ternary/quaternary interdependencies, was tested with a set of 9 MSAs each containing <400 sequences, and was shown to be comparable to that of the newest methods based on maximum entropy/pseudolikelihood statistical models of protein sequences. However, while all the methods tested detected a similar number of covarying pairs among the residues separated by < 8 Å in the reference X-ray structures, there was on average less than 65% overlap between the top scoring pairs detected by methods that are based on different principles. Conclusions Given the large variety of structure and evolutionary history of different proteins it is possible that a single best method to detect covariation in all proteins does not exist, and that for each protein family the best information can be derived by merging/comparing results obtained with different methods. This approach may be particularly valuable in those cases in which the size of the MSA is small or the quality of the alignment is low, leading to significant differences in the pairs detected by different methods. PMID:24886131
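
    The quantity being extended here is the pairwise mutual information between two alignment columns; a minimal sketch of that baseline (toy columns, natural-log units), not the multidimensional mdMI correction itself, is:

```python
import numpy as np
from collections import Counter

def column_mi(col_i, col_j):
    """Mutual information (nats) between two columns of a multiple sequence alignment."""
    assert len(col_i) == len(col_j)
    n = len(col_i)
    count_i = Counter(col_i)
    count_j = Counter(col_j)
    count_ij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), c_ab in count_ij.items():
        # p(a,b) / (p(a) p(b)) expressed with raw counts
        mi += (c_ab / n) * np.log(c_ab * n / (count_i[a] * count_j[b]))
    return mi

# Toy example: two perfectly covarying columns vs. two independent ones.
col1 = list("AAAACCCCGGGG")
col2 = list("TTTTGGGGAAAA")          # covaries with col1
col3 = list("ACGTACGTACGT")          # unrelated to col1
print("MI(covarying):  ", column_mi(col1, col2))
print("MI(independent):", column_mi(col1, col3))
```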

  10. Application of information-retrieval methods to the classification of physical data

    NASA Technical Reports Server (NTRS)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  11. Developing and testing theory-based and evidence-based interventions to promote switching to arsenic-safe wells in Bangladesh.

    PubMed

    Inauen, Jennifer; Mosler, Hans-Joachim

    2014-12-01

    Millions of people in Bangladesh drink arsenic-contaminated water despite increased awareness of consequences to health. Theory-based and evidence-based interventions are likely to have greater impact on people switching to existing arsenic-safe wells than providing information alone. To test this assumption, we first developed interventions based on an empirical test of the Risk, Attitudes, Norms, Abilities and Self-regulation (RANAS) model of behaviour change. In the second part of this study, a cluster-randomised controlled trial revealed that in accordance with our hypotheses, information alone showed smaller increases in switching to arsenic-safe wells than information with reminders or information with reminders and implementation intentions. PMID:23864069

  12. An Augmented Classical Least Squares Method for Quantitative Raman Spectral Analysis against Component Information Loss

    PubMed Central

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. The Raman spectral signals with low analyte concentration correlations were selected and used as the substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined by using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment of analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and existing methods. Results indicated that the proposed method is effective at increasing the robust predictive power of the traditional CLS model against component information loss, and its predictive power is comparable to that of PLS or PCR. PMID:23956689
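
    A minimal sketch of the CLS calibration step and the augmentation idea, in which the concentration matrix is extended with a selected spectral signal standing in for the unmodelled component; the synthetic spectra, component count, and choice of augmenting signal are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200

# Synthetic pure-component spectra: two known analytes plus one unknown interferent.
pure = np.abs(rng.normal(size=(3, n_wavelengths)))
conc = np.abs(rng.normal(size=(n_samples, 3)))
spectra = conc @ pure + 0.01 * rng.normal(size=(n_samples, n_wavelengths))

C_known = conc[:, :2]                       # only 2 analytes have reference concentrations
aug_signal = spectra[:, 150]                # assumed low-correlation signal used as substitute
C_aug = np.column_stack([C_known, aug_signal])

# CLS calibration K = pinv(C) S, then prediction c = s pinv(K) for a new spectrum.
K_cls = np.linalg.pinv(C_known) @ spectra
K_acls = np.linalg.pinv(C_aug) @ spectra
test = conc[0] @ pure
print("CLS estimate :", (test @ np.linalg.pinv(K_cls))[:2], "true:", conc[0, :2])
print("ACLS estimate:", (test @ np.linalg.pinv(K_acls))[:2])
```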

  13. A randomised controlled trial of a theory-based intervention to improve sun protective behaviour in adolescents ('you can still be HOT in the shade'): study protocol

    PubMed Central

    2012-01-01

    Background Most skin cancers are preventable by encouraging consistent use of sun protective behaviour. In Australia, adolescents have high levels of knowledge and awareness of the risks of skin cancer but exhibit significantly lower sun protection behaviours than adults. There is limited research aimed at understanding why people do or do not engage in sun protective behaviour, and an associated absence of theory-based interventions to improve sun safe behaviour. This paper presents the study protocol for a school-based intervention which aims to improve the sun safe behaviour of adolescents. Methods/design Approximately 400 adolescents (aged 12-17 years) will be recruited through Queensland, Australia public and private schools and randomized to the intervention (n = 200) or 'wait-list' control group (n = 200). The intervention focuses on encouraging supportive sun protective attitudes and beliefs, fostering perceptions of normative support for sun protection behaviour, and increasing perceptions of control/self-efficacy over using sun protection. It will be delivered during three × one hour sessions over a three week period from a trained facilitator during class time. Data will be collected one week pre-intervention (Time 1), and at one week (Time 2) and four weeks (Time 3) post-intervention. Primary outcomes are intentions to sun protect and sun protection behaviour. Secondary outcomes include attitudes toward performing sun protective behaviours (i.e., attitudes), perceptions of normative support to sun protect (i.e., subjective norms, group norms, and image norms), and perceived control over performing sun protective behaviours (i.e., perceived behavioural control). Discussion The study will provide valuable information about the effectiveness of the intervention in improving the sun protective behaviour of adolescents. PMID:22212211

  14. Alberta Diabetes and Physical Activity Trial (ADAPT): A randomized theory-based efficacy trial for adults with type 2 diabetes - rationale, design, recruitment, evaluation, and dissemination

    PubMed Central

    2010-01-01

    Background The primary aim of this study was to compare the efficacy of three physical activity (PA) behavioural intervention strategies in a sample of adults with type 2 diabetes. Method/Design Participants (N = 287) were randomly assigned to one of three groups consisting of the following intervention strategies: (1) standard printed PA educational materials provided by the Canadian Diabetes Association [i.e., Group 1/control group)]; (2) standard printed PA educational materials as in Group 1, pedometers, a log book and printed PA information matched to individuals' PA stage of readiness provided every 3 months (i.e., Group 2); and (3) PA telephone counseling protocol matched to PA stage of readiness and tailored to personal characteristics, in addition to the materials provided in Groups 1 and 2 (i.e., Group 3). PA behaviour measured by the Godin Leisure Time Exercise Questionnaire and related social-cognitive measures were assessed at baseline, 3, 6, 9, 12 and 18-months (i.e., 6-month follow-up). Clinical (biomarkers) and health-related quality of life assessments were conducted at baseline, 12-months, and 18-months. Linear Mixed Model (LMM) analyses will be used to examine time-dependent changes from baseline across study time points for Groups 2 and 3 relative to Group 1. Discussion ADAPT will determine whether tailored but low-cost interventions can lead to sustainable increases in PA behaviours. The results may have implications for practitioners in designing and implementing theory-based physical activity promotion programs for this population. Clinical Trials Registration ClinicalTrials.gov identifier: NCT00221234 PMID:20067626

  15. Parenting Practices of Anxious and Nonanxious Mothers: A Multi-Method, Multi-Informant Approach

    ERIC Educational Resources Information Center

    Drake, Kelly L.; Ginsburg, Golda S.

    2011-01-01

    Anxious and nonanxious mothers were compared on theoretically derived parenting and family environment variables (i.e., overcontrol, warmth, criticism, anxious modeling) using multiple informants and methods. Mother-child dyads completed questionnaires about parenting and were observed during an interactional task. Findings reveal that, after…

  16. Method and Apparatus Providing Deception and/or Altered Operation in an Information System Operating System

    DOEpatents

    Cohen, Fred; Rogers, Deanna T.; Neagoe, Vicentiu

    2008-10-14

    A method and/or system and/or apparatus providing deception and/or execution alteration in an information system. In specific embodiments, deceptions and/or protections are provided by intercepting and/or modifying operation of one or more system calls of an operating system.

  17. Shape Recovery: A Visual Method for Evaluation of Information Retrieval Experiments.

    ERIC Educational Resources Information Center

    Rorvig, Mark; Fitzpatrick, Steven

    2000-01-01

    The method described permits visual analysis of information retrieval (IR) experiment results in classic control and treatment group protocols. It is an additional analysis technique for the evaluation of IR procedures that may be conducted within the rich terrain of human visual acuity, supported by two well-known statistical measures. (Contains…

  18. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  19. Reduction in redundancy of multichannel telemetric information by the method of adaptive discretization with associative sorting

    NASA Technical Reports Server (NTRS)

    Kantor, A. V.; Timonin, V. G.; Azarova, Y. S.

    1974-01-01

    The method of adaptive discretization is the most promising for elimination of redundancy from telemetry messages characterized by signal shape. Adaptive discretization with associative sorting was considered as a way to avoid the shortcomings of adaptive discretization with buffer smoothing and adaptive discretization with logical switching in on-board information compression devices (OICD) in spacecraft. Mathematical investigations of OICD are presented.

  20. Genetically Informative Research on Adolescent Substance Use: Methods, Findings, and Challenges

    ERIC Educational Resources Information Center

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

    Objective: To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method: A selective review of genetically informative research strategies, their limitations, and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents is presented.…

  1. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ...The United States Environmental Protection Agency (EPA) is requesting information and citations on approaches and methods for the planning, analysis, assessment, and characterization of cumulative risks to human populations and the environment. The EPA is developing guidelines for the assessment of cumulative risk as defined and characterized in the EPA 2003 publication Framework for......

  2. THE RELATIVE EFFECTIVENESS OF THE TRADITIONAL AND TWO MODIFIED METHODS OF ORGANIZING INFORMATION SHEETS.

    ERIC Educational Resources Information Center

    PUCEL, DAVID J.

    The effectiveness of a typical method of organizing technical information sheets used by vocational educators to provide up-to-date instruction to students was compared to that of two newly developed organizations based on "The Subsumption Theory of Meaningful Verbal Learning and Retention" (Ausubel, 1962). An operational definition stated an…

  3. The Implementation and Effectiveness of Geographic Information Systems Technology and Methods in Secondary Education

    ERIC Educational Resources Information Center

    Kerski, Joseph J.

    2003-01-01

    Geographic information systems (GIS) technology and methods have transformed decision-making in society by bringing geographic analysis to the desktop computer. Although some educators consider GIS to be a promising means for implementing reform, it has been adopted by less than 2 percent of American high schools. The reasons behind the interest…

  4. An Inquiry-Based Approach to Teaching Research Methods in Information Studies

    ERIC Educational Resources Information Center

    Albright, Kendra; Petrulis, Robert; Vasconcelos, Ana; Wood, Jamie

    2012-01-01

    This paper presents the results of a project that aimed at restructuring the delivery of research methods training at the Information School at the University of Sheffield, UK, based on an Inquiry-Based Learning (IBL) approach. The purpose of this research was to implement inquiry-based learning that would allow customization of research methods…

  5. Estimation of IRT Graded Response Models: Limited versus Full Information Methods

    ERIC Educational Resources Information Center

    Forero, Carlos G.; Maydeu-Olivares, Alberto

    2009-01-01

    The performance of parameter estimates and standard errors in estimating F. Samejima's graded response model was examined across 324 conditions. Full information maximum likelihood (FIML) was compared with a 3-stage estimator for categorical item factor analysis (CIFA) when the unweighted least squares method was used in CIFA's third stage. CIFA…

  6. Methods study for the relocation of visual information in central scotoma cases

    NASA Astrophysics Data System (ADS)

    Scherlen, Anne-Catherine; Gautier, Vincent

    2005-03-01

    In this study we test the benefit to reading performance of different ways of relocating the visual information hidden under the scotoma. Relocation (or unmasking) compensates for the loss of information and prevents the patient from developing viewing strategies that are poorly suited to reading. Eight healthy subjects were tested on a reading task in which central scotomas of various sizes were simulated. We then evaluated reading speed (words/min) for three relocation methods: relocating all masked information on both sides of the scotoma, relocating it to the right of the scotoma, and relocating only the letters essential for word recognition to the right of the scotoma. These reading speeds were compared with the pathological condition, i.e. without relocating visual information. Our results show that the unmasking strategy improves reading speed when all of the visual information is unmasked to the right of the scotoma, but only for large scotomas. Taking word morphology into account, perceiving only certain letters outside the scotoma can be sufficient to improve reading speed. A deeper study of reading processes in the presence of a scotoma will then open new perspectives for visual information unmasking. Multidisciplinary expertise from engineers, ophthalmologists, linguists, and clinicians would make it possible to optimize the reading benefit of unmasking.

  7. A Comparison of Limited-Information and Full-Information Methods in M"plus" for Estimating Item Response Theory Parameters for Nonnormal Populations

    ERIC Educational Resources Information Center

    DeMars, Christine E.

    2012-01-01

    In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…

  8. An information processing method for acoustic emission signal inspired from musical staff

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Wu, Chunxian

    2016-01-01

    This study proposes a musical-staff-inspired signal processing method that provides a standard descriptive expression for discrete signals and characterizes the integrated features of acoustic emission (AE) signals. The method maps various AE signals acquired in complex environments into a normalized musical space. Four new indexes are proposed to comprehensively describe the signal. Several key features, such as contour, amplitude, and signal changing rate, are quantitatively expressed in the normalized musical space. The processed information requires only a small storage space while maintaining high fidelity. The method is illustrated using experiments on sandstones and computed tomography (CT) scanning to determine its validity for AE signal processing.

  9. Method for Bandwidth Compression and Transmission of Environmental Information in Bilateral Teleoperation

    NASA Astrophysics Data System (ADS)

    Kubo, Ryogo; Ohnishi, Kouhei

    In this paper, a novel method for bandwidth compression and transmission of environmental information is proposed for bilateral teleoperation systems with multiple degrees of freedom (MDOF). In this method, environmental information, i.e., the position of end-effectors and the reaction force exerted on them, is converted into environmental modes by using discrete Fourier transform (DFT) matrices. The environmental modes to be transmitted are then selected on the basis of the communication bandwidth between master and slave robots. Bilateral control is achieved in low-frequency modal spaces, and local position control is achieved in high-frequency modal spaces. The validity of the proposed method is confirmed by performing an experiment.
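
    A minimal sketch of the modal transform: DOF-space positions are mapped to environmental modes with a DFT matrix, only the lowest-frequency modes (and their conjugate partners) are retained for transmission, and the receiver inverts the transform. The number of DOF, the number of retained modes, and the position signal are assumptions:

```python
import numpy as np

n_dof = 8
keep = 3                                     # number of low-frequency modes transmitted

# Unitary DFT matrix converting DOF-space vectors into environmental modes.
W = np.fft.fft(np.eye(n_dof)) / np.sqrt(n_dof)

positions = np.sin(np.linspace(0, np.pi, n_dof)) \
            + 0.01 * np.random.default_rng(0).normal(size=n_dof)
modes = W @ positions                        # environmental modes
modes_tx = np.zeros_like(modes)
modes_tx[:keep] = modes[:keep]               # transmit only the low-frequency modes
modes_tx[-(keep - 1):] = modes[-(keep - 1):] # keep conjugate partners so the inverse is real
reconstructed = (W.conj().T @ modes_tx).real # receiver side: inverse transform

print("max reconstruction error:", np.max(np.abs(reconstructed - positions)))
```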

  10. Preferred Methods for Delivery of Technological Information by the North Carolina Agricultural Extension Service: Opinions of Agricultural Producers Who Use Extension Information.

    ERIC Educational Resources Information Center

    Richardson, John G.; Mustian, R. David

    The findings of a questionnaire survey of 702 North Carolina agricultural producers indicated that communication methods historically used by the North Carolina Agricultural Extension Service for information dissemination are accepted by state farmers and continue to be popular. Information delivery methods most frequently preferred are…

  11. Improvements in recall and food choices using a graphical method to deliver information of select nutrients.

    PubMed

    Pratt, Nathan S; Ellison, Brenna D; Benjamin, Aaron S; Nakamura, Manabu T

    2016-01-01

    Consumers have difficulty using nutrition information. We hypothesized that graphically delivering information of select nutrients relative to a target would allow individuals to process information in time-constrained settings more effectively than numerical information. Objectives of the study were to determine the efficacy of the graphical method in (1) improving memory of nutrient information and (2) improving consumer purchasing behavior in a restaurant. Values of fiber and protein per calorie were 2-dimensionally plotted alongside a target box. First, a randomized cued recall experiment was conducted (n=63). Recall accuracy of nutrition information improved by up to 43% when shown graphically instead of numerically. Second, the impact of graphical nutrition signposting on diner choices was tested in a cafeteria. Saturated fat and sodium information was also presented using color coding. Nutrient content of meals (n=362) was compared between 3 signposting phases: graphical, nutrition facts panels (NFP), or no nutrition label. Graphical signposting improved nutrient content of purchases in the intended direction, whereas NFP had no effect compared with the baseline. Calories ordered from total meals, entrées, and sides were significantly less during graphical signposting than no-label and NFP periods. For total meal and entrées, protein per calorie purchased was significantly higher and saturated fat significantly lower during graphical signposting than the other phases. Graphical signposting remained a predictor of calories and protein per calorie purchased in regression modeling. These findings demonstrate that graphically presenting nutrition information makes that information more available for decision making and influences behavior change in a realistic setting. PMID:26773780

  12. Information Preservation (IP) Method in Simulation of Internal Rarefied Gas Flows in MEMS

    NASA Astrophysics Data System (ADS)

    Shen, Ching

    2005-05-01

    This paper first reviews methods for treating low-speed rarefied gas flows: the linearised Boltzmann equation, the Lattice Boltzmann method (LBM), the Navier-Stokes equation with slip boundary conditions, and the DSMC method, and discusses the difficulties in simulating low-speed transitional MEMS flows, especially internal flows. In particular, the present version of the LBM is shown to be unfeasible for simulating MEMS flows in the transitional regime. The information preservation (IP) method overcomes the difficulty of statistical simulation caused by the small information-to-noise ratio of low-speed flows by preserving the average information of the enormous number of molecules that each simulated molecule represents. A validation of the method is given in this paper. The specific features of internal flows in MEMS, i.e. the low speed and the large length-to-width ratio, lead to a problem of elliptic nature: the inlet and outlet boundary conditions influence each other and must be regulated jointly. Through the example of an IP calculation of a microchannel flow (thousands of μm long), it is shown that adopting a conservative scheme for the mass conservation equation together with the super-relaxation method resolves this problem successfully. Using the same measures, the IP method solves the thin-film air bearing problem in the transitional regime for an authentic hard disc write/read head length (L = 1000 μm) and provides a pressure distribution in full agreement with the generalized Reynolds equation, whereas previously the DSMC check of the validity of the Reynolds equation had been done only for a short (L = 5 μm) drive head. The author suggests degenerating the Reynolds equation to solve the microchannel flow problem in the transitional regime, thus providing a means, with the merit of strict kinetic theory, for testing various methods intended to treat internal MEMS flows.

  13. APhoRISM FP7 project: the A Priori information for Earthquake damage mapping method

    NASA Astrophysics Data System (ADS)

    Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno

    2014-05-01

    The APhoRISM - Advanced PRocedure for volcanIc and Seismic Monitoring - project is an FP7-funded project which aims at developing and testing two new methods that combine Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new and improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes and concerns the generation of maps for the detection and estimation of damage caused by an earthquake. The method is named APE - A Priori information for Earthquake damage mapping. The use of satellite data to investigate earthquake damage is not new. Indeed, a wide body of literature and several projects have addressed this issue, but the proposed approaches are usually based only on change detection techniques and/or classification algorithms. The novelty of APhoRISM-APE lies in the exploitation of a priori information derived from: - InSAR time series to measure surface movements - shakemaps obtained from seismological data - vulnerability information. This a priori information is then integrated with change detection maps from Earth observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and to limit false alarms.

  14. Application of Multi-Sensor Information Fusion Method Based on Rough Sets and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Xue, Jinxue; Wang, Guohu; Wang, Xiaoqiang; Cui, Fengkui

    In order to improve the precision and data processing speed of multi-sensor information fusion, a multi-sensor data fusion algorithm is studied in this paper. First, rough set theory (RS) is used for attribute reduction of the parameter set, exploiting its advantages in handling large amounts of data to eliminate redundant information. Then, the reduced data are trained and classified by a Support Vector Machine (SVM). Experimental results showed that this method can improve the speed and accuracy of the multi-sensor fusion system.
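
    A schematic stand-in for the pipeline is sketched below; rough-set attribute reduction is replaced by a simple univariate feature-selection step (an explicit substitution, not the paper's RS reduction), followed by an SVM classifier:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic multi-sensor feature matrix: 20 attributes, only 5 informative.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Attribute reduction" stand-in (univariate selection) followed by SVM classification.
model = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=5), SVC(kernel="rbf"))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```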

  15. Control theory based airfoil design for potential flow and a finite volume discretization

    NASA Technical Reports Server (NTRS)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.

  16. Theory-Based Interventions in Physical Activity: A Systematic Review of Literature in Iran

    PubMed Central

    Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya

    2015-01-01

    Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Using models and theories of physical activity, we decided to systematically study the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words, including the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to the educational methods, almost all studies used a combination of methods. The most widely used integrative educational method was group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, needs assessment in using models, epidemiology and methodology consultation, addressing maintenance of physical activity, using other theories and models such as social marketing and social-cognitive theory, and other educational methods such as empirical and complementary methods are suggested. PMID:25948454

  17. Theory-based interventions in physical activity: a systematic review of literature in Iran.

    PubMed

    Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya

    2015-01-01

    Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Using models and theories of physical activity, we decided to systematically study the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words, including the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to the educational methods, almost all studies used a combination of methods. The most widely used integrative educational method was group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, needs assessment in using models, epidemiology and methodology consultation, addressing maintenance of physical activity, using other theories and models such as social marketing and social-cognitive theory, and other educational methods such as empirical and complementary methods are suggested. PMID:25948454

  18. Post-Reconstruction Non-Local Means Filtering Methods using CT Side Information for Quantitative SPECT

    PubMed Central

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-01-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved −2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs
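
    The CT-guided variants (NLM CT-B/M/S/H) are specific to this work, but the core non-local means weighting they build on can be sketched compactly. Below is a minimal, unoptimized 2D NLM filter for a grayscale numpy image; the patch size, search window, and smoothing parameter h are illustrative choices, not the authors' settings.

```python
import numpy as np

def nlm_filter(img, patch=3, search=7, h=0.1):
    """Minimal non-local means: each output pixel is a weighted average of
    pixels in a search window, weighted by patch similarity."""
    pr, sr = patch // 2, search // 2
    padded = np.pad(img, pr + sr, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr          # centre in padded coords
            ref = padded[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            weights, values = [], []
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)    # mean squared patch distance
                    weights.append(np.exp(-d2 / (h * h)))
                    values.append(padded[ni, nj])
            weights = np.asarray(weights)
            out[i, j] = np.dot(weights, values) / weights.sum()
    return out

# toy usage: a noisy constant image should be smoothed toward its mean
noisy = 0.5 + 0.05 * np.random.randn(32, 32)
denoised = nlm_filter(noisy)
```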

  19. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs

  20. A Novel Method Incorporating Gene Ontology Information for Unsupervised Clustering and Feature Selection

    PubMed Central

    Srivastava, Shireesh; Zhang, Linxia; Jin, Rong; Chan, Christina

    2008-01-01

    Background Among the primary goals of microarray analysis is the identification of genes that could distinguish between different phenotypes (feature selection). Previous studies indicate that incorporating prior information of the genes' function could help identify physiologically relevant features. However, current methods that incorporate prior functional information do not provide a relative estimate of the effect of different genes on the biological processes of interest. Results Here, we present a method that integrates gene ontology (GO) information and expression data using Bayesian regression mixture models to perform unsupervised clustering of the samples and identify physiologically relevant discriminating features. As a model application, the method was applied to identify the genes that play a role in the cytotoxic responses of human hepatoblastoma cell line (HepG2) to saturated fatty acid (SFA) and tumor necrosis factor (TNF)-α, as compared to the non-toxic response to the unsaturated FFAs (UFA) and TNF-α. Incorporation of prior knowledge led to a better discrimination of the toxic phenotypes from the others. The model identified roles of lysosomal ATPases and adenylate cyclase (AC9) in the toxicity of palmitate. To validate the role of AC in palmitate-treated cells, we measured the intracellular levels of cyclic AMP (cAMP). The cAMP levels were found to be significantly reduced by palmitate treatment and not by the other FFAs, in accordance with the model selection of AC9. Conclusions A framework is presented that incorporates prior ontology information, which helped to (a) perform unsupervised clustering of the phenotypes, and (b) identify the genes relevant to each cluster of phenotypes. We demonstrate the proposed framework by applying it to identify physiologically-relevant feature genes that conferred differential toxicity to saturated vs. unsaturated FFAs. The framework can be applied to other problems to efficiently integrate ontology

  1. Application of damage mechanism-specific NDE methods in support of risk-informed inspections

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk-informed inservice inspection (RISI) programs effectively concentrate limited and costly examination resources on systems and locations most relevant to plant safety. The thought process used in the selection of nondestructive evaluation (NDE) methods and procedures in a RISI program is expected to change toward integrating NDE into integrity management, with a concentration on understanding failure mechanisms. Identifying which damage mechanisms may be operative in specific locations and applying appropriate NDE methods to detect the presence of these damage mechanisms is fundamental to effective RISI application. Considerable information is already available on inspection for damage mechanisms such as intergranular stress corrosion cracking (IGSCC), thermal fatigue, and erosion-corrosion. Similar procedures are under development for other damage mechanisms that may occur individually or in combination with other mechanisms. Guidance is provided on application of NDE procedures in an RISI framework to facilitate implementation by utility staff (Gosselin, 1996).

  2. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA and describe. PMID:25676999

  3. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  4. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  5. A Topologically-Informed Hyperstreamline Seeding Method for Alignment Tensor Fields.

    PubMed

    Fu, Fred; Abukhdeir, Nasser Mohieddin

    2015-03-01

    A topologically-informed hyperstreamline seeding method is presented for visualization of alignment tensor fields. The method is inspired by and applied to visualization of nematic liquid crystal (LC) orientation dynamics simulations. The method distributes hyperstreamlines along domain boundaries and edges of a nearest-neighbor graph whose vertices are degenerate regions of the alignment tensor field, which correspond to orientational defects in a nematic LC domain. This is accomplished without iteration while conforming to a user-specified spacing between hyperstreamlines and avoids possible failure modes associated with hyperstreamline integration in the vicinity of degeneracies in alignment (orientational defects). It is shown that the presented seeding method enables automated hyperstreamline-based visualization of a broad range of alignment tensor fields which enhances the ability of researchers to interpret these fields and provides an alternative to using glyph-based techniques. PMID:26357072

  6. Statistical method using operating room information system data to determine anesthetist weekend call requirements.

    PubMed

    Dexter, F; Macario, A; Traub, R D

    2000-02-01

    We present a statistical method that uses data from surgical services information systems to determine the minimum number of anesthetists to be scheduled for weekend call in an operating room suite. The method predicts the staffing coverage that provides sufficient anesthetists to cover each hour of a 24-hour weekend period while satisfying a specified risk of being understaffed. The statistical method incorporates shifts of varying start times and durations, as well as historical weekend operating room caseload data. By using this method to schedule weekend staff, an anesthesia group can ensure that as few anesthetists are on call as possible, and for as few hours as possible, while maintaining the level of risk of understaffing that the anesthesia group is willing to accept. An anesthesia group can also use the method to calculate its risk of being understaffed in the surgical suite based on its existing weekend staffing plan. PMID:10876448
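
    The abstract does not give the exact estimator, but the underlying idea, choosing the smallest staffing level whose historical probability of being exceeded stays below a tolerated risk, can be sketched as below. The hourly caseload matrix, the Poisson toy data, and the 5% risk level are illustrative assumptions, not the authors' model.

```python
import numpy as np

def min_staff_per_hour(hourly_cases, risk=0.05):
    """hourly_cases: array of shape (n_weekends, 24) with historical counts of
    simultaneous cases per hour. Returns, for each hour, the smallest staffing
    level s such that the empirical P(cases > s) <= risk."""
    hourly_cases = np.asarray(hourly_cases)
    n_weekends, n_hours = hourly_cases.shape
    staff = np.zeros(n_hours, dtype=int)
    for h in range(n_hours):
        demand = hourly_cases[:, h]
        s = 0
        while np.mean(demand > s) > risk:
            s += 1
        staff[h] = s
    return staff

# illustrative history: 52 weekends x 24 hours of concurrent case counts
rng = np.random.default_rng(0)
history = rng.poisson(lam=1.5, size=(52, 24))
print(min_staff_per_hour(history, risk=0.05))
```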

  7. [Land salinization information extraction method based on HSI hyperspectral and TM imagery].

    PubMed

    Li, Jin; Zhao, Geng-Xing; Chang, Chun-Yan; Liu, Hai-Teng

    2014-02-01

    This paper chose the typical salinization area in Kenli County of the Yellow River Delta as the study area, selected the HJ-1A satellite HSI image of March 15, 2011 and the TM image of March 22, 2011 as information sources, and pre-processed these data by image cropping, geometric correction and atmospheric correction. Spectral characteristics of the main land use types, including lands of different salinization degrees, water and shoals, were analyzed to find distinct bands for information extraction. A land use information extraction model was built by adopting quantitative and qualitative rules that combine the spectral characteristics with soil salinity content. Land salinization information was extracted via image classification using a decision tree method. The remote sensing image interpretation accuracy was verified against the land salinization degree, which was determined through soil salinity chemical analysis of soil sampling points. In addition, the classification accuracies of the hyperspectral and multi-spectral images were analyzed and compared. The results showed that the overall image classification accuracy of HSI was 96.43% with a Kappa coefficient of 95.59%, while the overall image classification accuracy of TM was 89.17% with a Kappa coefficient of 86.74%. Therefore, compared to multi-spectral TM data, the hyperspectral imagery is more accurate and efficient for land salinization information extraction. Also, the classification map showed that the soil salinity distinction degree of the hyperspectral image was higher than that of the multi-spectral image. This study explored land salinization information extraction techniques from hyperspectral imagery, extracted the spatial distribution and area ratio information of lands of different salinization degrees, and provided a decision-making basis for the scientific utilization and management of coastal salinized land resources in the Yellow River Delta. PMID:24822432
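
    The reported overall accuracy and Kappa coefficient follow from a standard confusion-matrix calculation; a minimal sketch with an illustrative (not the paper's) confusion matrix is shown below.

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    c = np.asarray(confusion, dtype=float)
    total = c.sum()
    observed = np.trace(c) / total                                   # overall accuracy
    expected = (c.sum(axis=0) * c.sum(axis=1)).sum() / total ** 2    # chance agreement
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# illustrative 3-class confusion matrix (not the paper's data)
conf = [[50, 3, 2],
        [4, 45, 1],
        [2, 2, 41]]
acc, kappa = accuracy_and_kappa(conf)
print(f"overall accuracy = {acc:.3f}, kappa = {kappa:.3f}")
```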

  8. Review of methods for handling confounding by cluster and informative cluster size in clustered data

    PubMed Central

    Seaman, Shaun; Pavlou, Menelaos; Copas, Andrew

    2014-01-01

    Clustered data are common in medical research. Typically, one is interested in a regression model for the association between an outcome and covariates. Two complications that can arise when analysing clustered data are informative cluster size (ICS) and confounding by cluster (CBC). ICS and CBC mean that the outcome of a member given its covariates is associated with, respectively, the number of members in the cluster and the covariate values of other members in the cluster. Standard generalised linear mixed models for cluster-specific inference and standard generalised estimating equations for population-average inference assume, in general, the absence of ICS and CBC. Modifications of these approaches have been proposed to account for CBC or ICS. This article is a review of these methods. We express their assumptions in a common format, thus providing greater clarity about the assumptions that methods proposed for handling CBC make about ICS and vice versa, and about when different methods can be used in practice. We report relative efficiencies of methods where available, describe how methods are related, identify a previously unreported equivalence between two key methods, and propose some simple additional methods. Unnecessarily using a method that allows for ICS/CBC has an efficiency cost when ICS and CBC are absent. We review tools for identifying ICS/CBC. A strategy for analysis when CBC and ICS are suspected is demonstrated by examining the association between socio-economic deprivation and preterm neonatal death in Scotland. PMID:25087978

  9. ROI-preserving 3D video compression method utilizing depth information

    NASA Astrophysics Data System (ADS)

    Ti, Chunli; Xu, Guodong; Guan, Yudong; Teng, Yidan

    2015-09-01

    Efficiently transmitting the extra information of three-dimensional (3D) video is becoming a key issue in the development of 3DTV. The 2D-plus-depth format not only occupies less bandwidth and remains compatible with transmission over existing channels, but can also provide technical support for advanced 3D video compression to some extent. This paper proposes an ROI-preserving compression scheme to further improve the visual quality at a limited bit rate. Based on the connection between the focus of the Human Visual System (HVS) and depth information, the region of interest (ROI) can be automatically selected via depth map processing. The main improvement over common methods is that a mean-shift-based segmentation is applied to the depth map before foreground ROI selection to preserve the integrity of the scene. Besides, the sensitive areas along the edges are also protected. Spatio-temporal filtering adapted to H.264 is applied to the non-ROI regions of both the 2D video and the depth map before compression. Experiments indicate that the ROI extracted by this method is more intact and more consistent with subjective perception, and that the proposed method preserves the key high-frequency information more effectively while the bit rate is reduced.

  10. Aphasic speech with and without SentenceShaper: Two methods for assessing informativeness.

    PubMed

    Fink, Ruth B; Bartlett, Megan R; Lowery, Jennifer S; Linebarger, Marcia C; Schwartz, Myrna F

    2008-01-01

    BACKGROUND: SentenceShaper® (SSR) is a computer program that is for speech what a word-processing program is for written text; it allows the user to record words and phrases, play them back, and manipulate them on-screen to build sentences and narratives. A recent study demonstrated that when listeners rated the informativeness of functional narratives produced by chronic aphasic speakers with and without the program they gave higher informativeness ratings to the language produced with the aid of the program (Bartlett, Fink, Schwartz, & Linebarger, 2007). Bartlett et al. (2007) also compared unaided (spontaneous) narratives produced before and after the aided version of the narrative was obtained. In a subset of comparisons, the sample created after was judged to be more informative; they called this "topic-specific carryover". AIMS: (1) To determine whether differences in informativeness that Bartlett et al.'s listeners perceived are also revealed by Correct Information Unit (CIU) analysis (Nicholas & Brookshire, 1993)-a well studied, objective method for measuring informativeness-and (2) to demonstrate the usefulness of CIU analysis for samples of this type. METHODS & PROCEDURES: A modified version of the CIU analysis was applied to the speech samples obtained by Bartlett et al. (2007). They had asked five individuals with chronic aphasia to create functional narratives on two topics, under three conditions: Unaided ("U"), Aided ("SSR"), & Post-SSR Unaided ("Post-U"). Here, these samples were analysed for differences in % CIUs across conditions. Linear associations between listener judgements and CIU measures were evaluated with bivariate correlations and multiple regression analysis. OUTCOMES & RESULTS: (1) The aided effect was confirmed: samples produced with SentenceShaper had higher % CIUs, in most cases exceeding 90%. (2) There was little CONCLUSIONS: That the percentage of CIUs was higher in SSR-aided samples than in

  11. A Method to Quantify Visual Information Processing in Children Using Eye Tracking.

    PubMed

    Kooiker, Marlou J G; Pel, Johan J M; van der Steen-Kant, Sanny P; van der Steen, Johannes

    2016-01-01

    Visual problems that occur early in life can have a major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four-choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently to enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii) construct an individual visual profile for each child. PMID:27500922
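
    The abstract does not specify how the gaze-based parameters are computed, but a plausible reaction-time calculation from raw gaze samples can be sketched as follows; the quadrant geometry, screen size, and minimum dwell time are assumptions for illustration.

```python
import numpy as np

def reaction_time(gaze_xy, t, target_quadrant, screen=(1920, 1080), min_dwell=0.1):
    """Time from stimulus onset until gaze first dwells in the target quadrant.

    gaze_xy: (n, 2) numpy array of gaze positions in screen pixels;
    t: (n,) numpy array of sample times in seconds (0 = stimulus onset);
    min_dwell: minimum continuous time inside the quadrant to count as a fixation.
    """
    cx, cy = screen[0] / 2, screen[1] / 2
    x, y = gaze_xy[:, 0], gaze_xy[:, 1]
    in_quad = {"TL": (x < cx) & (y < cy), "TR": (x >= cx) & (y < cy),
               "BL": (x < cx) & (y >= cy), "BR": (x >= cx) & (y >= cy)}[target_quadrant]
    start = None
    for i, inside in enumerate(in_quad):
        if inside:
            if start is None:
                start = i
            if t[i] - t[start] >= min_dwell:
                return t[start]              # onset of the first qualifying dwell
        else:
            start = None
    return None                              # target never fixated

# toy trace: gaze jumps into the top-right quadrant after 0.4 s
t = np.arange(0, 1.0, 0.02)
gaze = np.where(t[:, None] < 0.4, [400.0, 800.0], [1500.0, 200.0])
print("reaction time:", reaction_time(gaze, t, "TR"))
```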

  12. The Theory-based Influence of Map Features on Risk Beliefs: Self-reports of What is Seen and Understood for Maps Depicting an Environmental Health Hazard

    PubMed Central

    Vatovec, Christine

    2013-01-01

    Theory-based research is needed to understand how maps of environmental health risk information influence risk beliefs and protective behavior. Using theoretical concepts from multiple fields of study including visual cognition, semiotics, health behavior, and learning and memory supports a comprehensive assessment of this influence. We report results from thirteen cognitive interviews that provide theory-based insights into how visual features influenced what participants saw and the meaning of what they saw as they viewed three formats of water test results for private wells (choropleth map, dot map, and a table). The unit of perception, color, proximity to hazards, geographic distribution, and visual salience had substantial influences on what participants saw and their resulting risk beliefs. These influences are explained by theoretical factors that shape what is seen, properties of features that shape cognition (pre-attentive, symbolic, visual salience), information processing (top-down and bottom-up), and the strength of concrete compared to abstract information. Personal relevance guided top-down attention to proximal and larger hazards that shaped stronger risk beliefs. Meaning was more local for small perceptual units and global for large units. Three aspects of color were important: pre-attentive “incremental risk” meaning of sequential shading, symbolic safety meaning of stoplight colors, and visual salience that drew attention. The lack of imagery, geographic information, and color diminished interest in table information. Numeracy and prior beliefs influenced comprehension for some participants. Results guided the creation of an integrated conceptual framework for application to future studies. Ethics should guide the selection of map features that support appropriate communication goals. PMID:22715919

  13. Improved method for calculating the respiratory line length in the Concealed Information Test.

    PubMed

    Matsuda, Izumi; Ogawa, Tokihiro

    2011-08-01

    The Concealed Information Test (CIT) assesses an examinee's knowledge about a crime based on response differences between crime-relevant and crime-irrelevant items. One effective measure in the CIT is the respiration line length, which is the average of the moving distances of the respiration curve in a specified time interval after the item onset. However, the moving distance differs between parts of a respiratory cycle. As a result, the calculated respiration line length is biased by how the parts of the respiratory cycles are included in the time interval. To resolve this problem, we propose a weighted average method, which calculates the respiration line length per cycle and weights it with the proportion that the cycle occupies in the time interval. Simulation results indicated that the weighted average method removes the bias of respiration line lengths compared to the original method. The results of experimental CIT data demonstrated that the weighted average method significantly increased the discrimination performance as compared with the original method. The weighted average method is a promising method for assessing respiration changes in response to question items more accurately, which improves the respiration-based discrimination performance of the CIT. PMID:21689693
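
    The weighted average can be sketched directly from the description: the line length of each respiratory cycle is weighted by the proportion of the analysis window that the cycle occupies. The cycle segmentation and the window below are assumed inputs, not part of the published procedure.

```python
import numpy as np

def weighted_respiration_line_length(signal, cycle_bounds, window):
    """Weighted-average respiration line length over an analysis window.

    signal       : 1-D respiration trace (samples)
    cycle_bounds : list of (start, end) sample indices of full respiratory cycles
    window       : (start, end) sample indices of the analysis interval

    Each cycle's line length (sum of absolute sample-to-sample changes) is
    weighted by the fraction of the window that the cycle overlaps.
    """
    w_start, w_end = window
    total, weight_sum = 0.0, 0.0
    for c_start, c_end in cycle_bounds:
        overlap = max(0, min(c_end, w_end) - max(c_start, w_start))
        if overlap == 0:
            continue
        line_length = np.sum(np.abs(np.diff(signal[c_start:c_end])))
        weight = overlap / (w_end - w_start)
        total += weight * line_length
        weight_sum += weight
    return total / weight_sum if weight_sum > 0 else 0.0

# toy usage: a sinusoidal "breath" trace with three cycles, window covering part of it
resp = np.sin(np.linspace(0, 6 * np.pi, 300))
print(weighted_respiration_line_length(resp, [(0, 100), (100, 200), (200, 300)], (50, 250)))
```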

  14. The effects of method of presenting health plan information on HMO enrollment by Medicaid beneficiaries.

    PubMed Central

    Andrews, R M; Curbow, B A; Owen, E; Burke, A

    1989-01-01

    Marketing strategies are critical for enhancing HMO enrollments among Medicaid beneficiaries when they are provided a choice of health plans. This study examined one component of marketing HMOs--the method of communicating the HMO's attributes. The purpose of the analysis was to determine if characteristics of Medicaid beneficiaries who enroll in HMOs vary by method of communicating information about health plan options. Data were analyzed from the marketing component of California's Prepaid Health Research, Evaluation, and Demonstration (PHRED) project. Five communication methods are examined in the article: brochure, film, county eligibility worker presentation, state representative presentation, and HMO representative presentation. The analysis reveals that each communication method is most effective with a different type of beneficiary. No single consumer characteristic is related to HMO enrollment across all five methods, although lack of a private physician and dissatisfaction with a current provider are associated with choice in four methods. Film is the best method for attracting persons who have an ongoing relationship with a provider. PMID:2668236

  15. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-01

    The feasibility of using the cucurbituril host molecule as a probable actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed, and based on the binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structure, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in hexa-valent and penta-valent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of actinyls, whereas fluorination decreases the binding affinities as compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation state of the three actinyls. PMID:22471316

  16. Theory-based approaches to understanding public emergency preparedness: implications for effective health and risk communication.

    PubMed

    Paek, Hye-Jin; Hilyard, Karen; Freimuth, Vicki; Barge, J Kevin; Mindlin, Michele

    2010-06-01

    Recent natural and human-caused disasters have awakened public health officials to the importance of emergency preparedness. Guided by health behavior and media effects theories, the analysis of a statewide survey in Georgia reveals that self-efficacy, subjective norm, and emergency news exposure are positively associated with the respondents' possession of emergency items and their stages of emergency preparedness. Practical implications suggest less focus on demographics as the sole predictor of emergency preparedness and more comprehensive measures of preparedness, including both a person's cognitive stage of preparedness and checklists of emergency items on hand. We highlight the utility of theory-based approaches for understanding and predicting public emergency preparedness as a way to enable more effective health and risk communication. PMID:20574880

  17. HBNG: Graph theory based visualization of hydrogen bond networks in protein structures

    PubMed Central

    Tiwari, Abhishek; Tiwari, Vivek

    2007-01-01

    HBNG is a graph theory based tool for visualization of hydrogen bond networks in 2D. Digraphs generated by HBNG facilitate visualization of cooperativity and anticooperativity chains and rings in protein structures. HBNG takes hydrogen bond list files (output from HBAT, HBEXPLORE, HBPLUS and STRIDE) as input, generates a DOT language script, and constructs digraphs using the freeware AT&T Graphviz tool. HBNG is useful in the enumeration of favorable topologies of hydrogen bond networks in protein structures and in determining the effect of cooperativity and anticooperativity on protein stability and folding. HBNG can be applied to protein structure comparison and to the identification of secondary structural regions in protein structures. Availability: The program is available from the authors for non-commercial purposes. PMID:18084648
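
    HBNG's own input and output formats are not reproduced here, but the core step it describes, turning a hydrogen bond list into a DOT digraph for Graphviz, can be sketched as follows; the donor/acceptor pairs are illustrative.

```python
def hbonds_to_dot(hbonds, name="hbond_network"):
    """Write a Graphviz DOT digraph from a list of (donor, acceptor) atom/residue
    pairs, with the donor -> acceptor direction encoding each hydrogen bond."""
    lines = [f"digraph {name} {{"]
    for donor, acceptor in hbonds:
        lines.append(f'    "{donor}" -> "{acceptor}";')
    lines.append("}")
    return "\n".join(lines)

# illustrative hydrogen-bond list (donor, acceptor)
bonds = [("SER12:OG", "ASP45:OD1"), ("ASP45:N", "GLY41:O"), ("GLY41:N", "SER12:O")]
with open("network.dot", "w") as fh:
    fh.write(hbonds_to_dot(bonds))
# render with Graphviz, e.g.: dot -Tpng network.dot -o network.png
```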

  18. An open-shell restricted Hartree-Fock perturbation theory based on symmetric spin orbitals

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Jayatilaka, Dylan

    1993-01-01

    A new open-shell perturbation theory is formulated in terms of symmetric spin orbitals. Only one set of spatial orbitals is required, thereby reducing the number of independent coefficients in the perturbed wavefunctions. For second order, the computational cost is shown to be similar to a closed-shell calculation. This formalism is therefore more efficient than the recently developed RMP, ROMP or RMP-MBPT theories. The perturbation theory described herein was designed to have a close correspondence with our recently proposed coupled-cluster theory based on symmetric spin orbitals. The first-order wavefunction contains contributions from only doubly excited determinants. Equilibrium structures and vibrational frequencies determined from second-order perturbation theory are presented for OH, NH, CH, O2, NH2 and CH2.

  19. Informational Theory of Aging: The Life Extension Method Based on the Bone Marrow Transplantation

    PubMed Central

    Karnaukhov, Alexey V.; Karnaukhova, Elena V.; Sergievich, Larisa A.; Karnaukhova, Natalia A.; Bogdanenko, Elena V.; Manokhina, Irina A.; Karnaukhov, Valery N.

    2015-01-01

    A method of lifespan extension that is a practical application of the informational theory of aging is proposed. In this theory, the degradation (error accumulation) of the genetic information in cells is considered a main cause of aging. Accordingly, our method is based on the transplantation of genetically identical (or similar) stem cells with a lower number of genomic errors to old recipients. For humans and large mammals, this method can be realized by cryopreservation of their own stem cells, taken at a young age, for later autologous transplantation in old age. To test this method experimentally, we chose laboratory animals of relatively short lifespan (mice). Because it is difficult to isolate the required amount of stem cells (e.g., bone marrow) without significant damage to the animals, we used bone marrow transplantation from sacrificed inbred young donors. It is shown that the lifespan extension of recipients depends on the level of their genetic similarity (syngeneity) with donors. We achieved a lifespan increase of the experimental mice by 34% when transplantation of bone marrow with a high level of genetic similarity was used. PMID:26491435

  20. Data Delivery Method Based on Neighbor Nodes' Information in a Mobile Ad Hoc Network

    PubMed Central

    Hayashi, Takuma; Taenaka, Yuzo; Okuda, Takeshi; Yamaguchi, Suguru

    2014-01-01

    This paper proposes a data delivery method based on neighbor nodes' information to achieve reliable communication in a mobile ad hoc network (MANET). In a MANET, it is difficult to deliver data reliably due to instabilities in network topology and wireless network conditions that result from node movement. To overcome such unstable communication, opportunistic routing and network coding schemes have lately attracted considerable attention. Although an existing method that employs such schemes, MAC-independent opportunistic routing and encoding (MORE), Chachulski et al. (2007), improves the efficiency of data delivery in an unstable wireless mesh network, it does not address node movement. To efficiently deliver data in a MANET, the method proposed in this paper thus first employs the same opportunistic routing and network coding used in MORE and also uses the location information and transmission probabilities of neighbor nodes to adapt to changeable network topology and wireless network conditions. The simulation experiments showed that the proposed method can achieve efficient data delivery with low network load when the movement speed is relatively slow. PMID:24672371

  1. Aircraft target onboard detecting technology via Circular Information Matching method for remote sensing satellite

    NASA Astrophysics Data System (ADS)

    Xiao, Huachao; Zhou, Quan; Li, Li

    2015-10-01

    Onboard image information processing is one of the important technologies for rapidly deriving intelligence on remote sensing satellites. As a typical target, aircraft onboard detection has been receiving more attention. In this paper, we propose an efficient method of aircraft detection for remote sensing satellite onboard processing. According to the appearance of aircraft in remote sensing images, the detection algorithm consists of two steps. First, Salient Object Detection (SOD) is employed to reduce the amount of calculation on large remote sensing images. SOD uses Gabor filtering and a simple binary test between pixels in the filtered image. White points are connected into regions, and plane candidate regions are screened from the white regions by the area, length and width of each connected region. Next, a new algorithm, called the Circumferential Information Matching method, is used to detect aircraft in the candidate regions. Test results show that the circumference curve around the plane center has a stable shape, so the candidate regions can be accurately detected with this feature. To achieve rotation invariance, we use a circular matched filter to detect the target, and the discrete fast Fourier transform (DFFT) is used to accelerate and reduce the calculation. Experiments show that the detection accuracy of the proposed algorithm is 90% with less than 0.5 s of processing time. In addition, quantitative analysis shows that the computational cost of the proposed method is very small. Experimental results and theoretical analysis show that the proposed method is reasonable and highly efficient.
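
    A ring-shaped matched filter evaluated in the frequency domain, as described, can be sketched as below; the ring radius and width, the toy image, and the plain FFT cross-correlation are illustrative assumptions rather than the authors' exact implementation.

```python
import numpy as np

def ring_template(shape, radius, width):
    """Binary ring (annulus) template centred in an array of the given shape."""
    h, w = shape
    y, x = np.ogrid[:h, :w]
    r = np.hypot(y - h / 2, x - w / 2)
    return ((r >= radius - width / 2) & (r <= radius + width / 2)).astype(float)

def circular_match(image, radius, width):
    """Cross-correlate the image with a ring template via the FFT; peaks in the
    returned map indicate candidate centres of circular structures."""
    templ = ring_template(image.shape, radius, width)
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(templ))).real
    return np.fft.fftshift(corr)        # put the zero-lag response at the centre

# toy image containing one bright ring
img = ring_template((128, 128), radius=20, width=3)
response = circular_match(img, radius=20, width=3)
peak = np.unravel_index(np.argmax(response), response.shape)
print("strongest response near:", peak)
```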

  2. Progress toward scalable tomography of quantum maps using twirling-based methods and information hierarchies

    SciTech Connect

    Lopez, Cecilia C.; Bendersky, Ariel; Paz, Juan Pablo; Cory, David G.

    2010-06-15

    We present in a unified manner the existing methods for scalable partial quantum process tomography. We focus on two main approaches: the one presented in Bendersky et al. [Phys. Rev. Lett. 100, 190403 (2008)] and the ones described, respectively, in Emerson et al. [Science 317, 1893 (2007)] and Lopez et al. [Phys. Rev. A 79, 042328 (2009)], which can be combined together. The methods share an essential feature: They are based on the idea that the tomography of a quantum map can be efficiently performed by studying certain properties of a twirling of such a map. From this perspective, in this paper we present extensions, improvements, and comparative analyses of the scalable methods for partial quantum process tomography. We also clarify the significance of the extracted information, and we introduce interesting and useful properties of the χ-matrix representation of quantum maps that can be used to establish a clearer path toward achieving full tomography of quantum processes in a scalable way.

  3. Prognostic value of graph theory-based tissue architecture analysis in carcinomas of the tongue.

    PubMed

    Sudbø, J; Bankfalvi, A; Bryne, M; Marcelpoil, R; Boysen, M; Piffko, J; Hemmer, J; Kraft, K; Reith, A

    2000-12-01

    Several studies on oral squamous cell carcinomas (OSCC) suggest that the clinical value of traditional histologic grading is limited both by poor reproducibility and by low prognostic impact. However, the prognostic potential of a strictly quantitative and highly reproducible assessment of the tissue architecture in OSCC has not been evaluated. Using image analysis, in 193 cases of T1-2 (Stage I-II) OSCC we retrospectively investigated the prognostic impact of two graph theory-derived structural features: the average Delaunay Edge Length (DEL_av) and the average homogeneity of the Ulam Tree (ELH_av). Both structural features were derived from subgraphs of the Voronoi Diagram. The geometric centers of the cell nuclei were computed, generating a two-dimensional swarm of point-like seeds from which graphs could be constructed. The impact on survival of the computed values of ELH_av and DEL_av was estimated by the method of Kaplan and Meier, with relapse-free survival and overall survival as end-points. The prognostic values of DEL_av and ELH_av as computed for the invasive front, the superficial part of the carcinoma, the total carcinoma, and the normal-appearing oral mucosa were compared. For DEL_av, significant prognostic information was found in the invasive front (p < 0.001). No significant prognostic information was found in superficial part of the carcinoma (p = 0.34), in the carcinoma as a whole (p = 0.35), or in the normal-appearing mucosa (p = 0.27). For ELH_av, significant prognostic information was found in the invasive front (p = 0.01) and, surprisingly, in putatively normal mucosa (p = 0.03). No significant prognostic information was found in superficial parts of the carcinoma (p = 0.34) or in the total carcinoma (p = 0.11). In conclusion, strictly quantitative assessment of tissue architecture in the invasive front of OSCC yields highly prognostic information. PMID:11140700
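
    DEL_av, one of the two structural features, can be outlined in code: build the Delaunay triangulation of the nuclear centroids and average the lengths of its unique edges. The random point cloud below stands in for real centroids, and the Ulam-tree homogeneity feature is not shown.

```python
import numpy as np
from scipy.spatial import Delaunay

def average_delaunay_edge_length(points):
    """Average length of the unique edges in the Delaunay triangulation of a
    2-D point set (e.g., nuclear centroids)."""
    points = np.asarray(points, dtype=float)
    tri = Delaunay(points)
    edges = set()
    for a, b, c in tri.simplices:                 # each triangle contributes 3 edges
        for i, j in ((a, b), (b, c), (a, c)):
            edges.add((min(i, j), max(i, j)))
    lengths = [np.linalg.norm(points[i] - points[j]) for i, j in edges]
    return float(np.mean(lengths))

# illustrative point cloud standing in for nuclear centres
rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(200, 2))
print("DEL_av =", average_delaunay_edge_length(pts))
```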

  4. A blind image detection method for information hiding with double random-phase encoding

    NASA Astrophysics Data System (ADS)

    Sheng, Yuan; Xin, Zhou; Jian-guo, Chen; Yong-liang, Xiao; Qiang, Liu

    2009-07-01

    In this paper, a blind image detection method based on a statistical hypothesis test for information hiding with double random-phase encoding (DRPE) is proposed. This method aims to establish a quantitative criterion for judging whether secret information is embedded in a detected image. The main process can be described as follows: at the beginning, we decompose the detected gray-scale image into 8 bit planes, considering that it has 256 gray levels, and suppose that a secret image has been hidden in the detected image after being encrypted by DRPE, so that the lower bit planes of the detected image exhibit strong randomness. Then, we divide the bit plane to be tested into many windows and establish a statistical variable to measure the correlation between pixels in every window. Finally, we judge whether a secret image exists in the detected image by applying the t-test to all the statistical variables. Numerical simulation shows that the accuracy is quite satisfactory when we need to distinguish images carrying secret information from a large number of images.
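
    The abstract does not spell out the per-window statistic, so the sketch below substitutes a simple one (the fraction of horizontally adjacent pixel pairs that agree) and applies a one-sample t-test against the 0.5 expected for a fully random bit plane; the window size, the chosen statistic, and the significance level are all assumptions for illustration.

```python
import numpy as np
from scipy import stats

def bit_plane(img, k):
    """Extract bit plane k (0 = least significant) from an 8-bit grayscale image."""
    return (img >> k) & 1

def window_statistics(plane, win=16):
    """For each win x win window, the fraction of horizontally adjacent pixel
    pairs with equal values (a crude measure of within-window correlation)."""
    h, w = plane.shape
    vals = []
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            block = plane[i:i + win, j:j + win]
            vals.append(np.mean(block[:, :-1] == block[:, 1:]))
    return np.asarray(vals)

def looks_random(plane, alpha=0.01):
    """t-test of the window statistics against 0.5; failure to reject is taken
    here as the plane behaving like DRPE-encrypted (random-looking) content."""
    p = stats.ttest_1samp(window_statistics(plane), 0.5).pvalue
    return p > alpha

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print("LSB plane random-like:", looks_random(bit_plane(cover, 0)))
```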

  5. Support of Wheelchairs Using Pheromone Information with Two Types of Communication Methods

    NASA Astrophysics Data System (ADS)

    Yamamoto, Koji; Nitta, Katsumi

    In this paper, we propose a communication framework that combines two types of communication among wheelchairs and mobile devices. Due to their restricted range of activity, wheelchair users tend to shut themselves up in their houses. We developed a navigational wheelchair that carries a system displaying information on a map through the WWW. However, this wheelchair is expensive because it needs a solid PC, a precise GPS, a battery, and so on. We introduce mobile devices and use this framework to provide information to wheelchair users and to encourage them to go out. When a user encounters other users, they exchange the messages they hold via short-distance wireless communication. Once a message is delivered to a navigational wheelchair, the wheelchair uploads the message to the system. We use two types of pheromone information, which represent trends of users' movement and the existence of a crowd of users. First, when users gather, a "crowd of people pheromone" is emitted virtually. Users do not send these pheromones to the environment but carry them. If the density exceeds a threshold, messages expressing "people gathered" are generated automatically. The other pheromone is the "movement trend pheromone", which is used to improve the probability of successful transmissions. From the results of experiments, we concluded that our method can deliver the information that wheelchair users have gathered to other wheelchairs.

  6. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
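
    The flow described, local term statistics computed per document set and then contributed to a global set, can be sketched on a single machine with a process pool; the tokenization and the toy documents are illustrative.

```python
from collections import Counter
from multiprocessing import Pool

def local_term_stats(docs):
    """Term frequencies over one distinct set of documents (the 'local' step)."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.lower().split())
    return counts

def global_term_stats(doc_sets, workers=2):
    """Run the local step on each document set in parallel, then merge the
    local counters into a single global set of term statistics."""
    with Pool(workers) as pool:
        local_sets = pool.map(local_term_stats, doc_sets)
    global_counts = Counter()
    for local in local_sets:
        global_counts.update(local)
    return global_counts

if __name__ == "__main__":
    sets = [["the cat sat", "the dog ran"], ["a cat ran", "the cat slept"]]
    stats = global_term_stats(sets)
    print(stats.most_common(3))     # e.g. major terms by global frequency
```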

  7. Recording, display, and evaluation methods to obtain quantitative information from electron holograms

    SciTech Connect

    Voelkl, E.; Allard, L.F.; Frost, B.

    1999-04-01

    Digital recording has become a basic requirement for electron holography for many reasons. The fact that it allows live-time evaluation of the phase information and easy recording of a reference hologram are two very important reasons that are widely appreciated. Here the authors discuss requirements for recording electron holograms under the special conditions imposed by the Nyquist limit and the modulation transfer function (MTF) of the charge-coupled device (CCD) camera. As electron holography provides complex images carrying both the amplitude and phase of the image wave, the question of how to best display the information will be investigated. This is not an easy question, because special aspects of different applications require different solutions. Methods for display and evaluation of holographic data are described.

  8. Method for Detecting Core Malware Sites Related to Biomedical Information Systems

    PubMed Central

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. On average, the proactive elimination of core malicious websites results in an average improvement in zero-day attack detection of more than 20%. PMID:25821511
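
    The risk index described (several centrality measures collapsed into a single quantified score used to rank sites and locate core-hub nodes) can be sketched with networkx; the link graph, the choice of centralities, and the equal weighting are illustrative assumptions.

```python
import networkx as nx

def risk_ranking(edges):
    """Rank sites by a combined centrality score over the malware link graph.

    edges: iterable of (src_site, dst_site) redirection/link pairs.
    Returns sites sorted from highest to lowest combined risk index.
    """
    G = nx.DiGraph(edges)
    measures = [
        nx.degree_centrality(G),
        nx.closeness_centrality(G),
        nx.betweenness_centrality(G),
    ]
    risk = {n: sum(m[n] for m in measures) / len(measures) for n in G.nodes}
    return sorted(risk.items(), key=lambda kv: kv[1], reverse=True)

# illustrative crawl: landing sites funnelling through one distribution hub
crawl = [("landingA", "hub"), ("landingB", "hub"), ("landingC", "hub"),
         ("hub", "payload1"), ("hub", "payload2")]
for site, score in risk_ranking(crawl):
    print(f"{site:10s} {score:.3f}")
```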

  9. Method for detecting core malware sites related to biomedical information systems.

    PubMed

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. On average, the proactive elimination of core malicious websites results in an average improvement in zero-day attack detection of more than 20%. PMID:25821511

  10. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    NASA Astrophysics Data System (ADS)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works through the web sites of national and local governments has enabled us to discuss macroscopic financial trends. However, it is still difficult to grasp, as a basic nationwide property, how each spot was changed by public works. In this research, our purpose is to efficiently collect the road update information that various road managers provide, in order to realize efficient updating of various maps such as car navigation maps. In particular, we develop a system that automatically extracts the public works concerned and registers summaries, including position information, to a database from public work order outlooks released by each local government, combining several web mining technologies. Finally, we collect and register several tens of thousands of records from web sites all over Japan, and confirm the feasibility of our method.

  11. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a recent trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and the effective technical and human implementation of computer-based systems (ETHICS). We considered the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using quantitative methods is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are suitable for co-design and offers some suggestions for it.

  12. Designing and evaluating an effective theory-based continuing interprofessional education program to improve sepsis care by enhancing healthcare team collaboration.

    PubMed

    Owen, John A; Brashers, Valentina L; Littlewood, Keith E; Wright, Elisabeth; Childress, Reba Moyer; Thomas, Shannon

    2014-05-01

    Continuing interprofessional education (CIPE) differs from traditional continuing education (CE) in both the learning process and content, especially when it occurs in the workplace. Applying theories to underpin the development, implementation, and evaluation of CIPE activities informs educational design, encourages reflection, and enhances our understanding of CIPE and collaborative practice. The purpose of this article is to describe a process of design, implementation, and evaluation of CIPE through the application of explicit theories related to CIPE and workplace learning. A description of an effective theory-based program delivered to faculty and clinicians to enhance healthcare team collaboration is provided. Results demonstrated that positive changes in provider perceptions of and commitment to team-based care were achieved using this theory-based approach. Following this program, participants demonstrated a greater appreciation for the roles of other team members by indicating that more responsibility for implementing the Surviving Sepsis guideline should be given to nurses and respiratory therapists and less to physicians. Furthermore, a majority (86%) of the participants made commitments to demonstrate specific collaborative behaviors in their own practice. The article concludes with a discussion of our enhanced understanding of CIPE and a reinterpretation of the learning process which has implications for future CIPE workplace learning activities. PMID:24593326

  13. Comparison of Information Dissemination Methods in Inle Lake: A Lesson for Reconsidering Framework for Environmental Education Strategies

    ERIC Educational Resources Information Center

    Oo, Htun Naing; Sutheerawatthana, Pitch; Minato, Takayuki

    2010-01-01

    This article analyzes the practice of information dissemination regarding pesticide usage in floating gardening in a rural area. The analysis reveals reasons why the current information dissemination methods employed by relevant stakeholders do not work. It then puts forward a proposition that information sharing within organizations of and among…

  14. A Novel Method of Multi-Information Acquisition for Electromagnetic Flow Meters

    PubMed Central

    Cui, Wenhua; Li, Bin; Chen, Jie; Li, Xinwei

    2015-01-01

    In this paper, a novel method is proposed for multi-information acquisition from the electromagnetic flow meter, using magnetic excitation to measure the fluid velocity and electrochemical impedance spectroscopy (EIS) for both the fluid quality and the contamination level of the transducer. The impedance spectra of the transducer are measured with an additional electrical stimulus in series with the electrode measurement loop. The series connection mode, instead of the parallel one, improves the signal-to-noise ratio (SNR) of the fluid velocity measurement and offers a wide range of impedance measurements by using a sample capacitance. In addition, a multi-frequency synchronous excitation source is synthesized based on the method of dual-base power sequences for fast EIS measurement. The conductivity measurements in the range from 1.7 μS/cm to 2 mS/cm showed relatively high accuracy, with a measurement error of 5%, and electrode adhesion detection on electrodes both with and without coating demonstrated the ability to qualitatively determine electrode adhesion, which validated the feasibility of the multi-information acquisition method for the electromagnetic flow meter (EMFM). PMID:26712762

  15. Geographic Information System Software to Remodel Population Data Using Dasymetric Mapping Methods

    USGS Publications Warehouse

    Sleeter, Rachel; Gould, Michael

    2007-01-01

    The U.S. Census Bureau provides decadal demographic data collected at the household level and aggregated to larger enumeration units for anonymity purposes. Although this system is appropriate for the dissemination of large amounts of national demographic data, the boundaries of the enumeration units often do not reflect the distribution of the underlying statistical phenomena. Conventional mapping methods, such as choropleth mapping, are primarily employed due to their ease of use. However, the analytical drawbacks of choropleth methods are well known, ranging from (1) the artificial transition of population at the boundaries of mapping units to (2) the assumption that the phenomenon is evenly distributed across the enumeration unit (when in actuality there can be significant variation). Many methods to map population distribution have been practiced in the geographic information systems (GIS) and remote sensing fields. Many cartographers prefer dasymetric mapping to map population because of its ability to more accurately distribute data over geographic space. Similar to choropleth maps, a dasymetric map utilizes standardized data (for example, census data). However, rather than using arbitrary enumeration zones to symbolize population distribution, a dasymetric approach introduces ancillary information to redistribute the standardized data into zones relative to land use and land cover (LULC), taking into consideration actual changing densities within the boundaries of the enumeration unit. Thus, new zones are created that correlate to the function of the map, capturing spatial variations in population density. The transfer of data from census enumeration units to ancillary-driven homogeneous zones is performed by a process called areal interpolation.
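
    The closing areal-interpolation step, transferring census counts to ancillary-driven land-cover zones, can be sketched as a simple weighted redistribution; the land-cover classes, relative density weights, and zone areas below are illustrative, not the USGS coefficients.

```python
def dasymetric_redistribute(population, zones):
    """Redistribute one enumeration unit's population to its land-cover zones.

    zones: list of (land_cover_class, area) tuples for the unit.
    The relative density weights per class are assumptions for illustration.
    """
    density_weight = {"urban": 1.0, "agricultural": 0.1, "forest": 0.02, "water": 0.0}
    shares = [area * density_weight[cls] for cls, area in zones]
    total = sum(shares)
    return [(cls, population * s / total) for (cls, _), s in zip(zones, shares)]

# a census tract of 10,000 people split over three land-cover zones (km^2)
tract_zones = [("urban", 4.0), ("agricultural", 10.0), ("forest", 6.0)]
for cls, pop in dasymetric_redistribute(10_000, tract_zones):
    print(f"{cls:13s} {pop:8.1f} people")
```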

  16. A Novel Method of Multi-Information Acquisition for Electromagnetic Flow Meters.

    PubMed

    Cui, Wenhua; Li, Bin; Chen, Jie; Li, Xinwei

    2015-01-01

    In this paper, a novel method is proposed for multi-information acquisition from the electromagnetic flow meter, using magnetic excitation to measure the fluid velocity and electrochemical impedance spectroscopy (EIS) for both the fluid quality and the contamination level of the transducer. The impedance spectra of the transducer are measured with an additional electrical stimulus in series with the electrode measurement loop. The series connection mode, instead of the parallel one, improves the signal-to-noise ratio (SNR) of the fluid velocity measurement and offers a wide range of impedance measurements by using a sample capacitance. In addition, a multi-frequency synchronous excitation source is synthesized based on the method of dual-base power sequences for fast EIS measurement. The conductivity measurements in the range from 1.7 μS/cm to 2 mS/cm showed relatively high accuracy, with a measurement error of 5%, and electrode adhesion detection on electrodes both with and without coating demonstrated the ability to qualitatively determine electrode adhesion, which validated the feasibility of the multi-information acquisition method for the electromagnetic flow meter (EMFM). PMID:26712762

  17. A Method for Evaluating Tuning Functions of Single Neurons based on Mutual Information Maximization

    NASA Astrophysics Data System (ADS)

    Brostek, Lukas; Eggert, Thomas; Ono, Seiji; Mustari, Michael J.; Büttner, Ulrich; Glasauer, Stefan

    2011-03-01

    We introduce a novel approach for the evaluation of neuronal tuning functions, which can be expressed by the conditional probability of observing a spike given any combination of independent variables. This probability can be estimated from experimentally available data. By maximizing the mutual information between the probability distribution of spike occurrence and that of the variables, the dependence of the spike on the input variables is maximized as well. We used this method to analyze the dependence of neuronal activity in cortical area MSTd on signals related to eye movement and retinal image movement.
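
    As a rough, hedged sketch of the core quantity (not the authors' estimator), the snippet below estimates the mutual information between a binary spike train and one binned independent variable from paired samples; the bin count and all names are illustrative assumptions.

```python
import numpy as np

def mutual_information(spikes, variable, n_bins=20):
    """Estimate I(spike; variable) in bits from paired samples.

    spikes   : binary array (1 = spike in time bin, 0 = no spike)
    variable : continuous independent variable, same length
    """
    edges = np.linspace(variable.min(), variable.max(), n_bins + 1)
    x = np.clip(np.digitize(variable, edges) - 1, 0, n_bins - 1)

    # Joint and marginal probabilities estimated by counting.
    joint = np.zeros((n_bins, 2))
    for xi, si in zip(x, spikes):
        joint[xi, int(si)] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    ps = joint.sum(axis=0, keepdims=True)

    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ ps)[nz])))

# The tuning-function evaluation described above would then search over
# candidate combinations/transformations of the input variables and keep
# the one that maximizes this estimate.
```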

  18. Methods of Hematoxylin and Eosin Image Information Acquisition and Optimization in Confocal Microscopy

    PubMed Central

    Yoon, Woong Bae; Kim, Hyunjin; Kim, Kwang Gi; Choi, Yongdoo; Chang, Hee Jin

    2016-01-01

    Objectives We produced hematoxylin and eosin (H&E) staining-like color images by using confocal laser scanning microscopy (CLSM), which can obtain the same or more information in comparison to conventional tissue staining. Methods We improved images by using several image conversion techniques, including morphological methods, color space conversion methods, and segmentation methods. Results The image obtained after processing showed coloring very similar to that of images produced by H&E staining, and analysis through fluorescent dye imaging and microscopy offers advantages over analysis based on a single microscopic image. Conclusions The colors used in CLSM are different from those seen in H&E staining, which is the method most widely used for pathologic diagnosis and is familiar to pathologists. Computer processing can convert CLSM images so that they closely resemble H&E-stained images. We believe that the technique used in this study has great potential for application in clinical tissue analysis. PMID:27525165
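
    To make the color space conversion step concrete, here is a minimal, hedged sketch that maps two normalized fluorescence channels onto H&E-like colors using Beer–Lambert-style mixing; the channel roles and stain color vectors are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def pseudo_he(nuclei, cytoplasm):
    """Map two normalized fluorescence channels (0-1 arrays of equal shape)
    onto an H&E-like RGB image using Beer-Lambert-style color mixing."""
    # Hypothetical absorption colors for hematoxylin (purple) and eosin (pink).
    hematoxylin = np.array([0.65, 0.70, 0.29])
    eosin = np.array([0.07, 0.99, 0.11])
    od = nuclei[..., None] * hematoxylin + cytoplasm[..., None] * eosin
    rgb = np.exp(-2.5 * od)          # optical density -> transmitted light
    return (255 * rgb).astype(np.uint8)
```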

  19. Benchmarking Clinical Speech Recognition and Information Extraction: New Data, Methods, and Evaluations

    PubMed Central

    Zhou, Liyuan; Hanlen, Leif; Ferraro, Gabriela

    2015-01-01

    Background Over a tenth of preventable adverse events in health care are caused by failures in information flow. These failures are tangible in clinical handover; regardless of good verbal handover, from two-thirds to all of this information is lost after 3-5 shifts if notes are taken by hand, or not at all. Speech recognition and information extraction provide a way to fill out a handover form for clinical proofing and sign-off. Objective The objective of the study was to provide a recorded spoken handover, annotated verbatim transcriptions, and evaluations to support research in spoken and written natural language processing for filling out a clinical handover form. This dataset is based on synthetic patient profiles, thereby avoiding ethical and legal restrictions, while maintaining efficacy for research in speech-to-text conversion and information extraction, based on realistic clinical scenarios. We also introduce a Web app to demonstrate the system design and workflow. Methods We experiment with Dragon Medical 11.0 for speech recognition and CRF++ for information extraction. To compute features for information extraction, we also apply CoreNLP, MetaMap, and Ontoserver. Our evaluation uses cross-validation techniques to measure processing correctness. Results The data provided were a simulation of nursing handover, as recorded using a mobile device, built from simulated patient records and handover scripts, spoken by an Australian registered nurse. Speech recognition recognized 5276 of 7277 words in our 100 test documents correctly. We considered 50 mutually exclusive categories in information extraction and achieved the F1 (ie, the harmonic mean of Precision and Recall) of 0.86 in the category for irrelevant text and the macro-averaged F1 of 0.70 over the remaining 35 nonempty categories of the form in our 101 test documents. Conclusions The significance of this study hinges on opening our data, together with the related performance benchmarks and some
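
    For readers reproducing the reported metrics, the following sketch shows how the word accuracy and the macro-averaged F1 described above could be computed with scikit-learn; the label names and the exclusion of the irrelevant-text category are assumptions based on the abstract, not the authors' evaluation code.

```python
from sklearn.metrics import f1_score

# Speech recognition word accuracy as reported: 5276 / 7277 correct words.
word_accuracy = 5276 / 7277            # ~0.725

def macro_f1(y_true, y_pred, exclude=("irrelevant",)):
    """Macro-averaged F1 over form categories, excluding listed classes.
    y_true / y_pred hold one category label per extracted text span."""
    labels = sorted({*y_true, *y_pred} - set(exclude))
    return f1_score(y_true, y_pred, labels=labels, average="macro")
```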

  20. Evidence base for an intervention to maximise uptake of glaucoma testing: a theory-based cross-sectional survey

    PubMed Central

    Prior, Maria; Burr, Jennifer M; Ramsay, Craig R; Jenkinson, David; Campbell, Susan

    2012-01-01

    Objective To identify factors associated with intention to attend a hypothetical eye health test and provide an evidence base for developing an intervention to maximise attendance, for use in studies evaluating glaucoma screening programmes. Design Theory-based cross-sectional survey, based on an extended Theory of Planned Behaviour (TPB) and the Common Sense Self-Regulation Model, conducted in June 2010. Participants General population including oversampling from low socioeconomic areas. Setting Aberdeenshire and the London Boroughs of Lewisham and Southwark, UK. Results From 867 questionnaires posted, 327 completed questionnaires were returned (38%). In hierarchical regression analysis, the three theoretical predictors in the TPB (Attitude, Subjective norm and Perceived Behavioural Control) accounted for two-thirds of the variance in intention scores (adjusted R2=0.65). All three predictors contributed significantly to prediction. Adding ‘Anticipated regret’ as a factor in the TPB model resulted in a significant increase in prediction (adjusted R2=0.74). In the Common Sense Self-Regulation Model, only illness representations about the personal consequences of glaucoma (How much do you think glaucoma would affect your life?) and illness concern (How concerned are you about getting glaucoma?) significantly predicted intention. The final model explained 75% of the variance in intention scores, with ethnicity significantly contributing to prediction. Conclusions In this population-based sample (including over-representation of lower socioeconomic groupings), the main predictors of intention to attend a hypothetical eye health test were Attitude, Perceived control over attendance, Anticipated regret if they did not attend and black ethnicity. This evidence informs the design of a behavioural intervention with intervention components targeting low intentions and predicted to influence health-related behaviours. PMID:22382121
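
    The hierarchical-regression comparison described above (base TPB predictors, then adding anticipated regret) could be sketched as below with statsmodels; the column names and formulas are hypothetical placeholders for the survey variables, not the study's analysis script.

```python
import pandas as pd
import statsmodels.formula.api as smf

def hierarchical_adjusted_r2(df: pd.DataFrame):
    """Fit the base TPB model, then add anticipated regret, and report
    the adjusted R-squared of each step for comparison."""
    base = smf.ols("intention ~ attitude + subjective_norm + pbc", data=df).fit()
    extended = smf.ols(
        "intention ~ attitude + subjective_norm + pbc + anticipated_regret",
        data=df).fit()
    return base.rsquared_adj, extended.rsquared_adj
```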

  1. Using three-phase theory-based formative research to explore healthy eating in Australian truck drivers.

    PubMed

    Vayro, Caitlin; Hamilton, Kyra

    2016-03-01

    In Australia, fruit and vegetable consumption is lower than recommended, while discretionary foods (i.e., foods high in fat, sugar, and salt) are eaten in excess. Long-haul truck drivers are a group at risk of unhealthy eating but have received limited attention in the health literature. We aimed to examine long-haul truck drivers' eating decisions in order to develop theory-based and empirically driven health messages to improve their healthy food choices. Drawing on the Theory of Planned Behavior, three-phased formative research was conducted using self-report surveys. Phase 1 (N = 30, Mage = 39.53, SDage = 10.72) identified modal salient beliefs about fruit and vegetable (FV) intake and limiting discretionary choices (DC). Nine behavioral and seven normative beliefs were elicited for both FV and DC, while nine and five control beliefs were elicited for FV and DC, respectively. Phase 2 (N = 148, Mage = 44.23, SDage = 12.08) adopted a prospective design with one-week follow-up to examine the predictors of FV and DC intention and behavior. A variety of behavioral and control beliefs were predictive of FV and DC intention and behavior. Normative beliefs were predictive of FV intention and behavior, and of DC intention only. Phase 3 (N = 20, Mage = 46.9, SDage = 12.85) elicited the reasons why each belief is held and solutions to negative beliefs, which could be used as health messages. In total, 40 reasons/solutions were identified: 26 for FV and 14 for DC. In summary, we found that specific behavioral, normative and control beliefs influenced FV and DC eating decisions. These results have implications for truck drivers' health and provide formative research to inform future interventions to improve the food choices of a unique group at risk of unhealthy eating behaviors. PMID:26710674

  2. A sample theory-based logic model to improve program development, implementation, and sustainability of Farm to School programs.

    PubMed

    Ratcliffe, Michelle M

    2012-08-01

    Farm to School programs hold promise to address childhood obesity. These programs may increase students’ access to healthier foods, increase students’ knowledge of and desire to eat these foods, and increase their consumption of them. Implementing Farm to School programs requires the involvement of multiple people, including nutrition services, educators, and food producers. Because these groups have not traditionally worked together and each has different goals, it is important to demonstrate how Farm to School programs that are designed to decrease childhood obesity may also address others’ objectives, such as academic achievement and economic development. A logic model is an effective tool to help articulate a shared vision for how Farm to School programs may work to accomplish multiple goals. Furthermore, there is evidence that programs based on theory are more likely to be effective at changing individuals’ behaviors. Logic models based on theory may help to explain how a program works, aid in efficient and sustained implementation, and support the development of a coherent evaluation plan. This article presents a sample theory-based logic model for Farm to School programs. The presented logic model is informed by the polytheoretical model for food and garden-based education in school settings (PMFGBE). The logic model has been applied to multiple settings, including Farm to School program development and evaluation in urban and rural school districts. This article also includes a brief discussion on the development of the PMFGBE, a detailed explanation of how Farm to School programs may enhance the curricular, physical, and social learning environments of schools, and suggestions for the applicability of the logic model for practitioners, researchers, and policy makers. PMID:22867069

  3. Mapping Information Delivery Networks: The Objectives, the Methods, the Benefits, and the Model.

    ERIC Educational Resources Information Center

    Murr, L. E.; And Others

    1986-01-01

    A compact atlas--"Information Highways: Mapping Information Delivery Networks in the Pacific Northwest"--provides a model for assimilating large volumes of information dealing with generic topics. It combines data and information blocks with information technology graphics blocks and maps showing the actual locations of information delivery…

  4. Obtaining structural and functional information for GPCRs using the substituted-cysteine accessibility method (SCAM).

    PubMed

    Liapakis, George

    2014-01-01

    G-protein coupled receptors (GPCRs) are proteins of the plasma membrane, which are characterized by seven membrane-spanning segments (TMs). GPCRs play an important role in almost all of our physiological and pathophysiological conditions by interacting with a large variety of ligands and stimulating different G-proteins and signaling cascades. By playing a key role in the function of our body and being involved in the pathophysiology of many disorders, GPCRs are very important therapeutic targets. Determination of the structure and function of GPCRs could advance the design of novel receptor-specific drugs against various diseases. A powerful method to obtain structural and functional information for GPCRs is the substituted-cysteine accessibility method (SCAM). SCAM is used to systematically map the TM residues of GPCRs and determine their functional role. SCAM can also be used to determine differences in the structures of the TMs in different functional states of GPCRs. PMID:25335535

  5. High-orbit satellite reflection surface geometry information estimation using photometric measurement method

    NASA Astrophysics Data System (ADS)

    Zhang, Shixue

    2015-10-01

    Obtaining basic information about high-orbit satellites, such as geometry and material characteristics, is an important goal in the field of space situational awareness. In this paper, we calculate the satellite magnitude by comparing the CCD output values of the camera for a known fixed star and for the satellite. We select certain reference stars to calculate the luminance value of the object of interest on the acquired image using a background-removal method. We perform time-domain analysis of the measurement data and obtain statistical results. With knowledge of the theoretical brightness of the target, we estimate its geometric characteristics. We obtained a series of images of a particular satellite with a large telescope. The experimental results demonstrate that the accuracy of the measured magnitude is better than 0.12 Mv, and the estimation error of the target reflection surface size is less than 15%.
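
    The differential-photometry relation implied by that star-to-satellite comparison is a standard one; the short sketch below is an illustrative implementation only (the values in the usage line are made up), not the authors' processing chain.

```python
import math

def satellite_magnitude(flux_sat, flux_ref, mag_ref):
    """Differential photometry: m_sat = m_ref - 2.5 * log10(F_sat / F_ref),
    where fluxes are background-subtracted CCD counts."""
    return mag_ref - 2.5 * math.log10(flux_sat / flux_ref)

# Example: satellite counts 1.2e4, reference star counts 9.5e5 at magnitude 6.1
print(satellite_magnitude(1.2e4, 9.5e5, 6.1))   # ~10.8
```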

  6. Information Accessibility of the Charcoal Burning Suicide Method in Mainland China

    PubMed Central

    Cheng, Qijin; Chang, Shu-Sen; Guo, Yingqi; Yip, Paul S. F.

    2015-01-01

    Background There has been a marked rise in suicide by charcoal burning (CB) in some East Asian countries but little is known about its incidence in mainland China. We examined media-reported CB suicides and the availability of online information about the method in mainland China. Methods We extracted and analyzed data for i) the characteristics and trends of fatal and nonfatal CB suicides reported by mainland Chinese newspapers (1998–2014); ii) trends and geographic variations in online searches using keywords relating to CB suicide (2011–2014); and iii) the content of Internet search results. Results 109 CB suicide attempts (89 fatal and 20 nonfatal) were reported by newspapers in 13 out of the 31 provinces or provincial-level-municipalities in mainland China. There were increasing trends in the incidence of reported CB suicides and in online searches using CB-related keywords. The province-level search intensities were correlated with CB suicide rates (Spearman’s correlation coefficient = 0.43 [95% confidence interval: 0.08–0.68]). Two-thirds of the web links retrieved using the search engine contained detailed information about the CB suicide method, of which 15% showed pro-suicide attitudes, and the majority (86%) did not encourage people to seek help. Limitations The incidence of CB suicide was based on newspaper reports and likely to be underestimated. Conclusions Mental health and suicide prevention professionals in mainland China should be alert to the increased use of this highly lethal suicide method. Better surveillance and intervention strategies need to be developed and implemented. PMID:26474297

  7. ProphNet: A generic prioritization method through propagation of information

    PubMed Central

    2014-01-01

    Background Prioritization methods have become a useful tool for mining large amounts of data to suggest promising hypotheses in early research stages. In particular, network-based prioritization tools use a network representation of the interactions between different biological entities to identify novel indirect relationships. However, current network-based prioritization tools are strongly tailored to specific domains of interest (e.g. gene-disease prioritization) and do not allow networks with more than two types of entities (e.g. genes and diseases) to be considered. Therefore, the direct application of these methods to accomplish new prioritization tasks is limited. Results This work presents ProphNet, a generic network-based prioritization tool that allows the integration of an arbitrary number of interrelated biological entities to accomplish any prioritization task. We tested the performance of ProphNet in comparison with leading network-based prioritization methods, namely rcNet and DomainRBF, for gene-disease and domain-disease prioritization, respectively. The results obtained by ProphNet show a significant improvement in terms of sensitivity and specificity for both tasks. We also applied ProphNet to disease-gene prioritization on Alzheimer, Diabetes Mellitus Type 2 and Breast Cancer to validate the results and identify putative candidate genes involved in these diseases. Conclusions ProphNet works on top of any heterogeneous network by integrating information on different types of biological entities to rank entities of a specific type according to their degree of relationship with a query set of entities of another type. Our method works by propagating information across data networks and measuring the correlation between the propagated values for a query set and a target set of entities. ProphNet is available at: http://genome2.ugr.es/prophnet. A Matlab implementation of the algorithm is also available at the website. PMID:24564336
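
    As a hedged sketch of the propagation-and-correlation idea (a simplification, not the ProphNet algorithm itself), the following performs random-walk-with-restart propagation of a seed vector over an adjacency matrix and then correlates propagated query and target profiles; all names and parameter values are illustrative.

```python
import numpy as np

def propagate(adj, seed, restart=0.3, n_iter=50):
    """Random-walk-with-restart propagation of a seed vector over a network."""
    col_sums = np.maximum(adj.sum(axis=0, keepdims=True), 1e-12)
    w = adj / col_sums                 # column-normalized transition matrix
    p = seed / seed.sum()              # assumes a non-empty seed set
    p0 = p.copy()
    for _ in range(n_iter):
        p = (1 - restart) * w @ p + restart * p0
    return p

def prioritization_score(query_prop, target_prop):
    """Correlate propagated query values with propagated target values."""
    return float(np.corrcoef(query_prop, target_prop)[0, 1])
```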

  8. A theory-based exercise intervention in patients with heart failure: A protocol for randomized, controlled trial

    PubMed Central

    Rajati, Fatemeh; Mostafavi, Firoozeh; Sharifirad, Gholamreza; Sadeghi, Masoomeh; Tavakol, Kamran; Feizi, Awat; Pashaei, Tahereh

    2013-01-01

    Background: Regular exercise has been associated with improved quality of life (QoL) in patients with heart failure (HF). However, less is known about the theoretical framework depicting how educational interventions on psychological, social, and cognitive variables affect physical activity (PA). The purpose of this study is to assess the effectiveness of a social cognitive theory-based (SCT-based) exercise intervention in patients with HF. Materials and Methods: This is a randomized controlled trial, with measurements at baseline, immediately following the intervention, and at 1, 3, and 6 months follow-up. Sixty patients who are referred to the cardiac rehabilitation (CR) unit and meet the inclusion criteria will be randomly allocated to either an intervention group or a usual-care control group. Data will be collected using various methods (i.e., questionnaires, physical tests, paraclinical tests, patients’ interviews, and focus groups). The patients in the intervention group will receive eight face-to-face counseling sessions, two focus groups, and six educational sessions over a 2-month period. The intervention will include watching videos, using books and pamphlets, and sending short message service (SMS) texts to the participants. The primary outcome measures are PA and QoL. The secondary outcome measures will be the components of SCT, heart rate and blood pressure at rest, body mass index, left ventricular ejection fraction, exercise capacity, and maximum heart rate. Conclusion: The findings of this trial may assist with the development of a theoretical model for exercise intervention in CR. The intervention seems to be promising and has the potential to bridge the gap of the usually limited and incoherent provision of educational care in the CR setting. PMID:24379841

  9. A method of 3-D data information storage with virtual holography

    NASA Astrophysics Data System (ADS)

    Huang, Zhen; Liu, Guodong; Ren, Zhong; Zeng, Lüming

    2008-12-01

    In this paper, a new method of 3-D data cube storage based on virtual holography is presented. First, the data are encoded in the form of a 3-D data cube with a certain algorithm, in which the interval between adjacent data points along each coordinate is d. Using the plane-scanning method, the 3-D cube can be described as an assembly of slices, i.e., parallel planes along the coordinates at an interval of d. Each dot on a slice represents a bit: a bright dot means "1", while a dark dot means "0". Second, a hologram of the 3-D cube is computed with virtual optics technology; all the information of a 3-D cube can be described by a 2-D hologram. Finally, the hologram is input to the SLM and recorded in the recording material by intersecting two coherent laser beams. When the 3-D data are read out, a reference light illuminates the hologram, a CCD captures the object image (which is a hologram of the 3-D data), and the 3-D data are then reconstructed with virtual optics technology. Compared with 2-D data page storage, 3-D data cube storage offers larger storage capacity and higher data security.

  10. A method of construction of information images of the acoustic signals of the human bronchopulmonary system

    NASA Astrophysics Data System (ADS)

    Bureev, A. Sh.; Zhdanov, D. S.; Zemlyakov, I. Yu.; Kiseleva, E. Yu.; Khokhlova, L. A.

    2015-11-01

    The present study focuses on the development of a method for identifying human respiratory sounds and noises, both in the normal state and in various pathological conditions. The existing approaches, based on simple frequency- and time-domain signal analysis, lack specificity, efficiency and unambiguous interpretation of the results of a clinical study. An algorithm for phase selection of respiratory cycles and analysis of respiratory sounds obtained from bronchial examination of a patient has been suggested. The algorithm is based on phase-timing analysis of bronchial phonograms. The output of the phase-frequency algorithm reflects, with high resolution, the time position of the traceable signals and the individual structure of the recorded signals. This allows the proposed method to be used for the formation of information images (models) of the diagnostically significant fragments. A weighting function, whose frequency parameters can be selectively modified, is used for this purpose. The shape of the weighting function is specific to each type of respiratory noise, traditionally described by qualitative characteristics (wet or dry noise, crackling, etc.).

  11. A new method to obtain uniform distribution of ground control points based on regional statistical information

    NASA Astrophysics Data System (ADS)

    Ma, Chao; An, Wei; Deng, Xinpu

    2015-10-01

    Ground Control Points (GCPs) are an important source of fundamental data in geometric correction of remote sensing imagery. The quantity, accuracy and distribution of GCPs are three factors which may affect the accuracy of geometric correction. It is generally required that the distribution of GCPs be uniform, so that they can fully control the accuracy of the mapped region. In this paper, we establish an objective standard for evaluating the uniformity of the GCPs' distribution based on regional statistical information (RSI), and obtain an optimal distribution of GCPs. This sampling method is called RSIS for short in this work. The numbers of GCPs falling in different regions, obtained by equally partitioning the image into regions in different manners, are counted to form a vector called the RSI vector. The uniformity of the GCPs' distribution is evaluated by a mathematical quantity computed from the RSI vector, and an optimal distribution of GCPs is the one whose RSI vector minimizes this quantity. In this paper, simulated annealing is employed to search for the distribution of GCPs that minimizes the mathematical quantity of the RSI vector. Experiments are carried out to test the proposed method, and the sampling designs compared are simple random sampling and universal kriging model-based sampling. The experiments indicate that this method can be recommended as a new GCP sampling design for geometric correction of remotely sensed imagery.
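
    A minimal, hedged sketch of the counting step is given below: GCP counts per cell form the RSI vector for one partition, and the variance of those counts stands in for the unspecified "mathematical quantity" that a simulated-annealing search would minimize over candidate GCP subsets. The choice of variance, the partitions, and all names are assumptions, not the paper's definitions.

```python
import numpy as np

def rsi_vector(points, extent, nx, ny):
    """Count GCPs in each cell of an nx-by-ny equal partition of the image.

    points : (n, 2) array of GCP image coordinates
    extent : (xmin, xmax, ymin, ymax) of the image
    """
    x, y = points[:, 0], points[:, 1]
    xmin, xmax, ymin, ymax = extent
    ix = np.clip(((x - xmin) / (xmax - xmin) * nx).astype(int), 0, nx - 1)
    iy = np.clip(((y - ymin) / (ymax - ymin) * ny).astype(int), 0, ny - 1)
    counts = np.zeros((nx, ny))
    np.add.at(counts, (ix, iy), 1)
    return counts.ravel()

def uniformity_cost(points, extent, partitions=((2, 2), (3, 3), (4, 4))):
    """Lower cost = more uniform; variance of cell counts is one possible quantity."""
    return sum(np.var(rsi_vector(points, extent, nx, ny)) for nx, ny in partitions)
```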

  12. An inversion method for retrieving soil moisture information from satellite altimetry observations

    NASA Astrophysics Data System (ADS)

    Uebbing, Bernd; Forootan, Ehsan; Kusche, Jürgen; Braakmann-Folgmann, Anne

    2016-04-01

    Soil moisture represents an important component of the terrestrial water cycle that controls evapotranspiration and vegetation growth. Consequently, knowledge of soil moisture variability is essential to understand the interactions between land and atmosphere. Yet, terrestrial measurements are sparse and their information content is limited due to the large spatial variability of soil moisture. Therefore, over the last two decades, several active and passive radar and satellite missions such as ERS/SCAT, AMSR, SMOS or SMAP have been providing backscatter information that can be used to estimate surface conditions including soil moisture, which is proportional to the dielectric constant of the upper (few cm) soil layers. Another source of soil moisture information is satellite radar altimeters, originally designed to measure sea surface height over the oceans. Measurements of Jason-1/2 (Ku- and C-Band) or Envisat (Ku- and S-Band) nadir radar backscatter provide high-resolution along-track information (~300 m along-track resolution) on backscatter every ~10 days (Jason-1/2) or ~35 days (Envisat). Recent studies found good correlation between backscatter and soil moisture in upper layers, especially in arid and semi-arid regions, indicating the potential of satellite altimetry both to reconstruct and to monitor soil moisture variability. However, measuring soil moisture using altimetry has some drawbacks, which include: (1) the noisy behavior of the altimetry-derived backscatter (due to, e.g., the existence of surface water in the radar footprint), (2) the strong assumptions required for converting altimetry backscatter to soil moisture storage changes, and (3) the need for interpolating between the tracks. In this study, we suggest a new inversion framework that allows soil moisture information to be retrieved from along-track Jason-2 and Envisat satellite altimetry data, and we test this scheme over the Australian arid and semi-arid regions. Our method consists of: (i

  13. Evaluation of Statistical Rainfall Disaggregation Methods Using Rain-Gauge Information for West-Central Florida

    SciTech Connect

    Murch, Renee Rokicki; Zhang, Jing; Ross, Mark; Ganguly, Auroop R; Nachabe, Mahmood

    2008-01-01

    Rainfall disaggregation in time can be useful for the simulation of hydrologic systems and the prediction of floods and flash floods. Disaggregation of rainfall to timescales less than 1 h can be especially useful for small urbanized watershed study, and for continuous hydrologic simulations and when Hortonian or saturation-excess runoff dominates. However, the majority of rain gauges in any region record rainfall in daily time steps or, very often, hourly records have extensive missing data. Also, the convective nature of the rainfall can result in significant differences in the measured rainfall at nearby gauges. This study evaluates several statistical approaches for rainfall disaggregation which may be applicable using data from West-Central Florida, specifically from 1 h observations to 15 min records, and proposes new methodologies that have the potential to outperform existing approaches. Four approaches are examined. The first approach is an existing direct scaling method that utilizes observed 15 min rainfall at secondary rain gauges, to disaggregate observed 1 h rainfall at more numerous primary rain gauges. The second approach is an extension of an existing method for continuous rainfall disaggregation through statistical distributional assumptions. The third approach relies on artificial neural networks for the disaggregation process without sorting and the fourth approach extends the neural network methods through statistical preprocessing via new sorting and desorting schemes. The applicability and performance of these methods were evaluated using information from a fairly dense rain gauge network in West-Central Florida. Of the four methods compared, the sorted neural networks and the direct scaling method predicted peak rainfall magnitudes significantly better than the remaining techniques. The study also suggests that desorting algorithms would also be useful to randomly replace the artificial hyetograph within a rainfall period.
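
    The direct-scaling idea described for the first approach can be illustrated with a minimal, hedged sketch (the variable names and the four-interval layout are assumptions): the observed 15-min pattern at a secondary gauge is used as weights to split the hourly total at the primary gauge.

```python
import numpy as np

def direct_scaling(hourly_total, secondary_15min):
    """Disaggregate one hourly value using the sub-hourly pattern observed
    at a nearby secondary gauge (four 15-min values)."""
    pattern = np.asarray(secondary_15min, dtype=float)
    if pattern.sum() == 0:
        return np.full(4, hourly_total / 4.0)      # fall back to an even split
    return hourly_total * pattern / pattern.sum()

print(direct_scaling(12.0, [0.0, 3.0, 6.0, 3.0]))  # -> [0., 3., 6., 3.]
```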

  14. A novel concealed information test method based on independent component analysis and support vector machine.

    PubMed

    Gao, Junfeng; Lu, Liang; Yang, Yong; Yu, Gang; Na, Liantao; Rao, NiNi

    2012-01-01

    The concealed information test (CIT) has drawn much attention and has been widely investigated in recent years. In this study, a novel CIT method based on denoised P3 and machine learning was proposed to improve the accuracy of lie detection. Thirty participants were assigned as guilty or innocent participants and performed paradigms with 3 types of stimuli. The electroencephalogram (EEG) signals were recorded and separated into many single trials. In order to enhance the signal-to-noise ratio (SNR) of the P3 components, the independent component analysis (ICA) method was adopted to separate non-P3 components (i.e., artifacts) from every single trial. A new method based on a topography template was proposed to automatically identify the P3 independent components (ICs). Then the P3 waveforms with high SNR were reconstructed at the Pz electrode. Next, 3 groups of features based on time, frequency, and wavelets were extracted from the reconstructed P3 waveforms. Finally, 2 classes of feature samples were used to train a support vector machine (SVM) classifier, because it has higher performance compared with several other classifiers. Meanwhile, the optimal number of P3 ICs and some other parameter values in the classifiers were determined by cross-validation procedures. The presented method achieved a balanced test accuracy of 84.29% in detecting P3 components for the guilty and innocent participants. The presented method improves the efficiency of the CIT in comparison with previously reported methods. PMID:22423552
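
    A hedged, high-level sketch of this chain (ICA-based artifact removal, then SVM classification with cross-validation) using scikit-learn is shown below; the component-selection rule, feature construction, and all names are simplifications and assumptions rather than the paper's exact pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def denoise_trial(trial, keep):
    """Separate a single trial (channels x samples) into independent
    components and reconstruct it from the components flagged as P3-related."""
    ica = FastICA(n_components=trial.shape[0], random_state=0)
    sources = ica.fit_transform(trial.T)          # samples x components
    drop = [i for i in range(sources.shape[1]) if i not in keep]
    sources[:, drop] = 0
    return ica.inverse_transform(sources).T       # back to channels x samples

def evaluate(X, y):
    """X: trials x features (time/frequency/wavelet), y: guilty vs. innocent."""
    clf = SVC(kernel="rbf", C=1.0)
    return cross_val_score(clf, X, y, cv=5).mean()
```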

  15. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  16. The use of theory based semistructured elicitation questionnaires: formative research for CDC's Prevention Marketing Initiative.

    PubMed Central

    Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M

    1996-01-01

    Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153

  17. Statistical synthesis of contextual knowledge to increase the effectiveness of theory-based behaviour change interventions.

    PubMed

    Hanbury, Andria; Thompson, Carl; Mannion, Russell

    2011-07-01

    Tailored implementation strategies targeting health professionals' adoption of evidence-based recommendations are currently being developed. Research has focused on how to select an appropriate theoretical base, how to use that theoretical base to explore the local context, and how to translate theoretical constructs associated with the key factors found to influence innovation adoption into feasible and tailored implementation strategies. The reasons why an intervention is thought not to have worked are often cited as being: inappropriate choice of theoretical base; unsystematic development of the implementation strategies; and a poor evidence base to guide the process. One area of implementation research that is commonly overlooked is how to synthesize the data collected in a local context in order to identify what factors to target with the implementation strategies. This is suggested to be a critical process in the development of a theory-based intervention. The potential of multilevel modelling techniques to synthesize data collected at different hierarchical levels, for example, individual attitudes and team level variables, is discussed. Future research is needed to explore further the potential of multilevel modelling for synthesizing contextual data in implementation studies, as well as techniques for synthesizing qualitative and quantitative data. PMID:21543383
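
    As one concrete, hedged way to implement the multilevel synthesis discussed above, a random-intercept model can combine individual-level attitudes with a team-level predictor via statsmodels; the column names and formula are hypothetical placeholders, not a prescription from the article.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_multilevel(df: pd.DataFrame):
    """Mixed-effects model: one row per health professional, with an
    individual attitude score, a team-level predictor, and a team identifier."""
    model = smf.mixedlm("adoption ~ attitude + team_climate",
                        data=df, groups=df["team"])
    return model.fit()
```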

  18. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  19. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    PubMed

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and emergency logistics problems are therefore receiving more and more attention. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function is the maximization of the time-satisfaction degree. To overcome these drawbacks (incomplete information and uncertain time), this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle follows only the optimal path from the emergency logistics center to the affected point, and is solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946
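
    The road-evaluation step based on grey theory could, for example, use grey relational analysis to rank candidate paths against an ideal reference; the sketch below is an illustrative assumption about that step (criteria, normalization, and names are hypothetical), not the paper's model.

```python
import numpy as np

def grey_relational_grades(paths, ideal=None, rho=0.5):
    """Rank candidate paths by grey relational grade against an ideal path.

    paths : (n_paths, n_criteria) matrix of normalized criterion values
            (e.g., expected travel time score, reliability), larger = better.
    """
    paths = np.asarray(paths, dtype=float)
    ideal = paths.max(axis=0) if ideal is None else np.asarray(ideal, float)
    delta = np.abs(paths - ideal)
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)          # higher grade = preferred path
```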

  20. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    PubMed Central

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and emergency logistics problems are therefore receiving more and more attention. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function is the maximization of the time-satisfaction degree. To overcome these drawbacks (incomplete information and uncertain time), this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle follows only the optimal path from the emergency logistics center to the affected point, and is solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946

  1. [Dissemination of medical information in Europe, the USA and Japan, 1850-1870: focusing on information concerning the hypodermic injection method].

    PubMed

    Tsukisawa, Miyoko

    2011-12-01

    Modern medicine was introduced in Japan in the second half of the nineteenth century. In order to investigate this historical process, this paper focuses on the dissemination of information of a new medical technology developed in the mid-nineteenth century; it does so by making comparisons of the access to medical information between Europe, the USA and Japan. The hypodermic injection method was introduced in the clinical field in Europe and the USA as a newly developed therapeutic method during the 1850s and 1870s. This study analyzed information on the medical assessments of this method by clinicians of these periods. The crucial factor in accumulating this information was to develop a worldwide inter-medical communication circle with the aid of the medical journals. Information on the hypodermic injection method was introduced in Japan almost simultaneously with its introduction in Europe and the USA. However, because of the geographical distance and the language barrier, Japanese clinicians lacked access to this worldwide communication circle, and they accepted this new method without adequate medical technology assessments. PMID:22586891

  2. How much information on permeability can we expect from induced polarization methods? (Invited)

    NASA Astrophysics Data System (ADS)

    Binley, A. M.; Slater, L. D.

    2013-12-01

    Recognizing the significance of permeability heterogeneity on solute transport in groundwater, the determination of qualitative and quantitative information on permeability has been a major focus in the field of hydrogeophysics for some time. This drive has been particularly encouraged due to the minimal invasive method of most geophysical techniques, and the ability to produce spatially dense datasets of geophysical properties. Whilst DC resistivity, as a method, has matured into an extremely robust and flexible technique, and despite its wide use for mapping lithologies, translation of DC resistivity, as a property, to permeability is extremely limited, principally because of the sensitivity to pore fluid states (e.g. salinity) and grain surface electrical conductivity. Induced polarization (IP), in contrast, is sensitive to properties related to the grain surface and/or pore throat geometry, and thus it is intuitive to assume that the permeability and induced polarization response may be closely linked. Spectral IP (SIP) potentially adds further valuable information, given the measure of distribution of polarization length scales. In fact, IP as a tool for hydrogeological studies has been recognized for over 50 years, although it is only over the past two decades that significant advances have been made in both methodology (e.g. instruments, data inversion, etc.) and hydrogeological interpretation. Attempts to link IP (including SIP) and permeability have been explored through laboratory, field and model studies. Mechanistic models have been proposed, along with several empirical relationships. Despite these efforts, the ability to link permeability to IP measurements remains challenging. Formation-specific relationships have been demonstrated, and yet a universal link continues to be elusive. Here, we discuss the principal constraints, illustrated using laboratory and field datasets from a number of studies. We highlight specific challenges, including

  3. Multi-factor Constrained Information Analysis Method for Landslide Hazard Risk

    NASA Astrophysics Data System (ADS)

    Tao, Kunwang; Wang, Liang; Qian, Xinlin

    2015-04-01

    Landslide hazards cause enormous damage to human life, property, and the environment. The most effective way to mitigate the effects of landslides is to evaluate the landslide risk and take measures to avoid losses in advance. Many factors must be considered in landslide risk assessment, so the assessment involves great complexity and uncertainty. A multi-factor constrained method for landslide risk assessment is therefore proposed and carried out in three steps: first, GIS technology is used to divide the study area into an analysis grid that serves as the basic analysis unit; second, according to the available information, slope, lithology, faults, land use, etc. are taken as the evaluation factors; finally, based on years of observed landslide data, the landslide risk assessment is realized with the multi-factor constrained assessment model, in which the weight of every factor is determined by the information model. The region of Gongliu, located in the Ili River basin of Xinjiang at an altitude of 600 to 4000 meters, was selected as the experimental area; it is elongated from east to west and narrow from north to south, and its topography brings abundant rainfall, which causes frequent landslides. Using a 500 m * 500 m analysis grid covering the whole study area, a comprehensive assessment of the landslide risk in this region was computed with the multi-factor constrained method, and a landslide hazard classification map was produced. Statistically, 94.04% of the observed landslide hazard points fall in the moderately high and high risk areas, 4.64% in the low risk zone, and 1.32% in the lowest risk zone. The results show a high probability of landslides in the highly rated parts of the assessed region, which showed that
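
    The abstract does not spell out the "information model" used for the factor weights; a common choice in this literature is the information value method, sketched below under that assumption (class counts and all names are illustrative, not the study's data).

```python
import numpy as np

def information_value(landslide_counts, cell_counts):
    """Information value of each factor class:
    ln( (landslides in class / total landslides) /
        (cells in class     / total cells) )."""
    landslide_counts = np.asarray(landslide_counts, float)
    cell_counts = np.asarray(cell_counts, float)
    p_class = landslide_counts / landslide_counts.sum()
    p_area = cell_counts / cell_counts.sum()
    with np.errstate(divide="ignore"):
        return np.where(p_class > 0, np.log(p_class / p_area), 0.0)

# Grid-cell risk would then be the sum of the information values of the
# classes (slope, lithology, fault distance, land use, ...) the cell falls in.
```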

  4. Comparison and evaluation of joint histogram estimation methods for mutual information based image registration

    NASA Astrophysics Data System (ADS)

    Liang, Yongfang; Chen, Hua-mei

    2005-04-01

    The joint histogram is the only quantity required to calculate the mutual information (MI) between two images. For MI-based image registration, joint histograms are often estimated through linear interpolation or partial volume interpolation (PVI). It has been pointed out that both methods may result in a phenomenon known as interpolation-induced artifacts. In this paper, we implemented a wide range of interpolation/approximation kernels for joint histogram estimation. Some kernels are nonnegative; in that case, each kernel is applied in two ways, analogous to how the linear kernel is applied in linear interpolation and PVI. In addition, we implemented two other joint histogram estimation methods devised to overcome the interpolation artifact problem: nearest neighbor interpolation with jittered sampling, with or without histogram blurring, and data resampling. We used clinical data obtained from Vanderbilt University for all of the experiments. The objective of this study is to perform a comprehensive comparison and evaluation of different joint histogram estimation methods for MI-based image registration in terms of artifact reduction and registration accuracy.
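
    However the joint histogram is estimated, the mutual information it feeds is the standard quantity below; this is a textbook formula written out for reference, not the paper's code.

```python
import numpy as np

def mutual_information_from_hist(joint_hist):
    """Mutual information (in bits) of two images from their joint histogram."""
    p = joint_hist.astype(float)
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# joint_hist would be estimated from the two images by one of the kernels
# discussed above (linear interpolation, PVI, jittered sampling, ...).
```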

  5. A Review of Data Quality Assessment Methods for Public Health Information Systems

    PubMed Central

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-01-01

    High quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. The relevant study was identified in major databases and well-known institutional websites. We found the dimension of data was most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interview and documentation review. The limitations of the reviewed studies included inattentiveness to data use and data collection process, inconsistency in the definition of attributes of data quality, failure to address data users’ concerns and a lack of systematic procedures in data quality assessment. This review study is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research efforts should be given to assess the quality of data use and the quality of data collection process. PMID:24830450

  6. A varying threshold method for ChIP peak-calling using multiple sources of information

    PubMed Central

    Chen, Kuan-Bei; Zhang, Yu

    2010-01-01

    Motivation: Gene regulation commonly involves interaction among DNA, proteins and biochemical conditions. Using chromatin immunoprecipitation (ChIP) technologies, protein–DNA interactions are routinely detected in the genome scale. Computational methods that detect weak protein-binding signals and simultaneously maintain a high specificity yet remain to be challenging. An attractive approach is to incorporate biologically relevant data, such as protein co-occupancy, to improve the power of protein-binding detection. We call the additional data related with the target protein binding as supporting tracks. Results: We propose a novel but rigorous statistical method to identify protein occupancy in ChIP data using multiple supporting tracks (PASS2). We demonstrate that utilizing biologically related information can significantly increase the discovery of true protein-binding sites, while still maintaining a desired level of false positive calls. Applying the method to GATA1 restoration in mouse erythroid cell line, we detected many new GATA1-binding sites using GATA1 co-occupancy data. Availability: http://stat.psu.edu/∼yuzhang/pass2.tar Contact: yuzhang@stat.psu.edu PMID:20823314

  7. Systems and methods for supplemental weather information presentation on a display

    NASA Technical Reports Server (NTRS)

    Bunch, Brian (Inventor)

    2010-01-01

    An embodiment of the supplemental weather display system presents supplemental weather information on a display in a craft. An exemplary embodiment receives the supplemental weather information from a remote source, determines a location of the supplemental weather information relative to the craft, receives weather information from an on-board radar system, and integrates the supplemental weather information with the weather information received from the on-board radar system.

  8. A method for detecting damage-induced nonlinearities in structures using information theory

    NASA Astrophysics Data System (ADS)

    Nichols, J. M.; Seaver, M.; Trickey, S. T.

    2006-10-01

    In this work a new approach is presented for detecting the presence of damage-induced nonlinearities in structures from measurements of structural dynamics. Two different information-theoretic (IT) measures, the time-delayed mutual information and the time-delayed transfer entropy are used to provide a probabilistic measure of the coupling between structural components. These measures may be used to capture both linear and nonlinear relationships among time-series data. The formula for both quantities is derived for a linear, five degree-of-freedom system subject to Gaussian excitation. An algorithm is then described for computing the IT metrics from time-series data and results are shown to agree with theory. We then show that as the coupling between the structure's components changes from linear to nonlinear the "information flow" can be used to indicate the degree of nonlinearity. Deviations from a linear model are quantified statistically by generating surrogate data sets that, by construction, possess only linear (second-order) correlations. We then apply the proposed algorithms to both the original data and the surrogates. Differences in the results are shown to be proportional to the degree of nonlinearity. This result is shown to be independent of global changes in stiffness and is therefore unaffected by certain models of environmental variability. Furthermore, the method provides an absolute measure of nonlinearity and therefore does not require a baseline data set for making comparisons. This approach is discussed in the context of structural health monitoring where damage is often associated with structural nonlinearity.
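
    A hedged sketch of the two ingredients described above, a time-delayed mutual information estimate between two sensor signals and a phase-randomized (linear) surrogate for the nonlinearity test, is given below; the histogram estimator, bin count, and names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def delayed_mutual_information(x, y, lag, bins=16):
    """Histogram estimate of I( x(t) ; y(t + lag) ) in bits."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    hist, _, _ = np.histogram2d(x, y, bins=bins)
    p = hist / hist.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

def linear_surrogate(x, seed=0):
    """Phase-randomized surrogate preserving only second-order (linear) structure."""
    rng = np.random.default_rng(seed)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = 0.0
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))

# Comparing the measure on the original data against its distribution over
# many surrogates quantifies the (damage-induced) nonlinearity, as described above.
```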

  9. A Causal Modelling Approach to the Development of Theory-Based Behaviour Change Programmes for Trial Evaluation

    ERIC Educational Resources Information Center

    Hardeman, Wendy; Sutton, Stephen; Griffin, Simon; Johnston, Marie; White, Anthony; Wareham, Nicholas J.; Kinmonth, Ann Louise

    2005-01-01

    Theory-based intervention programmes to support health-related behaviour change aim to increase health impact and improve understanding of mechanisms of behaviour change. However, the science of intervention development remains at an early stage. We present a causal modelling approach to developing complex interventions for evaluation in…

  10. The Impact of a Social Cognitive Theory-Based Intervention on Physical Education Teacher Self-Efficacy

    ERIC Educational Resources Information Center

    Martin, Jeffrey J.; McCaughtry, Nate; Kulinna, Pamela Hodges; Cothran, Donetta

    2009-01-01

    Adolescents are physically inactive and non-Caucasian adolescents achieve the least amount of physical activity. Hence, supporting teachers' efforts to increase their students' physical activity during physical education is important. We examined the influence of a social cognitive theory-based intervention on teachers' efficacy to teach…

  11. Efficacy of theory-based interventions to promote physical activity. A meta-analysis of randomised controlled trials.

    PubMed

    Gourlan, M; Bernard, P; Bortolon, C; Romain, A J; Lareyre, O; Carayol, M; Ninot, G; Boiché, J

    2016-01-01

    Implementing theory-based interventions is an effective way to influence physical activity (PA) behaviour in the population. This meta-analysis aimed to (1) determine the global effect of theory-based randomised controlled trials dedicated to the promotion of PA among adults, (2) measure the actual efficacy of interventions against their theoretical objectives and (3) compare the efficacy of single- versus combined-theory interventions. A systematic search through databases and review articles was carried out. Our results show that theory-based interventions (k = 82) significantly impact the PA behaviour of participants (d = 0.31, 95% CI [0.24, 0.37]). While moderation analyses revealed no efficacy difference between theories, interventions based on a single theory (d = 0.35; 95% CI [0.26, 0.43]) reported a higher impact on PA behaviour than those based on a combination of theories (d = 0.21; 95% CI [0.11, 0.32]). In spite of the global positive effect of theory-based interventions on PA behaviour, further research is required to better identify the specificities, overlaps or complementarities of the components of interventions based on relevant theories. PMID:25402606
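
    For readers reproducing the kind of pooled effect sizes reported above (d with a 95% CI), a minimal fixed-effect inverse-variance pooling sketch is shown below; the meta-analysis itself may use different (e.g., random-effects) weighting, so treat this only as an illustration with hypothetical inputs.

```python
import numpy as np

def pooled_effect(d, var):
    """Fixed-effect inverse-variance pooling of standardized mean differences.

    d   : per-trial effect sizes
    var : per-trial sampling variances of d
    """
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var
    d_pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se)
```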

  12. Physician's Referral Letter Bibliographic Service: A New Method of Disseminating Medical Information *

    PubMed Central

    Lodico, Norma Jean

    1973-01-01

    At the time this paper was written a unique trial project for disseminating medical literature to physicians had been in operation for six months (October 1971-April 1972) at the Virginia Medical Information System. Doctors in the state who referred patients to the Medical College of Virginia received short lists of references relevant to the problems of their patients as described in referral letters sent them by MCV consultants. Doctors receiving such lists were offered free photocopies of the articles cited if they could not obtain them locally. Of some 700 reference lists sent out, VAMIS received 12½ percent direct responses, and 22 percent of respondents to a questionnaire reported obtaining articles elsewhere. Ninety percent of the respondents favored the service and approved of its method. Funding problems necessitated revision of the service in the summer of 1972. Presently the service is provided only on request of the referring physician. PMID:4800294

  13. Municipal Solid Waste Management using Geographical Information System aided methods: a mini review.

    PubMed

    Khan, Debishree; Samadder, Sukha Ranjan

    2014-11-01

    Municipal Solid Waste Management (MSWM) is one of the major environmental challenges in developing countries. Many efforts to reduce and recover the wastes have been made, but still land disposal of solid wastes is the most popular one. Finding an environmentally sound landfill site is a challenging task. This paper addresses a mini review on various aspects of MSWM (suitable landfill site selection, route optimization and public acceptance) using the Geographical Information System (GIS) coupled with other tools. The salient features of each of the integrated tools with GIS are discussed in this paper. It is also addressed how GIS can help in optimizing routes for collection of solid wastes from transfer stations to disposal sites to reduce the overall cost of solid waste management. A detailed approach on performing a public acceptance study of a proposed landfill site is presented in this study. The study will help municipal authorities to identify the most effective method of MSWM. PMID:25352293

  14. Analysis of informativeness of immunohistochemical and flow cytometric methods for estrogen receptor α assessment.

    PubMed

    Bogush, T A; Dudko, E A; Rodionova, M V; Bogush, E A; Kirsanov, V J; Rodionov, V V; Vorotnikov, I K

    2015-01-01

    An analysis of the informativeness of immunohistochemistry (IHC) and flow cytometry (FCM) in the assessment of estrogen receptor α (ERα) expression in breast cancer tissue was performed. Similar frequencies of expression were shown by both methods: 27% ERα-negative and 73% ERα-positive cases. However, IHC evaluation detected low levels in only 20% of ERα-positive cases, whereas FCM detected low ERα levels about twice as often (48%). Moreover, FCM revealed positive expression (23-60%) in 33% of IHC ERα-negative cases. Among IHC ERα-positive cases, zero ERα expression was detected by FCM in 12.5%. Approaches to minimize errors in the routine clinical determination of estrogen receptor status are proposed. PMID:26728725

  15. Structure analysis of the Polish academic information society using MDS method

    NASA Astrophysics Data System (ADS)

    Kaliczynska, Malgorzata

    2006-03-01

The article presents the methodology of webometric research and analysis aimed at determining similar features of objects belonging to the Polish information society, which uses the Internet and its web resources for communication purposes. In particular, the analysis applies to selected Polish technical universities. The research was carried out in several phases, on different data groups, with regard to changes in Internet space and time. The results are presented in the form of two- and three-dimensional topography maps. For the purposes of this analysis, computer methods of multidimensional scaling were used. The research will be continued for a selected group of objects over a longer time frame. Its next stage will be research on more diversified objects, also in a multinational context.

  16. Using cognitive science methods to assess the role of social information processing in sexually coercive behavior.

    PubMed

    Treat, T A; McFall, R M; Viken, R J; Kruschke, J K

    2001-12-01

    Seventy-four undergraduate men completed cognitive performance tasks assessing perceptual organization, classification, and category learning, as well as self-report measures relevant to sexual coercion. The stimuli were slides of Caucasian women who varied along affect and physical exposure (i.e., sensuality) dimensions. Data were analyzed using a weighted multidimensional scaling model, signal-detection theory analyses, and a connectionist learning model (RASHNL; J. K. Kruschke & M. K. Johansen, 1999). Individual differences in performance on the classification and category-learning tasks were congruent with individual differences in perceptual organization. Additionally, participants who showed relatively more attention to exposure than to affect were less sensitive to women's negative responses to unwanted sexual advances. Overall, the study demonstrates the feasibility and utility of cognitive science methods for studying information processing in psychopathology. PMID:11793898

  17. Clinical simulation: A method for development and evaluation of clinical information systems.

    PubMed

    Jensen, Sanne; Kushniruk, Andre W; Nøhr, Christian

    2015-04-01

Use of clinical simulation in the design and evaluation of eHealth systems and applications has increased during the last decade. This paper describes a methodological approach for using clinical simulations in the design and evaluation of clinical information systems. The method is based on experiences from more than 20 clinical simulation studies conducted at the ITX-lab in the Capital Region of Denmark during the last 5 years. A ten-step approach to conducting simulations is presented in this paper. To illustrate the approach, a clinical simulation study concerning implementation of Digital Clinical Practice Guidelines in a prototype planning and coordination module is presented. In the case study, potential benefits were assessed in a full-scale simulation test including 18 health care professionals. The results showed that health care professionals can benefit from such a module. Unintended consequences concerning terminology and changes in the division of responsibility amongst healthcare professionals were also identified, and questions were raised concerning future workflow across sector borders. Furthermore, unexpected potential benefits emerged during testing concerning improved communication, the content of information in discharge letters, and quality management. In addition, new potential groups of users were identified. The case study is used to demonstrate the potential of the clinical simulation approach described in the paper. PMID:25684129

  18. Surveillance of contact allergies: methods and results of the Information Network of Departments of Dermatology (IVDK).

    PubMed

    Schnuch, A; Geier, J; Lessmann, H; Arnold, R; Uter, W

    2012-07-01

Contact allergy (CA) surveillance networks provide information to a multitude of stakeholders, which is indispensable for evidence-based decision-making in the field of prevention. The methods and results of the German surveillance system on CA are reviewed and discussed with reference to other systems. The German network comprises 56 departments of dermatology and includes all patients who are patch-tested for suspected CA. Data analysis considers the results of patch testing and further pertinent information for each patient. The following aspects are addressed: (i) description of the clinical population, (ii) evaluation of patch test reactions, and (iii) the relationship between patch test results and population characteristics. Trend analyses on chromate (decreasing), epoxy resin (increasing) and nickel (heterogeneous) serve as examples of surveillance system analyses, with the identification of sentinel events as well as proof of success or failure of prevention. In addition, external data sources can be used, such as sales data of patch test preparations, to estimate frequencies of sensitization at the population level. National prescription data of drugs and statistics on the labelling of preservatives on cosmetics can also be included, the latter two approaches allowing for estimates of the risk conferred by specific allergens. PMID:22563651

  19. Genetically informative research on adolescent substance use: methods, findings and challenges

    PubMed Central

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

Objective To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method We present a selective review of genetically informative research strategies, their limitations, and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents. Results Adoption, twin and extended family designs have established that there is a strong heritable component to liability to nicotine, alcohol and illicit drug dependence in adults. However, shared environmental influences are relatively stronger in youth samples and at earlier stages of substance involvement (e.g., use). There is considerable overlap in the genetic influences associated with abuse/dependence across drug classes, while shared genetic influences also contribute to the commonly observed associations between substance use disorders and both externalizing and, to a lesser extent, internalizing psychopathology. Rapid technological advances have made the identification of specific gene variants that influence risks for substance use disorders feasible, and linkage and association studies (including genome-wide association studies) have identified promising candidate genes implicated in the development of substance use disorders. Conclusions Studies using genetically informative research designs, including those that examine aggregate genetic factors and those examining specific gene variants, individually and in interaction with environmental influences, offer promising avenues not only for delineating genetic effects on substance use disorders but also for understanding the unfolding of risk across development and the interaction between environmental and genetic factors in the etiology of these disorders. PMID:21093770

  20. Information content and analysis methods for Multi-Modal High-Throughput Biomedical Data

    NASA Astrophysics Data System (ADS)

    Ray, Bisakha; Henaff, Mikael; Ma, Sisi; Efstathiadis, Efstratios; Peskin, Eric R.; Picone, Marco; Poli, Tito; Aliferis, Constantin F.; Statnikov, Alexander

    2014-03-01

The spectrum of modern molecular high-throughput assaying includes diverse technologies such as microarray gene expression, miRNA expression, proteomics, DNA methylation, among many others. Now that these technologies have matured and become increasingly accessible, the next frontier is to collect "multi-modal" data for the same set of subjects and conduct integrative, multi-level analyses. While multi-modal data does contain distinct biological information that can be useful for answering complex biology questions, its value for predicting clinical phenotypes and contributions of each type of input remain unknown. We obtained 47 datasets/predictive tasks that in total span over 9 data modalities and executed analytic experiments for predicting various clinical phenotypes and outcomes. First, we analyzed each modality separately using uni-modal approaches based on several state-of-the-art supervised classification and feature selection methods. Then, we applied integrative multi-modal classification techniques. We have found that gene expression is the most predictively informative modality. Other modalities such as protein expression, miRNA expression, and DNA methylation also provide highly predictive results, which are often statistically comparable but not superior to gene expression data. Integrative multi-modal analyses generally do not increase predictive signal compared to gene expression data.

  1. Information content and analysis methods for Multi-Modal High-Throughput Biomedical Data

    PubMed Central

    Ray, Bisakha; Henaff, Mikael; Ma, Sisi; Efstathiadis, Efstratios; Peskin, Eric R.; Picone, Marco; Poli, Tito; Aliferis, Constantin F.; Statnikov, Alexander

    2014-01-01

    The spectrum of modern molecular high-throughput assaying includes diverse technologies such as microarray gene expression, miRNA expression, proteomics, DNA methylation, among many others. Now that these technologies have matured and become increasingly accessible, the next frontier is to collect “multi-modal” data for the same set of subjects and conduct integrative, multi-level analyses. While multi-modal data does contain distinct biological information that can be useful for answering complex biology questions, its value for predicting clinical phenotypes and contributions of each type of input remain unknown. We obtained 47 datasets/predictive tasks that in total span over 9 data modalities and executed analytic experiments for predicting various clinical phenotypes and outcomes. First, we analyzed each modality separately using uni-modal approaches based on several state-of-the-art supervised classification and feature selection methods. Then, we applied integrative multi-modal classification techniques. We have found that gene expression is the most predictively informative modality. Other modalities such as protein expression, miRNA expression, and DNA methylation also provide highly predictive results, which are often statistically comparable but not superior to gene expression data. Integrative multi-modal analyses generally do not increase predictive signal compared to gene expression data. PMID:24651673
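
    To make the comparison in the abstract concrete, the sketch below fits the same classifier to each modality alone and then to the concatenated features, comparing cross-validated AUCs. It is a minimal illustration on synthetic stand-in data; the modality names, effect sizes and classifier choice are assumptions, not the study's actual datasets or pipeline.

```python
# Sketch of the uni-modal vs multi-modal comparison on synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 300
labels = rng.integers(0, 2, n)

# Hypothetical modalities with different signal strengths (all invented).
modalities = {
    "gene_expression": labels[:, None] * 1.0 + rng.normal(size=(n, 50)),
    "methylation":     labels[:, None] * 0.4 + rng.normal(size=(n, 30)),
    "mirna":           labels[:, None] * 0.3 + rng.normal(size=(n, 20)),
}

def cv_auc(X):
    clf = LogisticRegression(max_iter=2000)
    return cross_val_score(clf, X, labels, cv=5, scoring="roc_auc").mean()

for name, X in modalities.items():
    print(f"{name:16s} AUC={cv_auc(X):.3f}")
# Integrative analysis: simple feature concatenation across modalities.
print(f"{'all combined':16s} AUC={cv_auc(np.hstack(list(modalities.values()))):.3f}")
```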

  2. Information and treatment of unknown correlations in the combination of measurements using the BLUE method

    NASA Astrophysics Data System (ADS)

    Valassi, Andrea; Chierici, Roberto

    2014-03-01

    We discuss the effect of large positive correlations in the combinations of several measurements of a single physical quantity using the Best Linear Unbiased Estimate (BLUE) method. We suggest a new approach for comparing the relative weights of the different measurements in their contributions to the combined knowledge about the unknown parameter, using the well-established concept of Fisher information. We argue, in particular, that one contribution to information comes from the collective interplay of the measurements through their correlations and that this contribution cannot be attributed to any of the individual measurements alone. We show that negative coefficients in the BLUE weighted average invariably indicate the presence of a regime of high correlations, where the effect of further increasing some of these correlations is that of reducing the error on the combined estimate. In these regimes, we stress that assuming fully correlated systematic uncertainties is not a truly conservative choice, and that the correlations provided as input to BLUE combinations need to be assessed with extreme care instead. In situations where the precise evaluation of these correlations is impractical, or even impossible, we provide tools to help experimental physicists perform more conservative combinations.
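
    The BLUE machinery described above can be illustrated with a small numerical sketch: the combined estimate uses weights w = C^{-1}1 / (1^T C^{-1}1), and as the correlation between two measurements grows, one weight turns negative while the combined error can shrink. The measurement values and uncertainties below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of a BLUE combination of two correlated measurements
# (illustrative numbers only; not taken from the paper).
import numpy as np

def blue_combine(values, cov):
    """Best Linear Unbiased Estimate of a single quantity from correlated
    measurements: weights w = C^-1 1 / (1^T C^-1 1)."""
    values = np.asarray(values, dtype=float)
    cov = np.asarray(cov, dtype=float)
    ones = np.ones(len(values))
    cinv = np.linalg.inv(cov)
    weights = cinv @ ones / (ones @ cinv @ ones)
    estimate = weights @ values
    info = ones @ cinv @ ones            # Fisher information of the combination
    return estimate, 1.0 / info, weights  # (estimate, combined variance, weights)

# Two measurements with errors 1.0 and 2.0 and correlation rho.
for rho in (0.0, 0.4, 0.8):
    cov = np.array([[1.0**2, rho * 1.0 * 2.0],
                    [rho * 1.0 * 2.0, 2.0**2]])
    est, var, w = blue_combine([10.0, 11.0], cov)
    print(f"rho={rho:.1f}  estimate={est:.3f}  error={var**0.5:.3f}  weights={w.round(3)}")
```

    At the highest correlation the second weight becomes negative and the combined error drops below the error of the better measurement, which is the regime the paper cautions about.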

  3. Retrieval of Aerosol information from UV measurement by using optimal estimation method

    NASA Astrophysics Data System (ADS)

    KIM, M.; Kim, J.; Jeong, U.; Kim, W. V.; Kim, S. K.; Lee, S. D.; Moon, K. J.

    2014-12-01

An algorithm to retrieve aerosol optical depth (AOD), single scattering albedo (SSA), and aerosol loading height has been developed for GEMS (Geostationary Environment Monitoring Spectrometer) measurements. GEMS is planned for launch into geostationary orbit in 2018 and employs hyperspectral imaging with 0.6 nm resolution to observe solar backscatter radiation in the UV and visible range. In the UV range, the low surface contribution to the backscattered radiation and the strong interaction between aerosol absorption and molecular scattering are advantageous for retrieving aerosol information such as AOD and SSA [Torres et al., 2007; Torres et al., 2013; Ahn et al., 2014]. However, the large contribution of atmospheric scattering increases the sensitivity of the backscattered radiance to aerosol loading height, so the assumed loading height becomes an important factor in obtaining accurate results. Accordingly, this study focuses on the simultaneous retrieval of aerosol loading height together with AOD and SSA using the optimal estimation method. For the radiative transfer model simulation, aerosol optical properties were analyzed from AERONET inversion data (level 2.0) at 46 AERONET sites over Asia. A 2-channel inversion method is applied to estimate a priori values of the aerosol state for solving the Levenberg-Marquardt equation. The GEMS aerosol algorithm is tested with an OMI level-1B dataset, a provisional proxy for GEMS measurements, and the results are compared with the OMI standard aerosol product and AERONET values. The retrieved AOD and SSA show reasonable distributions compared with the OMI products and correlate well with the values measured by AERONET. However, the retrieval uncertainty in aerosol loading height is relatively larger than that of the other retrieved quantities.
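
    As a rough illustration of the optimal estimation step described above, the sketch below writes down the standard MAP cost function J(x) = (y - Kx)^T Se^-1 (y - Kx) + (x - xa)^T Sa^-1 (x - xa) and solves it in closed form for a toy linear forward model with a three-element state (AOD, SSA, layer height). The Jacobian, covariances and a priori values are placeholders, not the GEMS configuration.

```python
# Sketch of an optimal-estimation (MAP) retrieval for a toy linear forward
# model y = K x + noise, with an a priori state x_a; all values illustrative.
import numpy as np

rng = np.random.default_rng(0)

# State: [AOD, SSA, layer height] (units arbitrary in this toy example).
x_true = np.array([0.8, 0.9, 3.0])
x_a    = np.array([0.5, 0.95, 2.0])          # a priori (e.g. from a 2-channel inversion)
S_a    = np.diag([0.5**2, 0.1**2, 2.0**2])   # a priori covariance
K      = rng.normal(size=(6, 3))             # toy Jacobian (6 "radiances", 3 state elements)
S_e    = np.diag(np.full(6, 0.01**2))        # measurement-noise covariance
y      = K @ x_true + rng.normal(scale=0.01, size=6)

def oem_cost(x):
    r = y - K @ x
    d = x - x_a
    return r @ np.linalg.solve(S_e, r) + d @ np.linalg.solve(S_a, d)

# For a linear forward model the MAP solution is closed-form:
# x_hat = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K x_a)
A = K.T @ np.linalg.solve(S_e, K) + np.linalg.inv(S_a)
b = K.T @ np.linalg.solve(S_e, y - K @ x_a)
x_hat = x_a + np.linalg.solve(A, b)
S_hat = np.linalg.inv(A)                     # posterior covariance (retrieval uncertainty)

print("retrieved state:", x_hat.round(3))
print("cost at solution:", round(oem_cost(x_hat), 3))
print("1-sigma uncertainties:", np.sqrt(np.diag(S_hat)).round(3))
```

    For a nonlinear forward model the same cost function is minimized iteratively (Gauss-Newton or Levenberg-Marquardt), with the a priori term regularizing poorly constrained elements such as the loading height.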

  4. Informing HIV prevention efforts targeting Liberian youth: a study using the PLACE method in Liberia

    PubMed Central

    2013-01-01

Background Preventing HIV infection among young people is a priority for the Liberian government. Data on young people in Liberia are scarce but needed to guide HIV programming efforts. Methods We used the Priorities for Local AIDS Control Efforts (PLACE) method to gather information on risk behaviors that young people (ages 14 to 24) engage in or are exposed to that increase their vulnerability to HIV infection. Community informants identified 240 unique venues, of which 150 were visited and verified by research staff. Of these, 89 venues comprised our sampling frame, and 571 females and 548 males were interviewed in 50 venues using a behavioral survey. Results Ninety-one percent of females and 86% of males reported being sexually active. 56% of females and 47% of males reported they initiated sexual activity before the age of 15. Among the sexually active females, 71% reported they had received money or a gift for sex, and 56% of males reported they had given money or goods for sex. 20% of females and 6% of males reported that their first sexual encounter was forced, and 15% of females and 6% of males reported they had been forced to have sex in the past year. Multiple partnerships were common among both sexes, with 81% of females and 76% of males reporting one or more sex partners in the past four weeks. Less than 1% reported any experience with injecting drugs, and only 1% of males reported having sex with men. While knowledge of HIV/AIDS was high, prevention behaviors including HIV testing and condom use were low. Conclusion Youth-focused HIV efforts in Liberia need to address transactional sex and multiple and concurrent partnerships. HIV prevention interventions should include efforts to meet the economic needs of youth. PMID:24107301

  5. A preliminary investigation of genetic counselors’ information needs when receiving a variant of uncertain significance result: a mixed methods study

    PubMed Central

    Scherr, Courtney L.; Lindor, Noralane M.; Malo, Teri L.; Couch, Fergus J.; Vadaparampil, Susan T.

    2015-01-01

Purpose To explore genetic counselors' information preferences regarding reports of variant of uncertain significance (VUS) results from cancer genetic testing. Methods This mixed methods study (quantitative and qualitative approaches) utilized a survey of genetic counselors containing closed- and open-ended questions to explore genetic counselors' information needs and perceptions of the industry's current information sharing practices. Descriptive statistics were calculated for responses to the closed-ended questions, and thematic analysis guided the interpretation of the open-ended questions. Results Of the 267 participants (28.6% response rate), the majority indicated a perceived lack of information on VUS laboratory reports, were concerned about the perceived practice of withholding information, and stated the information they wanted to see. Although most did not indicate how additional information would be used, some reported they would provide information directly to patients, and others reported the information would be used to contextualize the VUS result when counseling patients. Conclusion This analysis identified the information genetic counselors believe is needed on VUS reports, indicating what they consider best practices in the absence of guidelines for laboratories currently providing genetic testing services, and implying a need for guidelines on reporting VUS. Future studies should explore how genetic counselors use additional information contained in VUS reports. PMID:25569439

  6. Information for School Administrators.

    ERIC Educational Resources Information Center

    Kowitz, Gerald T.; And Others

Modern management theory, based on the reduction of uncertainties, demands the collection and manipulation of large amounts of information. School administrators choke in the process of trying to digest a proliferation of data, only some of which are useful. The aims of the study were to explore the extent to which large amounts of data could be…

  7. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case studies in…

  8. Evaluation of television as a method of disseminating solar energy information

    SciTech Connect

    Edington, E.D.

    1980-08-01

This project included three separate studies undertaken to determine the effectiveness of television instruction as a method of delivering information about solar energy systems to present and future workers in related industries, and as a delivery method for adult continuing education instruction. All three studies used a series of five half-hour videotapes. A survey of the general public in the Las Cruces area was conducted to determine the number and occupational status of people watching the series on a local public broadcasting station. Surveys of random samples then assessed the prior level of knowledge of solar energy among residents from differing socioeconomic strata and any increase in knowledge after viewing the series. The worker study included apprentices, journeymen sheet metal workers, and materials handlers. These workers were given a pretest, shown the television series, and given a posttest. The relative effectiveness of television and regular instruction was compared in solar-energy-related vocational classes at two postsecondary institutions, and student attitudes concerning television instruction were assessed.

  9. Generalized Cross Entropy Method for estimating joint distribution from incomplete information

    NASA Astrophysics Data System (ADS)

    Xu, Hai-Yan; Kuo, Shyh-Hao; Li, Guoqi; Legara, Erika Fille T.; Zhao, Daxuan; Monterola, Christopher P.

    2016-07-01

Obtaining a full joint distribution from individual marginal distributions with incomplete information is a non-trivial task that continues to challenge researchers from various domains, including economics, demography, and statistics. In this work, we develop a new methodology, referred to as the "Generalized Cross Entropy Method" (GCEM), aimed at addressing this issue. The objective function is a weighted sum of divergences between the joint distribution and various references. We show that the solution of the GCEM is unique and globally optimal. Furthermore, we illustrate the applicability and validity of the method by using it to recover the joint distribution of household profiles in a given administrative region. In particular, we estimate the joint distribution of household size, household dwelling type, and household home ownership in Singapore. Results show a high-accuracy estimation of the full joint distribution of the household profile under study. Finally, the impact of constraints and weights on the estimation of the joint distribution is explored.
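
    The abstract does not spell out the GCEM objective, so the sketch below shows a simpler, standard cross-entropy baseline for the same task: recovering a joint table from target marginals by minimizing KL divergence to a reference joint via iterative proportional fitting. It is a stand-in for the general idea, not the paper's weighted multi-reference formulation; the toy household table is invented.

```python
# Sketch of recovering a joint distribution from marginals by minimizing
# KL divergence to a reference joint (iterative proportional fitting).
# This is a standard cross-entropy baseline, not the paper's GCEM objective.
import numpy as np

def ipf(reference, row_marginal, col_marginal, iters=200):
    """Fit a joint table matching the target marginals while staying as
    close as possible (in KL divergence) to the reference joint."""
    p = reference.astype(float).copy()
    for _ in range(iters):
        p *= (row_marginal / p.sum(axis=1))[:, None]   # match row sums
        p *= (col_marginal / p.sum(axis=0))[None, :]   # match column sums
    return p

# Toy example: household size (rows) x dwelling type (columns), all invented.
reference = np.array([[0.10, 0.05, 0.05],
                      [0.10, 0.20, 0.10],
                      [0.05, 0.20, 0.15]])
row_marginal = np.array([0.3, 0.4, 0.3])   # target household-size distribution
col_marginal = np.array([0.2, 0.5, 0.3])   # target dwelling-type distribution

joint = ipf(reference, row_marginal, col_marginal)
print(joint.round(3))
print("row sums:", joint.sum(axis=1).round(3), " col sums:", joint.sum(axis=0).round(3))
```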

  10. A practical method for skin dose estimation in interventional cardiology based on fluorographic DICOM information.

    PubMed

    Matthews, Lucy; Dixon, Matthew; Rowles, Nick; Stevens, Greg

    2016-03-01

    A practical method for skin dose estimation for interventional cardiology patients has been developed to inform pre-procedure planning and post-procedure patient management. Absorbed dose to the patient skin for certain interventional radiology procedures can exceed thresholds for deterministic skin injury, requiring documentation within the patient notes and appropriate patient follow-up. The primary objective was to reduce uncertainty associated with current methods, particularly surrounding field overlap. This was achieved by considering rectangular field geometry incident on a spherical patient model in a polar coordinate system. The angular size of each field was quantified at surface of the sphere, i.e. the skin surface. Computer-assisted design software enabled the modelling of a sufficient dataset that was subsequently validated with radiochromic film. Modelled overlap was found to agree with overlap measured using film to within 2.2° ± 2.0°, showing that the overall error associated with the model was < 1 %. Mathematical comparison against exposure data extracted from procedural Digital Imaging and Communication in Medicine files was used to generate a graphical skin dose map, demonstrating the dose distribution over a sphere centred at the interventional reference point. Dosimetric accuracy of the software was measured as between 3.5 and 17 % for different variables. PMID:25994848

  11. A Rapid Monitoring and Evaluation Method of Schistosomiasis Based on Spatial Information Technology.

    PubMed

    Wang, Yong; Zhuang, Dafang

    2015-12-01

Thanks to Spatial Information Technologies (SITs) such as Remote Sensing (RS) and Geographical Information Systems (GIS) that are being rapidly developed and updated, SITs are being used more widely in the public health field. Using SITs to study the temporal and spatial distribution of Schistosoma japonicum and to assess the risk of infection, thereby providing methods for the control and prevention of schistosomiasis japonica, has gradually become a hot topic in the field. The purpose of the present paper was to use RS and GIS technology to develop an efficient method for predicting and assessing the risk of schistosomiasis japonica. We chose the Yueyang region, close to east DongTing Lake (Hunan Province, China), as the study area, where a recent serious outbreak of schistosomiasis japonica took place. We monitored and evaluated the transmission risk of schistosomiasis japonica in the region using SITs. Water distribution data were extracted from RS images. Ground temperature, ground humidity and a vegetation index were calculated from RS images. Additionally, the density of oncomelania snails, the intermediate host of Schistosoma japonicum, was calculated on the basis of RS data and field measurements. The spatial distribution of oncomelania snails was explored using SITs in order to estimate the area around residents with transmission risk of schistosomiasis japonica. Our results demonstrated that: (1) the risk factors for the transmission of schistosomiasis japonica were closely related to the living environment of oncomelania snails; key factors such as water distribution, ground temperature, ground humidity and vegetation index can be quickly obtained and calculated from RS images; (2) using GIS technology and an RS deduction technique along with statistical regression models, the density distribution model of oncomelania snails could be quickly built; (3) using SITs and analysis with overlaying population

  12. A Rapid Monitoring and Evaluation Method of Schistosomiasis Based on Spatial Information Technology

    PubMed Central

    Wang, Yong; Zhuang, Dafang

    2015-01-01

Thanks to Spatial Information Technologies (SITs) such as Remote Sensing (RS) and Geographical Information Systems (GIS) that are being rapidly developed and updated, SITs are being used more widely in the public health field. Using SITs to study the temporal and spatial distribution of Schistosoma japonicum and to assess the risk of infection, thereby providing methods for the control and prevention of schistosomiasis japonica, has gradually become a hot topic in the field. The purpose of the present paper was to use RS and GIS technology to develop an efficient method for predicting and assessing the risk of schistosomiasis japonica. We chose the Yueyang region, close to east DongTing Lake (Hunan Province, China), as the study area, where a recent serious outbreak of schistosomiasis japonica took place. We monitored and evaluated the transmission risk of schistosomiasis japonica in the region using SITs. Water distribution data were extracted from RS images. Ground temperature, ground humidity and a vegetation index were calculated from RS images. Additionally, the density of oncomelania snails, the intermediate host of Schistosoma japonicum, was calculated on the basis of RS data and field measurements. The spatial distribution of oncomelania snails was explored using SITs in order to estimate the area around residents with transmission risk of schistosomiasis japonica. Our results demonstrated that: (1) the risk factors for the transmission of schistosomiasis japonica were closely related to the living environment of oncomelania snails; key factors such as water distribution, ground temperature, ground humidity and vegetation index can be quickly obtained and calculated from RS images; (2) using GIS technology and an RS deduction technique along with statistical regression models, the density distribution model of oncomelania snails could be quickly built; (3) using SITs and analysis with overlaying population

  13. A generic model for data acquisition: Connectionist methods of information processing

    NASA Astrophysics Data System (ADS)

    Ehrlich, Jacques

    1993-06-01

EDDAKS (Event Driven Data Acquisition Kernel System), for the quality control of products created in industrial production processes, is proposed. It is capable of acquiring information about discrete-event systems by synchronizing to them via the events. EDDAKS consists of EdObjects, forming a hierarchy, which react to EdEvents and perform processing operations on messages. The hierarchy of EdObjects consists (from bottom up) of the Sensor, the Phase, the Extracter, the Dynamic Spreadsheet, and EDDAKS itself. The first three levels contribute to building the internal representation: a state vector characterizing a product in the course of production. The Dynamic Spreadsheet is a parameterizable processing structure used to perform calculations on a set of internal representations in order to deliver the external representation to the user. A system intended for quality control of the products delivered by a concrete production plant was generated by EDDAKS and used to validate the approach. Processing methods using the multilayer perceptron model were also considered, and two contributions aimed at improving the performance of this network are proposed. One consists of implementing a conjugate gradient method; its effectiveness depends on the determination of an optimum gradient step, which is efficiently calculated by a linear search using a secant algorithm. The other is intended to reduce the connectivity of the network by adapting it to the problem to be solved: links having little or no activity are identified and destroyed, with activity determined by evaluating the covariance between each of the inputs of a cell and its output. An experiment in which nonlinear prediction is applied to a civil engineering problem is described.

  14. Nonparametric Methods for Incorporating Genomic Information Into Genetic Evaluations: An Application to Mortality in Broilers

    PubMed Central

    González-Recio, Oscar; Gianola, Daniel; Long, Nanye; Weigel, Kent A.; Rosa, Guilherme J. M.; Avendaño, Santiago

    2008-01-01

    Four approaches using single-nucleotide polymorphism (SNP) information (F∞-metric model, kernel regression, reproducing kernel Hilbert spaces (RKHS) regression, and a Bayesian regression) were compared with a standard procedure of genetic evaluation (E-BLUP) of sires using mortality rates in broilers as a response variable, working in a Bayesian framework. Late mortality (14–42 days of age) records on 12,167 progeny of 200 sires were precorrected for fixed and random (nongenetic) effects used in the model for genetic evaluation and for the mate effect. The average of the corrected records was computed for each sire. Twenty-four SNPs seemingly associated with late mortality were included in three methods used for genomic assisted evaluations. One thousand SNPs were included in the Bayesian regression, to account for markers along the whole genome. The posterior mean of heritability of mortality was 0.02 in the E-BLUP approach, suggesting that genetic evaluation could be improved if suitable molecular markers were available. Estimates of posterior means and standard deviations of the residual variance were 24.38 (3.88), 29.97 (3.22), 17.07 (3.02), and 20.74 (2.87) for E-BLUP, the linear model on SNPs, RKHS regression, and the Bayesian regression, respectively, suggesting that RKHS accounted for more variance in the data. The two nonparametric methods (kernel and RKHS regression) fitted the data better, having a lower residual sum of squares. Predictive ability, assessed by cross-validation, indicated advantages of the RKHS approach, where accuracy was increased from 25 to 150%, relative to other methods. PMID:18430951
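
    A minimal sketch of the RKHS-style approach on synthetic data is given below: SNP genotypes coded 0/1/2 are fed into Gaussian-kernel ridge regression and assessed by cross-validation. The simulated genotypes, effect sizes, kernel bandwidth and penalty are illustrative assumptions, not the broiler dataset or the exact Bayesian implementation used in the study.

```python
# Sketch of RKHS-style genomic prediction: kernel ridge regression on a
# simulated SNP matrix (0/1/2 genotype codes); all data are synthetic.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_sires, n_snps = 200, 24
X = rng.integers(0, 3, size=(n_sires, n_snps)).astype(float)   # SNP genotypes
effects = rng.normal(scale=0.5, size=n_snps)
y = X @ effects + rng.normal(scale=2.0, size=n_sires)          # corrected phenotype (e.g. mortality)

# Gaussian-kernel regression (an RKHS method); bandwidth and penalty are
# illustrative and would normally be tuned by cross-validation.
model = KernelRidge(kernel="rbf", gamma=1.0 / n_snps, alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", scores.round(3))
```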

  15. Healthcare information systems: data mining methods in the creation of a clinical recommender system

    NASA Astrophysics Data System (ADS)

    Duan, L.; Street, W. N.; Xu, E.

    2011-05-01

Recommender systems have been extensively studied to present items, such as movies, music and books, that are likely to be of interest to the user. Researchers have indicated that integrated medical information systems are becoming an essential part of modern healthcare systems. Such systems have evolved into integrated enterprise-wide systems; in particular, they are considered a type of enterprise information system, or ERP system, addressing the needs of the healthcare sector. As part of these efforts, nursing care plan recommender systems can provide clinical decision support, nursing education, and clinical quality control, and serve as a complement to existing practice guidelines. We propose to use correlations among nursing diagnoses, outcomes and interventions to create a recommender system for constructing nursing care plans. In the current study, we used nursing diagnosis data to develop the methodology. Our system utilises a prefix-tree structure common in itemset mining to construct a ranked list of suggested care plan items based on previously-entered items. Unlike common commercial systems, our system makes sequential recommendations based on user interaction, modifying a ranked list of suggested items at each step in care plan construction. We rank items based on traditional association-rule measures such as support and confidence, as well as a novel measure that anticipates which selections might improve the quality of future rankings. Since the multi-step nature of our recommendations presents problems for traditional evaluation measures, we also present a new evaluation method based on average ranking position and use it to test the effectiveness of different recommendation strategies.
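
    The support/confidence ranking idea can be sketched in a few lines: given the items already placed in a care plan, candidate items are ranked by the confidence of the rule (selected items, then candidate) estimated from past plans. The care-plan item names and data below are invented, and the prefix-tree indexing and the paper's forward-looking measure are omitted for brevity.

```python
# Sketch of a sequential care-plan recommender: rank candidate items by the
# confidence of association rules given the items already selected.
# Care-plan contents are invented for illustration.
from collections import Counter

past_plans = [
    {"acute_pain", "impaired_mobility", "fall_risk"},
    {"acute_pain", "fall_risk"},
    {"impaired_mobility", "fall_risk", "self_care_deficit"},
    {"acute_pain", "impaired_mobility"},
    {"impaired_mobility", "self_care_deficit"},
]

def recommend(selected, plans, top_n=3):
    """Rank unselected items by confidence = support(selected + item) / support(selected)."""
    selected = set(selected)
    support_sel = sum(1 for p in plans if selected <= p)
    if support_sel == 0:
        return []
    scores = Counter()
    for plan in plans:
        if selected <= plan:
            for item in plan - selected:
                scores[item] += 1
    return [(item, count / support_sel) for item, count in scores.most_common(top_n)]

# After the user enters one diagnosis, suggest the next items; in a sequential
# system the ranking is recomputed after every selection.
print(recommend({"impaired_mobility"}, past_plans))
```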

  16. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs

    PubMed Central

    Bayliss, Elizabeth A.; Powers, J. David; Ellis, Jennifer L.; Barrow, Jennifer C.; Strobel, MaryJo; Beck, Arne

    2016-01-01

Purpose: Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. Methods: We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ) to develop a predictive model for being in the top 25 percent of cost of care, then applied cluster analytic techniques to identify distinct high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. Results: BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0–4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. Predicted probabilities of being in the top 25 percent of cost ranged from 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included those with high morbidity, prior utilization, depression risk and financial constraints; previously uninsured individuals with high morbidity but few financial constraints; and relatively healthy, previously insured individuals with medication needs. Conclusions: Applying sequential predictive modeling and cluster analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery. PMID:27563684
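
    A compact sketch of the two-stage analysis is shown below, assuming synthetic questionnaire responses: a logistic model predicts the probability of landing in the top cost quartile, and k-means then partitions the predicted high-cost members into candidate subgroups. Feature names, effect sizes and the number of clusters are placeholders, not the BHQ items or the study's fitted model.

```python
# Sketch of sequential predictive modeling followed by clustering of the
# predicted high-cost subgroup; all data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([
    rng.integers(0, 5, n),        # self-reported health status (0 best .. 4 worst)
    rng.integers(0, 5, n),        # number of chronic conditions
    rng.integers(0, 2, n),        # ED visit in prior year
    rng.integers(0, 2, n),        # previously uninsured
    rng.normal(45, 15, n),        # age
])
risk = 0.8 * X[:, 0] + 0.9 * X[:, 1] + 1.2 * X[:, 2] + 0.02 * X[:, 4] + rng.normal(0, 1, n)
top_quartile = (risk > np.quantile(risk, 0.75)).astype(int)   # proxy for top 25% of cost

model = LogisticRegression(max_iter=1000).fit(X, top_quartile)
p_high = model.predict_proba(X)[:, 1]

# Cluster only the members predicted to be high cost.
high = X[p_high > 0.5]
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(StandardScaler().fit_transform(high))
for k in range(3):
    members = high[km.labels_ == k]
    print(f"cluster {k}: n={len(members)}, mean profile={members.mean(axis=0).round(2)}")
```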

  17. Where is information quality lost at clinical level? A mixed-method study on information systems and data quality in three urban Kenyan ANC clinics

    PubMed Central

    Hahn, Daniel; Wanjala, Pepela; Marx, Michael

    2013-01-01

Background Well-working health information systems are considered vital, with the quality of health data ranked of highest importance for decision making at patient care and policy levels. In particular, health facilities play an important role, since they are not only the entry point for the national health information system but also themselves use health data, primarily for patient care. Design A multiple case study was carried out between March and August 2012 at the antenatal care (ANC) clinics of two private and one public Kenyan hospital to describe clinical information systems and assess the quality of information. The following methods were developed and employed in an iterative process: workplace walkthroughs, structured and in-depth interviews with staff members, and a quantitative assessment of data quality (completeness and accurate transmission of clinical information and reports in ANC). Views of staff and management on the quality of the employed information systems, data quality, and influencing factors were captured qualitatively. Results Staff rated the quality of information higher in the private hospitals employing computers than in the public hospital, which relies on paper forms. Several potential threats to data quality were reported. Limitations in data quality were common at all study sites, including wrong test results, missing registers, and inconsistencies in reports. Feedback on the content or quality of reports was seldom given, and usage of data beyond individual patient care was low. Conclusions We argue that the limited data quality has to be seen in the broader perspective of the information systems in which it is produced and used. The combination of different methods has proven useful for this. To improve the effectiveness and capabilities of these systems, combined measures are needed which include technical and organizational aspects (e.g. regular feedback to health workers) as well as individual skills and motivation. PMID:23993022

  18. The JPL Tropical Cyclone Information System: Methods for Creating Near Real-Time Science Data Portals

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Li, P.; Vu, Q.; Hristova-Veleva, S. M.; Turk, F. J.; Shen, T.; Poulsen, W. L.; Lambrigtsen, B.

    2013-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The JPL TCIS was made public in 2008 and initially served as a data and plot archive for past storms. More recently, the TCIS has expanded its functionality to provide near real-time (NRT) data portals for specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign in 2010 and ongoing Hurricane and Severe Storm Sentinel (HS3) campaign. These NRT portals allow campaign team members to look at current conditions in the geographical domain of interest. Creating the NRT portals has been particularly challenging due to (1) the wide breadth of data that needs to be collected, (2) the number of data product plots that need to be served to the user, (3) the mechanics of the search and discovery tools, and (4) the issue of how to display multiple data plots at once in a meaningful way. Recently, the TCIS team has been working to redevelop the NRT portals with these challenges in mind. The new architecture we created allows for configurable mission portals that can be created on the fly. In addition to a new database that handles portal configuration, these updated NRT portals also support an improved navigation method that allows users to see what data is available, as well as a resizable visualization area based on the users' client. The integration of the NRT portal with the NASA Earth Observing System Simulators Suite (NEOS3) and a set of new online data analysis tools allows users to compare the observation and model outputs directly and perform statistical analysis with multiple datasets. In this poster, we will present the methods and practices we used to create configurable portals, gather and plot science data with low latencies, design a navigation scheme that supports multiple

  19. Assessing the Government Information Locator Service (GILS): A Multi-Method Approach for Evaluating Networked Services.

    ERIC Educational Resources Information Center

    Moen, William E.; McClure, Charles R.; Koelker, June

    1997-01-01

    Describes a multimethod approach used to evaluate the Government Information Locator Service (GILS). Highlights the limitations and opportunities of available approaches to evaluating complex characteristics of networked information services and digital collections. (Author/AEF)

  20. The Evolution of Library Instruction Delivery in the Chemistry Curriculum Informed by Mixed Assessment Methods

    ERIC Educational Resources Information Center

    Mandernach, Meris A.; Shorish, Yasmeen; Reisner, Barbara A.

    2014-01-01

    As information continues to evolve over time, the information literacy expectations for chemistry students also change. This article examines transformations to an undergraduate chemistry course that focuses on chemical literature and information literacy and is co-taught by a chemistry professor and a chemistry librarian. This article also…

  1. Discriminating micropathogen lineages and their reticulate evolution through graph theory-based network analysis: the case of Trypanosoma cruzi, the agent of Chagas disease.

    PubMed

    Arnaud-Haond, Sophie; Moalic, Yann; Barnabé, Christian; Ayala, Francisco José; Tibayrenc, Michel

    2014-01-01

Micropathogens (viruses, bacteria, fungi, parasitic protozoa) share a common trait, partial clonality, with wide variance in the respective influence of clonality and sexual recombination on the dynamics and evolution of taxa. The discrimination of distinct lineages and the reconstruction of their phylogenetic history are key information for inferring their biomedical properties. However, the phylogenetic picture is often clouded by occasional events of recombination across divergent lineages, limiting the relevance of classical phylogenetic analysis and dichotomic trees. We have applied a network analysis based on graph theory to illustrate the relationships among genotypes of Trypanosoma cruzi, the parasitic protozoan responsible for Chagas disease, to identify major lineages and to unravel their past history of divergence and possible recombination events. At the scale of T. cruzi subspecific diversity, graph theory-based networks applied to 22 isoenzyme loci (262 distinct multilocus enzyme electrophoresis genotypes, MLEE) and 19 microsatellite loci (66 multilocus genotypes, MLG) fully confirm the high clustering of genotypes into major lineages or "near-clades". Releasing the dichotomic constraint associated with the phylogenetic reconstruction usually applied to multilocus data allows the identification of putative hybrids and their parental lineages. The reticulate topology suggests a slightly different history for some of the main "near-clades", and a possibly more complex origin for the putative hybrids than hitherto proposed. Finally, the sub-network of the near-clade T. cruzi I (28 MLG) shows a clustering subdivision into three differentiated lesser near-clades ("Russian doll pattern"), which confirms a hypothesis recently proposed by other investigators. The present study broadens and clarifies the hypotheses previously obtained from classical markers on the same sets of data, which demonstrates the added value of this approach. This underlines the potential of graph
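
    A toy version of the graph-theoretic clustering can be sketched as follows: multilocus genotypes become nodes, edges connect genotypes whose allele-sharing distance falls below a threshold, and "near-clades" are read off the connected components. The simulated genotypes, the distance definition and the threshold are assumptions for illustration; the study's actual MLEE/MLG data and network construction are richer.

```python
# Sketch of a graph-theory view of multilocus genotypes: cluster genotypes by
# thresholding a pairwise allele-sharing distance. Data are simulated, not
# the T. cruzi datasets of the study.
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)

# Simulate three divergent lineages typed at 20 loci (integer allele codes),
# with rare within-lineage allelic variation.
centres = rng.integers(0, 8, size=(3, 20))
genotypes = np.vstack([c + (rng.random((15, 20)) < 0.1).astype(int) for c in centres])

def allele_distance(a, b):
    return float(np.mean(a != b))            # fraction of loci with different alleles

G = nx.Graph()
G.add_nodes_from(range(len(genotypes)))
for i in range(len(genotypes)):
    for j in range(i + 1, len(genotypes)):
        d = allele_distance(genotypes[i], genotypes[j])
        if d < 0.45:                          # threshold chosen for illustration only
            G.add_edge(i, j, weight=1.0 - d)

components = list(nx.connected_components(G))
print(f"{len(components)} clusters of sizes {[len(c) for c in components]}")
```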

  2. [Guideline for the Development of Evidence-based Patient Information: insights into the methods and implementation of evidence-based health information].

    PubMed

    Lühnen, Julia; Albrecht, Martina; Hanßen, Käthe; Hildebrandt, Julia; Steckelberg, Anke

    2015-01-01

The "Guideline for the Development of Evidence-based Patient Information" project is a novelty. The aim of the project is to enhance the quality of health information. The development and implementation process is guided by national and international standards, and the involvement of health information developers plays an essential role. This article provides an insight into the guideline's underlying methodology, using graphics as an example. In addition, the results of a qualitative study exploring the competencies of health information developers are presented; these results will guide the implementation of the guideline. We conducted systematic literature searches (until June 2014), critical appraisal and descriptive analyses, applying GRADE, for two selected guideline questions. Out of 3,287 hits, 11 RCTs were included in the analysis. The evidence was rated to be of low to moderate quality. Additional graphics may have a positive effect on cognitive outcomes; however, the relevance of the results is questionable. For graphics, we found some indication that pictograms in particular, but also bar graphs, have a positive effect on cognitive outcomes and meet patients' preferences. In order to prepare for the implementation of the guideline, we conducted a qualitative study to explore the competencies of health information developers using expert interviews. Four telephone interviews were conducted, audio recorded, transcribed and analysed according to Grounded Theory. Six categories were identified: literature search, development of health information, participation of target groups, continuing education and further training of health information developers, cooperation with different institutions, and essential competencies. Levels of competence regarding the methods of evidence-based medicine and evidence-based health information vary considerably and indicate a need for training. These results have informed the development of a training programme that will support the

  3. An Innovative Mixed Methods Approach to Studying the Online Health Information Seeking Experiences of Adults with Chronic Health Conditions

    ERIC Educational Resources Information Center

    Mayoh, Joanne; Bond, Carol S.; Todres, Les

    2012-01-01

    This article presents an innovative sequential mixed methods approach to researching the experiences of U.K. adults with chronic health conditions seeking health information online. The use of multiple methods integrated within a single study ensured that the focus of the research was emergent and relevant and ultimately provided a more complete…

  4. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, Michael A.

    1997-01-01

A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A 1 MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.

  5. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, M.A.

    1997-01-07

A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A 1 MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.
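
    The digit-timed anti-collision scheme lends itself to a small simulation, sketched below under simplifying assumptions: each tag answers in a time slot derived from the leading digits of its serial number, and after a collided round one more digit is used to spread the replies further. Slot timing, round structure and tag counts are invented; the patent's verification handshake is not modelled.

```python
# Toy simulation of the digit-timed anti-collision idea: tags pick response
# slots from successive digits of their serial numbers, widening the slot
# space after unresolved collisions. Greatly simplified for illustration.
import random
from collections import defaultdict

random.seed(0)
tags = [f"{n:06d}" for n in random.sample(range(10**6), 50)]  # unique 6-digit serials
unread = set(tags)

for depth in range(1, 7):                 # use one more digit per round
    slots = defaultdict(list)
    for serial in unread:
        slot = int(serial[:depth])        # response slot from the leading digits
        slots[slot].append(serial)
    for slot, group in slots.items():
        if len(group) == 1:               # clear, collision-free response is read
            unread.discard(group[0])
    print(f"round {depth}: {len(unread)} tags still unread")
```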

  6. Methods for Automated Identification of Informative Behaviors in Natural Bioptic Driving

    PubMed Central

    Luo, Gang; Peli, Eli

    2012-01-01

    Visually impaired people may legally drive if wearing bioptic telescopes in some developed countries. To address the controversial safety issue of the practice, we have developed a low cost in-car recording system that can be installed in study participants’ own vehicles to record their daily driving activities. We also developed a set of automated identification techniques of informative behaviors to facilitate efficient manual review of important segments submerged in the vast amount of uncontrolled data. Here we present the methods and quantitative results of the detection performance for six types of driving maneuvers and behaviors that are important for bioptic driving: bioptic telescope use, turns, curves, intersections, weaving, and rapid stops. The testing data were collected from one normally sighted and two visually impaired subjects across multiple days. The detection rates ranged from 82% up to 100%, and the false discovery rates ranged from 0% to 13%. In addition, two human observers were able to interpret about 80% of targets viewed through the telescope. These results indicate that with appropriate data processing the low-cost system is able to provide reliable data for natural bioptic driving studies. PMID:22514200

  7. Using geostatistical methods to inform field sampling of heterogeneous fractured media

    NASA Astrophysics Data System (ADS)

    Pollyea, R.; Fairley, J.

    2008-12-01

Acquiring data at appropriately fine scales to capture heterogeneous features for use in stochastic simulation can be costly when relying on borehole data, lidar, or other geophysical methods. Additionally, relying on available borehole data typically yields lateral resolution inadequate for assessing horizontal correlation structure. Here we present a methodology for developing a field sampling plan in fractured basalt using a published fracture map of a basalt surface exposure. Using a sub-meter sample grid, the correlation range of the fractures on the published fracture map was determined. Field sampling was then emulated by removing samples and reassessing the correlation range. The sample spacing chosen for field sampling was the largest that maintained the original correlation structure. This sample spacing was then implemented to obtain field samples at surface exposures of two basalt flows near the fracture map exposure. The correlation range of the field samples suggests that using the model semivariogram to test a field sampling plan in preparation for a field sampling event can provide useful information for defining a sampling scheme in heterogeneous fractured media.
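
    The workflow described above can be illustrated with a short sketch: compute an empirical semivariogram on a dense grid, thin the samples, and check whether the apparent correlation structure survives. The synthetic fracture-density field, lag bins and sample sizes below are assumptions, not the published fracture map.

```python
# Sketch: empirical semivariogram on a dense grid versus a thinned sample,
# to test whether a coarser spacing preserves the correlation structure.
# The "fracture density" field is synthetic.
import numpy as np

rng = np.random.default_rng(3)

def empirical_semivariogram(coords, values, lags):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs with separation in each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d > lo) & (d <= hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Synthetic 1 m grid with a correlation length of a few metres.
x, y = np.meshgrid(np.arange(20.0), np.arange(20.0))
coords = np.column_stack([x.ravel(), y.ravel()])
values = (np.sin(x / 5.0) + np.cos(y / 5.0) + rng.normal(scale=0.2, size=x.shape)).ravel()

lags = np.arange(0, 11, 1.0)
full = empirical_semivariogram(coords, values, lags)

keep = rng.choice(len(values), size=100, replace=False)       # emulate sparser field sampling
thin = empirical_semivariogram(coords[keep], values[keep], lags)
print("lag centres :", 0.5 * (lags[:-1] + lags[1:]))
print("full grid   :", full.round(3))
print("thinned grid:", thin.round(3))
```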

  8. Similarity landscapes: An improved method for scientific visualization of information from protein and DNA database searches

    SciTech Connect

    Dogget, N.; Myers, G.; Wills, C.J.

    1998-12-01

This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The authors have used computer simulations and examination of a variety of databases to address a wide range of evolutionary questions. They have found that there is a clear distinction in the evolution of HIV-1 and HIV-2, with the former, more virulent virus evolving more rapidly at a functional level. They have discovered highly non-random patterns in the evolution of HIV-1 that can be attributed to a variety of selective pressures. In the course of examining microsatellite DNA (short repeat regions) in microorganisms, the authors have found clear differences between prokaryotes and eukaryotes in its distribution, differences that can be tied to different selective pressures. They have developed a new method (topiary pruning) for enhancing the phylogenetic information contained in DNA sequences. Most recently, the authors have discovered effects in complex rainforest ecosystems that indicate strong frequency-dependent interactions between host species and their parasites, leading to the maintenance of ecosystem variability.

  9. A multi-method study of factors associated with hospital information system success in South Africa.

    PubMed

    Hanmer, Lyn A; Isaacs, Sedick; Roode, J Dewald

    2011-01-01

    A combination of interpretivist and positivist techniques was used to develop and refine a conceptual model of factors associated with computerised hospital information system (CHIS) success in South Africa. Data from three case studies of CHIS use in level 2 public sector hospitals were combined to develop a conceptual model containing seven factors associated with CHIS success at hospital level. This conceptual model formed the basis of a fourth case study which aimed to confirm and refine the initial conceptual model. In the third phase of the study, a survey of CHIS use was conducted in 30 hospitals across two South African provinces, each using one of three different CHISs. Relationships between hospital-level factors of the conceptual model and user assessment of CHIS success were examined. A revised conceptual model of CHIS use was developed on the basis of the survey results. The use of a multi-method approach made it possible to generalise results from the case studies to multiple CHIS implementations in two provinces. PMID:21893786

  10. A Comparative Study of Information-Based Source Number Estimation Methods and Experimental Validations on Mechanical Systems

    PubMed Central

    Cheng, Wei; Zhang, Zhousuo; Cao, Hongrui; He, Zhengjia; Zhu, Guanwen

    2014-01-01

This paper investigates one eigenvalue decomposition-based source number estimation method and three information-based source number estimation methods, namely the Akaike Information Criterion (AIC), Minimum Description Length (MDL) and Bayesian Information Criterion (BIC), and improves BIC into an Improved BIC (IBIC) that is more efficient and easier to calculate. The performances of the abovementioned source number estimation methods are studied comparatively with numerical case studies, which include a linear superposition case and a case with both linear superposition and nonlinear modulation mixing. A test bed with three sound sources is constructed to test the performance of these methods on mechanical systems, and source separation is carried out to validate the effectiveness of the experimental studies. This work can benefit model order selection, complexity analysis of a system, and applications of source separation to mechanical systems for condition monitoring and fault diagnosis purposes. PMID:24776935
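
    A minimal sketch of the eigenvalue-based information criteria (in the classical Wax-Kailath form) is given below: for each candidate source number k, the ratio of the geometric to the arithmetic mean of the smallest eigenvalues is penalized by the number of free parameters, and the minimizing k is the estimate. The simulated mixture is illustrative, and the paper's IBIC variant is not reproduced here.

```python
# Sketch of eigenvalue-based source-number estimation with the classical
# AIC and MDL criteria (Wax & Kailath); the mixing setup is synthetic.
import numpy as np

rng = np.random.default_rng(4)

def aic_mdl(eigvals, n_snapshots):
    """Return (AIC, MDL) curves over candidate source numbers k = 0..p-1."""
    lam = np.sort(eigvals)[::-1]
    p = len(lam)
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]
        log_ratio = np.log(tail).mean() - np.log(tail.mean())   # log(geometric/arithmetic mean)
        ll = n_snapshots * (p - k) * log_ratio                   # log-likelihood term
        aic.append(-2.0 * ll + 2.0 * k * (2 * p - k))
        mdl.append(-ll + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
    return np.array(aic), np.array(mdl)

# Three sources mixed linearly into eight channels plus noise.
n, p, q = 2000, 8, 3
sources = rng.normal(size=(n, q))
mixing = rng.normal(size=(q, p))
X = sources @ mixing + 0.3 * rng.normal(size=(n, p))

eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
aic, mdl = aic_mdl(eigvals, n)
print("AIC estimate:", int(np.argmin(aic)), " MDL estimate:", int(np.argmin(mdl)))
```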

  11. Theory-based analysis of clinical efficacy of triptans using receptor occupancy

    PubMed Central

    2014-01-01

Background Triptans, serotonin 5-HT1B/1D receptor agonists, exert their action by targeting serotonin 5-HT1B/1D receptors and are used for the treatment of migraine attacks. Presently, 5 different triptans, namely sumatriptan, zolmitriptan, eletriptan, rizatriptan, and naratriptan, are marketed in Japan. In the present study, we retrospectively analyzed the relationships between clinical efficacy (headache relief) in Japanese patients and 5-HT1B/1D receptor occupancy (Φ1B and Φ1D). Receptor occupancies were calculated from both the pharmacokinetic and pharmacodynamic data of the triptans. Methods To evaluate the total amount of exposure to drug, we calculated the area under the plasma concentration-time curve (AUCcp) and the areas under the time curves for Φ1B and Φ1D (AUCΦ1B and AUCΦ1D). Moreover, parameters expressing drug transfer and binding rates (Acp, AΦ1B, AΦ1D) were calculated. Results Our calculations showed that Φmax1B and Φmax1D were relatively high at 32.0-89.4% and 68.4-96.2%, respectively, suggesting that a high occupancy is likely necessary to attain the clinical effect. In addition, the relationships between therapeutic effect and AUCcp, AUCΦ1B, AUCΦ1D, and Acp·AUCcp differed with each drug and administered form, whereas a significant relationship was found between the therapeutic effect and AΦ1B·AUCΦ1B or AΦ1D·AUCΦ1D that was not affected by the drug or the form of administration. Conclusions These results suggest that receptor occupancy can be used as a common index to evaluate the therapeutic effect. We consider that the present findings provide useful information to support the proper use of triptans. PMID:25488888
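
    The occupancy-based quantities can be sketched with a simple binding model, assuming occupancy follows phi(t) = C(t) / (C(t) + Kd) and integrating both the concentration and the occupancy profiles over time. The one-compartment PK parameters and the Kd below are invented placeholders, not the triptan values analyzed in the paper.

```python
# Sketch: convert a plasma-concentration profile into receptor occupancy with
# a simple binding model, then integrate both over time (trapezoid rule).
# PK parameters and Kd are hypothetical, not the paper's triptan values.
import numpy as np

def auc(y, x):
    """Area under the curve by the trapezoid rule."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

t = np.linspace(0, 12, 200)                              # hours after dosing
ka, ke, dose_term = 1.5, 0.35, 50.0                      # one-compartment oral model (illustrative)
conc = dose_term * (np.exp(-ke * t) - np.exp(-ka * t))   # plasma concentration, ng/mL

kd = 20.0                                                # hypothetical affinity, ng/mL
occupancy = conc / (conc + kd)                           # fractional receptor occupancy phi(t)

auc_cp  = auc(conc, t)                                   # AUCcp analogue
auc_phi = auc(occupancy, t)                              # AUC of occupancy (AUC_phi analogue)
print(f"Cmax={conc.max():.1f} ng/mL  phi_max={occupancy.max():.2f}")
print(f"AUC_cp={auc_cp:.1f} ng*h/mL  AUC_phi={auc_phi:.2f} h")
```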

  12. The viscoplasticity theory based on overstress applied to the modeling of a nickel base superalloy at 815 C

    NASA Technical Reports Server (NTRS)

    Krempl, E.; Lu, H.; Yao, D.

    1988-01-01

    Short term strain rate change, creep and relaxation tests were performed in an MTS computer controlled servohydraulic testing machine. Aging and recovery were found to be insignificant for test times not exceeding 30 hrs. The material functions and constants of the theory were identified from results of strain rate change tests. Numerical integration of the theory for relaxation and creep tests showed good predictive capabilities of the viscoplasticity theory based on overstress.
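
    For context, a hedged uniaxial sketch of the flow law that gives the viscoplasticity theory based on overstress its name is given below; the specific material functions and constants identified from the strain rate change tests are not reproduced here.

      \dot{\varepsilon} = \dot{\varepsilon}^{\mathrm{el}} + \dot{\varepsilon}^{\mathrm{in}}
                        = \frac{\dot{\sigma}}{E} + \frac{\sigma - g}{E\,k[\Gamma]},
      \qquad
      \Gamma = \lvert \sigma - g \rvert

    Here E is the elastic modulus, g the equilibrium stress, Γ the overstress, and k[Γ] the viscosity function.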

  13. The role of local observations as evidence to inform effective mitigation methods for flood risk management

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; O'Donnell, Greg; Owen, Gareth

    2014-05-01

    This poster presents a case study that highlights two crucial aspects of a catchment-based flood management project that were used to encourage uptake of an effective flood management strategy: specifically, (1) the role of detailed local-scale observations and (2) a modelling method informed by these observations. Within a 6 km2 study catchment (Belford, UK), a number of Runoff Attenuation Features (RAFs) have been constructed (including ponds, wetlands and woody debris structures) to address flooding issues in the downstream village. The storage capacity of the RAFs is typically small (200 to 500 m3), hence there was skepticism as to whether they would work during large flood events. Monitoring was performed using a dense network of water level recorders installed both within the RAFs and within the stream network. Using adjacent upstream and downstream water levels in the stream network and observations within the actual ponds, a detailed understanding of the local performance of the RAFs was gained. However, despite understanding the local impacts of the features, the impact on the downstream hydrograph at the catchment scale could still not be ascertained with any certainty. The local observations revealed that the RAFs typically filled on the rising limb of the hydrograph; hence there was no available storage at the time of arrival of a large flow peak. However, it was also clear that an impact on the rising limb of the hydrograph was being observed. This knowledge of the functioning of individual features was used to create a catchment model, in which a network of RAFs could then be configured to examine the aggregated impacts. This Pond Network Model (PNM) was based on the observed local physical relationships and allowed a user-specified sequence of ponds to be configured into a cascade structure. It was found that there was a minimum number of RAFs needed before an impact on peak flow was achieved for a large flood event. The number of RAFs required in the
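
    To make the cascade idea concrete, here is a minimal sketch, under assumed storage-discharge behaviour, of routing an inflow hydrograph through a chain of identical attenuation features; the parameter values and function names are illustrative and are not taken from the Pond Network Model itself.

      import numpy as np

      def route_pond_cascade(inflow, n_ponds=10, capacity=400.0, k_out=0.15, dt=3600.0):
          """Route an inflow hydrograph (m3/s, one value per time step of length dt
          seconds) through a cascade of identical attenuation features.
          Each feature stores water up to `capacity` (m3), releases the fraction
          `k_out` of its storage per step through a slow outlet, and spills any
          excess straight downstream. Returns the outflow hydrograph (m3/s)."""
          storages = np.zeros(n_ponds)
          outflow = np.zeros(len(inflow))
          for t, q_in in enumerate(inflow):
              volume = q_in * dt                              # volume arriving at the top feature
              for i in range(n_ponds):
                  storages[i] += volume
                  spill = max(storages[i] - capacity, 0.0)    # overflow once the feature is full
                  storages[i] -= spill
                  release = k_out * storages[i]               # slow drawdown through the outlet
                  storages[i] -= release
                  volume = spill + release                    # passed on to the next feature
              outflow[t] = volume / dt
          return outflow

    Running such a routine for an increasing number of features is one way to reproduce the qualitative finding that a minimum number of RAFs is needed before the downstream peak is affected.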

  14. Knowledge-based Method for Determining the Meaning of Ambiguous Biomedical Terms Using Information Content Measures of Similarity

    PubMed Central

    McInnes, Bridget T.; Pedersen, Ted; Liu, Ying; Melton, Genevieve B.; Pakhomov, Serguei V.

    2011-01-01

    In this paper, we introduce a novel knowledge-based word sense disambiguation method that determines the sense of an ambiguous word in biomedical text using semantic similarity or relatedness measures. These measures quantify the degree of similarity between concepts in the Unified Medical Language System (UMLS). The objective of this work was to develop a method that can disambiguate terms in biomedical text by exploiting similarity information extracted from the UMLS and to evaluate the efficacy of information content-based semantic similarity measures, which augment path-based information with probabilities derived from biomedical corpora. We show that information content-based measures obtain a higher disambiguation accuracy than path-based measures because they weight the path based on where it exists in the taxonomy coupled with the probability of the concepts occurring in a corpus of text. PMID:22195148
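
    The following is a minimal sketch of an information content-based similarity measure of the kind described above (a Resnik-style score over a concept taxonomy); the data structures are illustrative and do not use the UMLS or the authors' tooling.

      import math

      def information_content(concept, freq, total):
          """IC(c) = -log P(c), with P(c) estimated from corpus concept counts."""
          return -math.log(freq.get(concept, 1) / total)

      def resnik_similarity(c1, c2, ancestors, freq, total):
          """Resnik-style similarity: the information content of the most
          informative ancestor shared by the two concepts.
          `ancestors` maps each concept to the set of its taxonomy ancestors
          (including itself); `freq` holds corpus counts; `total` is the corpus size."""
          common = ancestors[c1] & ancestors[c2]
          if not common:
              return 0.0
          return max(information_content(c, freq, total) for c in common)

    In a word sense disambiguation setting, the candidate sense that maximises such a similarity score against the concepts in the surrounding text would be selected.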

  15. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
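
    As a purely hypothetical illustration of the idea (not the patented implementation or any specific data-formatting library), one could attach a user-supplied semantic description to each sub-file as a sidecar record at write time:

      import json
      from pathlib import Path

      def write_subfile_with_semantics(directory, index, payload, semantics):
          """Write one sub-file plus a sidecar JSON description of its contents.
          `semantics` is a dict describing the data, e.g. variable names, units,
          and the semantically meaningful boundary this sub-file covers."""
          directory = Path(directory)
          directory.mkdir(parents=True, exist_ok=True)
          (directory / f"part-{index:05d}.bin").write_bytes(payload)
          (directory / f"part-{index:05d}.meta.json").write_text(json.dumps(semantics, indent=2))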

  16. Integrating Safety Assessment Methods using the Risk Informed Safety Margins Characterization (RISMC) Approach

    SciTech Connect

    Curtis Smith; Diego Mandelli

    2013-03-01

    Safety is central to the design, licensing, operation, and economics of nuclear power plants (NPPs). As the current light water reactor (LWR) NPPs age beyond 60 years, there are possibilities for increased frequency of systems, structures, and components (SSC) degradations or failures that initiate safety significant events, reduce existing accident mitigation capabilities, or create new failure modes. Plant designers commonly “over-design” portions of NPPs and provide robustness in the form of redundant and diverse engineered safety features to ensure that, even in the case of well-beyond design basis scenarios, public health and safety will be protected with a very high degree of assurance. This form of defense-in-depth is a reasoned response to uncertainties and is often referred to generically as “safety margin.” Historically, specific safety margin provisions have been formulated primarily based on engineering judgment backed by a set of conservative engineering calculations. The ability to better characterize and quantify safety margin is important to improved decision making about LWR design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development (R&D) in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the RISMC Pathway provides methods and tools that enable mitigation options known as margins management strategies. The purpose of the RISMC Pathway R&D is to support plant decisions for risk-informed

  17. Theory-Based Behavioral Intervention Increases Self-Reported Physical Activity in South African Men: A Cluster-Randomized Controlled Trial

    PubMed Central

    Jemmott, John B.; Jemmott, Loretta S.; Ngwane, Zolani; Zhang, Jingwen; Heeren, G. Anita; Icard, Larry D.; O’Leary, Ann; Mtose, Xoliswa; Teitelman, Anne; Carty, Craig

    2014-01-01

    Objective To determine whether a health-promotion intervention increases South African men’s adherence to physical-activity guidelines. Method We utilized a cluster-randomized controlled trial design. Eligible clusters, residential neighborhoods near East London, South Africa, were matched in pairs. Within randomly selected pairs, neighborhoods were randomized to theory-based, culturally congruent health-promotion intervention encouraging physical activity or attention-matched HIV/STI risk-reduction control intervention. Men residing in the neighborhoods and reporting coitus in the previous 3 months were eligible. Primary outcome was self-reported individual-level adherence to physical-activity guidelines averaged over 6-month and 12-month post-intervention assessments. Data were collected in 2007–2010. Data collectors, but not facilitators or participants, were blind to group assignment. Results Primary outcome intention-to-treat analysis included 22 of 22 clusters and 537 of 572 men in the health-promotion intervention and 22 of 22 clusters and 569 of 609 men in the attention-control intervention. Model-estimated probability of meeting physical-activity guidelines was 51.0% in the health-promotion intervention and 44.7% in attention-matched control (OR = 1.34; 95% CI, 1.09–1.63), adjusting for baseline prevalence and clustering from 44 neighborhoods. Conclusion A theory-based culturally congruent intervention increased South African men’s self-reported physical activity, a key contributor to deaths from non-communicable diseases in South Africa. Trial registration ClinicalTrials.gov Identifier: NCT01490359. PMID:24736094

  18. Development of a Simple 12-Item Theory-Based Instrument to Assess the Impact of Continuing Professional Development on Clinical Behavioral Intentions

    PubMed Central

    Légaré, France; Borduas, Francine; Freitas, Adriana; Jacques, André; Godin, Gaston; Luconi, Francesca; Grimshaw, Jeremy

    2014-01-01

    Background Decision-makers in organizations providing continuing professional development (CPD) have identified the need for routine assessment of its impact on practice. We sought to develop a theory-based instrument for evaluating the impact of CPD activities on health professionals' clinical behavioral intentions. Methods and Findings Our multipronged study had four phases. 1) We systematically reviewed the literature for instruments that used socio-cognitive theories to assess healthcare professionals' clinically-oriented behavioral intentions and/or behaviors; we extracted items relating to the theoretical constructs of an integrated model of healthcare professionals' behaviors and removed duplicates. 2) A committee of researchers and CPD decision-makers selected a pool of items relevant to CPD. 3) An international group of experts (n = 70) reached consensus on the most relevant items using electronic Delphi surveys. 4) We created a preliminary instrument with the items found most relevant and assessed its factorial validity, internal consistency and reliability (weighted kappa) over a two-week period among 138 physicians attending a CPD activity. Out of 72 potentially relevant instruments, 47 were analyzed. Of the 1218 items extracted from these, 16% were discarded as improperly phrased and 70% discarded as duplicates. Mapping the remaining items onto the constructs of the integrated model of healthcare professionals' behaviors yielded a minimum of 18 and a maximum of 275 items per construct. The partnership committee retained 61 items covering all seven constructs. Two iterations of the Delphi process produced consensus on a provisional 40-item questionnaire. Exploratory factorial analysis following test-retest resulted in a 12-item questionnaire. Cronbach's coefficients for the constructs varied from 0.77 to 0.85. Conclusion A 12-item theory-based instrument for assessing the impact of CPD activities on health professionals' clinical behavioral

  19. Professional Identity Development among Graduate Library and Information Studies Online Learners: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Croxton, Rebecca A.

    2015-01-01

    This study explores how factors relating to fully online Master of Library and Information Studies (MLIS) students' connectedness with peers and faculty may impact their professional identity development as library and information studies professionals. Participants include students enrolled in a fully online MLIS degree program in the…

  20. Using Pop Culture to Teach Information Literacy: Methods to Engage a New Generation

    ERIC Educational Resources Information Center

    Behen, Linda D.

    2006-01-01

    Building on the information needs and the learning style preferences of today's high school students, the author builds a case for using pop culture (TV shows, fads, and current technology) to build integrated information skills lessons for students. Chapters include a rationale, a review of the current literature, and examples of units of study…

  1. Emerging Information Literacy and Research-Method Competencies in Urban Community College Psychology Students

    ERIC Educational Resources Information Center

    Wolfe, Kate S.

    2015-01-01

    This article details an assignment developed to teach students at urban community colleges information-literacy skills. This annotated bibliography assignment introduces students to library research skills, helps increase information literacy in beginning college students, and helps psychology students learn research methodology crucial in…

  2. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  3. Accidental Discovery of Information on the User-Defined Social Web: A Mixed-Method Study

    ERIC Educational Resources Information Center

    Lu, Chi-Jung

    2012-01-01

    Frequently interacting with other people or working in an information-rich environment can foster the "accidental discovery of information" (ADI) (Erdelez, 2000; McCay-Peet & Toms, 2010). With the increasing adoption of social web technologies, online user-participation communities and user-generated content have provided users the…

  4. Consumer Health Information Behavior in Public Libraries: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Yi, Yong Jeong

    2012-01-01

    Previous studies indicated inadequate health literacy of American adults as one of the biggest challenges for consumer health information services provided in public libraries. Little attention, however, has been paid to public users' health literacy and health information behaviors. In order to bridge the research gap, the study aims to…

  5. A Guide to Information Tools, Methods, and Resources in Science and Engineering.

    ERIC Educational Resources Information Center

    Herner, Saul

    This guide is a recapitulation of the substantive content of a one-and-a-half day course which was given before three groups of Federal scientists and engineers. The purpose of the course was to train and inform working scientists and engineers as to the most direct and efficient means of seeking and acquiring information. A problem-solving…

  6. The Routines-Based Interview: A Method for Gathering Information and Assessing Needs

    ERIC Educational Resources Information Center

    McWilliam, R. A.; Casey, Amy M.; Sims, Jessica

    2009-01-01

    There are multiple ways to gather information from families receiving early intervention services (J. J. Woods & D. P. Lindeman, 2008). In this article, we discuss a specific strategy for doing this through information-gathering conversations with families. The routines-based interview (RBI; R. A. McWilliam, 1992, 2005a) was developed to meet a…

  7. The Swedish strategy and method for development of a national healthcare information architecture.

    PubMed

    Rosenälv, Jessica; Lundell, Karl-Henrik

    2012-01-01

    "We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted the National e-Health - the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for development of National Healthcare Information Architecture is to achieve high level semantic interoperability for clinical content and clinical contexts. High level semantic interoperability requires consistently structured clinical data and other types of data with coherent traceability to be mapped to reference clinical models. Archetypes that are formal definitions of the clinical and demographic concepts and some administrative data were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. Generic clinical process model was made concrete and analyzed. For each decision

  8. Scenario-based design: A method for connecting information system design with public health operations and emergency management

    PubMed Central

    Reeder, Blaine; Turner, Anne M

    2011-01-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Methods: Using semi-structured interviews we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Results: Interview analysis identified twenty-five information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create twenty-five scenarios of use and a public health manager persona. Scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. Conclusion: The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and to validate an information system design. PMID:21807120

  9. 77 FR 24684 - Proposed Information Collection; Comment Request; 2013-2015 American Community Survey Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... Methods Panel Testing AGENCY: U.S. Census Bureau. ACTION: Notice. SUMMARY: The Department of Commerce, as... materials. The ACS Methods Panel is a research program that is designed to address and respond to survey issues and needs. During the 2013-2015 period, the Methods Panel may include testing methods...

  10. Development of a Theory-Based Intervention to Increase Prescription of Inspiratory Muscle Training by Health Professionals in the Management of People with Chronic Obstructive Pulmonary Disease

    PubMed Central

    Li, Linda C.; Reid, W. Darlene

    2011-01-01

    ABSTRACT Purpose: The purpose of this paper is twofold: (1) to provide an overview of the literature on barriers to evidence-based practice (EBP) and the effectiveness of implementation interventions in health care; and (2) to outline the development of an implementation intervention for improving the prescription of inspiratory muscle training (IMT) by physical therapists and other health professionals for people with chronic obstructive pulmonary disease (COPD). Summary of Key Points: Individuals, organizations, and the research itself present barriers to EBP in physical therapy. Despite the evidence supporting the use of IMT, this treatment continues to be under-used in managing COPD. Current health services research shows that traditional information-based approaches to implementation, such as didactic lectures, do not adequately address the challenges health professionals face when trying to make changes in practice. We propose the development of a theory-based intervention to improve health professionals' use of IMT in the management of COPD. It is postulated that a behavioural intervention, based on the theory of planned behaviour (TPB), may be more effective than an information-based strategy in increasing the prescription of IMT by health professionals. Conclusion: TPB may be used to understand the antecedents of health professionals' behaviour and to guide the development of implementation interventions. Further research is needed to evaluate the effectiveness of this proposed intervention in the management of people with COPD. PMID:22654237

  11. Strategies and methods for aligning current and best medical practices. The role of information technologies.

    PubMed Central

    Schneider, E C; Eisenberg, J M

    1998-01-01

    Rapid change in American medicine requires that physicians adjust established behaviors and acquire new skills. In this article, we address three questions: What do we know about how to change physicians' practices? How can physicians take advantage of new and evolving information technologies that are likely to have an impact on the future practice of medicine? and What strategic educational interventions will best enable physicians to show competencies in information management and readiness to change practice? We outline four guiding principles for incorporating information systems tools into both medical education and practice, and we make eight recommendations for the development of a new medical school curriculum. This curriculum will produce a future medical practitioner who is capable of using information technologies to systematically measure practice performance, appropriateness, and effectiveness while updating knowledge efficiently. PMID:9614787

  12. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business activities in the enterprise are so closely tied to the information system that they are difficult to carry out without it. A system design technique is therefore needed that takes the business process properly into account and enables quick system development, while demands on development cost are also more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology that models business activities as business processes and visualizes them to improve business efficiency. However, no general methodology exists for developing an information system from the results of BPM analysis, and only a few development cases have been reported. This paper proposes an information system development method that combines business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience shows that the methodology is valuable.

  13. 75 FR 8817 - Annual Submission of Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... Shortfall Allocation Method ACTION: Final rule. SUMMARY: The Surface Transportation Board (Board) is... Allocation Method (RSAM). RSAM is one of three benchmarks that together are used to determine the... Method, STB Ex Parte No. 646 (Sub-No. 2) (STB served May 11, 2009) (RSAM Taxes). Specifically,...

  14. A Theory-Based Framework for Assessing Domain-Specific Problem-Solving Ability.

    ERIC Educational Resources Information Center

    Sugrue, Brenda

    1995-01-01

    A more fragmented approach to assessment of global ability concepts than is generally advocated is suggested, based on the assumption that decomposing a complex ability into cognitive components and tracking performance across multiple measures will yield valid and instructionally useful information. Specifications are suggested for designing…

  15. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. Conclusions We

  16. 30 CFR 48.23 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Training plans; time of submission; where filed; information required; time for approval; method for disapproval; commencement of training; approval of instructors. 48.23 Section 48.23 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING TRAINING...

  17. 30 CFR 48.3 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Training plans; time of submission; where filed; information required; time for approval; method for disapproval; commencement of training; approval of instructors. 48.3 Section 48.3 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING TRAINING...

  18. 30 CFR 48.23 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Training plans; time of submission; where filed; information required; time for approval; method for disapproval; commencement of training; approval of instructors. 48.23 Section 48.23 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING TRAINING...

  19. 30 CFR 48.3 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Training plans; time of submission; where filed; information required; time for approval; method for disapproval; commencement of training; approval of instructors. 48.3 Section 48.3 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING TRAINING...

  20. Following Experts at Work in Their Own Information Spaces: Using Observational Methods To Develop Tools for the Digital Library.

    ERIC Educational Resources Information Center

    Gorman, Paul; Lavelle, Mary; Delcambre, Lois; Maier, David

    2002-01-01

    Offers an overview of the authors' experience using several observational methods to better understand one class of users, expert clinicians treating patients in hospital settings. Shows the evolution of understanding of the users and their information-handling tasks based on observations made in the field by a multidisciplinary research team, and…

  1. Sexual Health Information Seeking Online: A Mixed-Methods Study among Lesbian, Gay, Bisexual, and Transgender Young People

    ERIC Educational Resources Information Center

    Magee, Joshua C.; Bigelow, Louisa; DeHaan, Samantha; Mustanski, Brian S.

    2012-01-01

    The current study used a mixed-methods approach to investigate the positive and negative aspects of Internet use for sexual health information among lesbian, gay, bisexual, and transgender (LGBT) young people. A diverse community sample of 32 LGBT young people (aged 16-24 years) completed qualitative interviews focusing on how, where, and when…

  2. 14 CFR 39.21 - Where can I get information about FAA-approved alternative methods of compliance?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Where can I get information about FAA-approved alternative methods of compliance? 39.21 Section 39.21 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS DIRECTIVES § 39.21 Where can I...

  3. Extending Value of Information Methods to Include the Co-Net Benefits of Earth Observations

    NASA Astrophysics Data System (ADS)

    Macauley, M.

    2015-12-01

    The widening relevance of Earth observations information across the spectrum of natural and environmental resources markedly enhances the value of these observations. An example is observations of forest extent, species composition, health, and change; this information can help in assessing carbon sequestration, biodiversity and habitat, watershed management, fuelwood potential, and other ecosystem services, as well as inform the opportunity cost of forest removal for alternative land uses such as agriculture, pasture, or development. These "stacked" indicators, or co-net benefits, add significant value to Earth observations. In part because of its reliance on case studies, much previous research on the value of information from Earth observations has assessed individual applications rather than aggregating across applications, thus tending to undervalue the observations. Aggregating across applications is difficult, however, because it requires common units of measurement; controlling for the spatial, spectral, and temporal attributes of the observations; and consistent application of value-of-information techniques. This paper will discuss general principles of co-net benefit aggregation and illustrate its application to attributing value to Earth observations.

  4. Novel classification method for remote sensing images based on information entropy discretization algorithm and vector space model

    NASA Astrophysics Data System (ADS)

    Xie, Li; Li, Guangyao; Xiao, Mang; Peng, Lei

    2016-04-01

    Various kinds of remote sensing image classification algorithms have been developed to adapt to the rapid growth of remote sensing data. Conventional methods typically have restrictions in either classification accuracy or computational efficiency. Aiming to overcome the difficulties, a new solution for remote sensing image classification is presented in this study. A discretization algorithm based on information entropy is applied to extract features from the data set and a vector space model (VSM) method is employed as the feature representation algorithm. Because of the simple structure of the feature space, the training rate is accelerated. The performance of the proposed method is compared with two other algorithms: back propagation neural networks (BPNN) method and ant colony optimization (ACO) method. Experimental results confirm that the proposed method is superior to the other algorithms in terms of classification accuracy and computational efficiency.
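
    The discretization step can be illustrated with a minimal sketch of an entropy-minimising cut on a single feature; this is a simplified stand-in for the information entropy discretization algorithm named above, and the function names are illustrative.

      import numpy as np

      def entropy(labels):
          """Shannon entropy (bits) of an array of class labels."""
          _, counts = np.unique(labels, return_counts=True)
          p = counts / counts.sum()
          return -np.sum(p * np.log2(p))

      def best_entropy_cut(feature, labels):
          """Choose the cut point on one band/feature that minimises the weighted
          class entropy of the resulting two-interval discretization."""
          order = np.argsort(feature)
          feature, labels = np.asarray(feature)[order], np.asarray(labels)[order]
          best_cut, best_h = None, np.inf
          for i in range(1, len(feature)):
              if feature[i] == feature[i - 1]:
                  continue
              cut = 0.5 * (feature[i] + feature[i - 1])
              left, right = labels[:i], labels[i:]
              h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
              if h < best_h:
                  best_cut, best_h = cut, h
          return best_cut, best_h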

  5. 3D Encoding of Musical Score Information and the Playback Method Used by the Cellular Phone

    NASA Astrophysics Data System (ADS)

    Kubo, Hitoshi; Sugiura, Akihiko

    Recently, 3G cellular phones capable of shooting movies have spread as their digital camera functions have improved, and 2D codes, which offer accurate readout and high operability, have spread as a means of transmitting information. However, the symbol becomes larger and more complicated as the amount of information in a 2D code increases. 3D codes have been proposed to solve this, but they require special readout equipment and are specialized for augmented reality applications, so they are difficult to apply to the cellular phone. We therefore propose a 3D code that can be recognized with the movie shooting function of a cellular phone and use it to encode musical score information. We apply a Gray code to the properties of the music when encoding, and the effectiveness of the approach was verified.
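
    Gray coding itself is standard; a minimal sketch of binary-reflected Gray encoding and decoding, of the kind that could be applied to numeric note properties, is shown below (the paper's specific mapping from musical score information to code symbols is not reproduced).

      def to_gray(n: int) -> int:
          """Binary-reflected Gray code of a non-negative integer."""
          return n ^ (n >> 1)

      def from_gray(g: int) -> int:
          """Invert the binary-reflected Gray code by cascading XORs of shifted values."""
          n = g
          while g:
              g >>= 1
              n ^= g
          return n

    Successive integers differ in only one bit after Gray coding, which limits the effect of a single misread bit on the decoded value.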

  6. A Method of Conducting Controlled Testing of Hospital Information System Components

    PubMed Central

    Rusnak, James E.

    1981-01-01

    One of the greatest areas of difficulty in the acquisition and installation of Hospital Information Systems is the system testing and validation. These systems are complex, consisting of a large number of program modules. These program modules tend to have a high degree of dependency upon one another and the system's data files. The real-time nature of Hospital Information Systems, and the types of applications supported by these systems, introduces new problems in testing and validation of the system's programs. New techniques are required to replace those used in testing batch oriented systems. Through the use of a controlled testing methodology, a Hospital Information System can be tested and validated with a limited risk factor of failure after the system is installed. The quality of testing is directly reflected in the reliability of the installed system and its acceptance in the hospital environment by the affected department's staff.

  7. Game Theory Based Security in Wireless Body Area Network with Stackelberg Security Equilibrium.

    PubMed

    Somasundaram, M; Sivakumar, R

    2015-01-01

    Wireless Body Area Networks (WBANs) are used effectively in healthcare to increase the quality of patients' lives and the value of healthcare services. The biosensor-based approach in medical care systems makes it difficult to respond to patients within a minimal response time. The medical care unit does not have full-time access to ubiquitous broadband connections, and hence the level of security is not always high. Security issues also arise in monitoring records of the user's body functions, and most systems on the Wireless Body Area Network are not effective in facing security deployment issues. To access the patient's information with higher security on the WBAN, Game Theory with Stackelberg Security Equilibrium (GTSSE) is proposed in this paper. The GTSSE mechanism takes all the players into account. The patients are monitored by initially placing the power position authority; the position authority in GTSSE is the organizer, and all the other players react to the organizer's decision. Based on the proposed approach, experiments were conducted on factors such as the security ratio based on the patient's health information, system flexibility level, energy consumption rate, and information loss rate. Stackelberg Security considerably improves the strength of the solution with higher security. PMID:26759829

  8. Game Theory Based Security in Wireless Body Area Network with Stackelberg Security Equilibrium

    PubMed Central

    Somasundaram, M.; Sivakumar, R.

    2015-01-01

    Wireless Body Area Networks (WBANs) are used effectively in healthcare to increase the quality of patients' lives and the value of healthcare services. The biosensor-based approach in medical care systems makes it difficult to respond to patients within a minimal response time. The medical care unit does not have full-time access to ubiquitous broadband connections, and hence the level of security is not always high. Security issues also arise in monitoring records of the user's body functions, and most systems on the Wireless Body Area Network are not effective in facing security deployment issues. To access the patient's information with higher security on the WBAN, Game Theory with Stackelberg Security Equilibrium (GTSSE) is proposed in this paper. The GTSSE mechanism takes all the players into account. The patients are monitored by initially placing the power position authority; the position authority in GTSSE is the organizer, and all the other players react to the organizer's decision. Based on the proposed approach, experiments were conducted on factors such as the security ratio based on the patient's health information, system flexibility level, energy consumption rate, and information loss rate. Stackelberg Security considerably improves the strength of the solution with higher security. PMID:26759829
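
    To make the leader-follower structure concrete, the following is a toy sketch of computing a Stackelberg equilibrium of a finite game by enumeration; the payoff matrices and the single-best-response assumption are illustrative and are not drawn from the GTSSE scheme.

      import numpy as np

      def stackelberg_by_enumeration(leader_payoff, follower_payoff):
          """Find a pure-strategy Stackelberg equilibrium of a finite two-player game.
          Both inputs are 2-D arrays indexed as [leader_action, follower_action].
          The leader commits first, the follower best-responds, and the leader keeps
          the commitment whose induced response maximises its own payoff.
          Returns (leader_action, follower_action)."""
          leader_payoff = np.asarray(leader_payoff, dtype=float)
          follower_payoff = np.asarray(follower_payoff, dtype=float)
          best = None
          for a in range(leader_payoff.shape[0]):
              br = int(np.argmax(follower_payoff[a]))   # follower's best response to commitment a
              value = leader_payoff[a, br]
              if best is None or value > best[0]:
                  best = (value, a, br)
          return best[1], best[2]

    In the GTSSE setting described above, the organizer (position authority) plays the leader role and the remaining players respond to its committed strategy.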

  9. Methods of extending signatures and training without ground information. [data processing, pattern recognition

    NASA Technical Reports Server (NTRS)

    Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.

    1975-01-01

    Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large area wheat surveys. Two methods were developed: ASC and MASC. Two further methods, Ratio and RADIFF, previously used with aircraft data, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were also undertaken.

  10. Transforming Elementary Science Teacher Education by Bridging Formal and Informal Science Education in an Innovative Science Methods Course

    NASA Astrophysics Data System (ADS)

    Riedinger, Kelly; Marbach-Ad, Gili; Randy McGinnis, J.; Hestness, Emily; Pease, Rebecca

    2011-02-01

    We investigated curricular and pedagogical innovations in an undergraduate science methods course for elementary education majors at the University of Maryland. The goals of the innovative elementary science methods course included improving students' attitudes toward and views of science and science teaching, modelling innovative science teaching methods, and encouraging students to continue in teacher education. We redesigned the elementary science methods course to include aspects of informal science education. The informal science education course features included informal science educator guest speakers, a live animal demonstration and a virtual field trip. We compared data from a treatment course (n = 72) and a comparison course (n = 26). Data collection included researchers' observations, instructors' reflections, and teacher candidates' feedback. Teacher candidate feedback involved interviews and results on a reliable and valid Attitudes and Beliefs about the Nature of and the Teaching of Science instrument. We used complementary methods to analyze the data collected. A key finding of the study was that, while benefits were found in both types of courses, the difference in results underscores the need to identify the primary purpose for innovation as a vital consideration.

  11. Bootstrap rank-ordered conditional mutual information (broCMI): A nonlinear input variable selection method for water resources modeling

    NASA Astrophysics Data System (ADS)

    Quilty, John; Adamowski, Jan; Khalil, Bahaa; Rathinasamy, Maheswaran

    2016-03-01

    The input variable selection problem has recently garnered much interest in the time series modeling community, especially within water resources applications, demonstrating that information theoretic (nonlinear)-based input variable selection algorithms such as partial mutual information (PMI) selection (PMIS) provide an improved representation of the modeled process when compared to linear alternatives such as partial correlation input selection (PCIS). PMIS is a popular algorithm for water resources modeling problems considering nonlinear input variable selection; however, this method requires the specification of two nonlinear regression models, each with parametric settings that greatly influence the selected input variables. Other attempts to develop input variable selection methods using conditional mutual information (CMI) (an analog to PMI) have been formulated under different parametric pretenses such as k nearest-neighbor (KNN) statistics or kernel density estimates (KDE). In this paper, we introduce a new input variable selection method based on CMI that uses a nonparametric multivariate continuous probability estimator based on Edgeworth approximations (EA). We improve the EA method by considering the uncertainty in the input variable selection procedure by introducing a bootstrap resampling procedure that uses rank statistics to order the selected input sets; we name our proposed method bootstrap rank-ordered CMI (broCMI). We demonstrate the superior performance of broCMI when compared to CMI-based alternatives (EA, KDE, and KNN), PMIS, and PCIS input variable selection algorithms on a set of seven synthetic test problems and a real-world urban water demand (UWD) forecasting experiment in Ottawa, Canada.
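
    For readers unfamiliar with the selection criterion, the sketch below estimates the conditional mutual information I(X; Y | Z) from binned samples; this plug-in histogram estimator is a simple stand-in for the Edgeworth approximation and bootstrap rank-ordering proposed in the paper, and the binning choice is illustrative.

      import numpy as np

      def conditional_mutual_information(x, y, z, bins=10):
          """Plug-in estimate of I(X; Y | Z) in bits from samples, using equal-width binning."""
          xb = np.digitize(x, np.histogram_bin_edges(x, bins))
          yb = np.digitize(y, np.histogram_bin_edges(y, bins))
          zb = np.digitize(z, np.histogram_bin_edges(z, bins))

          def entropy(*cols):
              """Joint Shannon entropy (bits) of one or more discrete columns."""
              _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
              p = counts / counts.sum()
              return -np.sum(p * np.log2(p))

          # Identity: I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
          return entropy(xb, zb) + entropy(yb, zb) - entropy(xb, yb, zb) - entropy(zb)

    Candidate inputs would then be ranked by their estimated CMI with the output, conditioned on the inputs already selected.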

  12. Evaluation of a Noise-Robust Multi-Stream Speaker Verification Method Using F0 Information

    NASA Astrophysics Data System (ADS)

    Asami, Taichi; Iwano, Koji; Furui, Sadaoki

    We have previously proposed a noise-robust speaker verification method using fundamental frequency (F0) extracted using the Hough transform. The method also incorporates an automatic stream-weight and decision threshold estimation technique. It has been confirmed that the proposed method is effective for white noise at various SNR conditions. This paper evaluates the proposed method in more practical in-car and elevator-hall noise conditions. The paper first describes the noise-robust F0 extraction method and details of our robust speaker verification method using multi-stream HMMs for integrating the extracted F0 and cepstral features. Details of the automatic stream-weight and threshold estimation method for multi-stream speaker verification framework are also explained. This method simultaneously optimizes stream-weights and a decision threshold by combining the linear discriminant analysis (LDA) and the Adaboost technique. Experiments were conducted using Japanese connected digit speech contaminated by white, in-car, or elevator-hall noise at various SNRs. Experimental results show that the F0 features improve the verification performance in various noisy environments, and that our stream-weight and threshold optimization method effectively estimates control parameters so that FARs and FRRs are adjusted to achieve equal error rates (EERs) under various noisy conditions.
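
    The multi-stream decision itself reduces to a weighted score fusion followed by a threshold test, as in the minimal sketch below; the stream weights and threshold shown are placeholders for the values that the paper estimates automatically with LDA and AdaBoost.

      def fuse_stream_scores(loglik_cep, loglik_f0, w_cep=0.7, w_f0=0.3):
          """Weighted combination of per-stream log-likelihood ratios
          (claimant model vs. background model) for the cepstral and F0 streams.
          The weights are assumed to sum to one."""
          return w_cep * loglik_cep + w_f0 * loglik_f0

      def accept_claim(fused_score, threshold=0.0):
          """Accept the identity claim when the fused score exceeds the decision threshold."""
          return fused_score >= threshold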

  13. Patent information analysis methods and their effective use : A study through activities of PAT-LIST Research Workshop adviser

    NASA Astrophysics Data System (ADS)

    Nakamura, Sakae

    For effective use of technical information, various analytical tools and methods (e.g., patent map analysis) have been proposed. It was against this background that the “PAT-LIST Research Workshop” (supported by Raytec Co., Ltd.) was established in 2006. This article discusses, as an example, actual research subjects that the author, as an adviser to the workshop, has studied through its activities over the past six years, in particular the subject for 2010 (unveiling the intellectual property strategies of specific enterprises from the results of technical information analysis). Practically useful analysis methods are proposed, together with points to note when applying them. Macroanalysis using text mining tools is also introduced, as is the significance of controlled technical classification in a problem/solution map for identifying critical fields.

  14. Applying Information-Retrieval Methods to Software Reuse: A Case Study.

    ERIC Educational Resources Information Center

    Stierna, Eric J.; Rowe, Neil C.

    2003-01-01

    Discusses reuse of existing software for new purposes as a key aspect of efficient software engineering by matching formal written requirements used to define the new and the old software. Explores two matching methodologies that use information retrieval techniques and describes test results from a comparison of two military systems. (Author/LRW)

  15. 42 CFR 423.888 - Payment methods, including provision of necessary information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... historical data and generally accepted actuarial principles) of the difference between such gross costs and... necessary information. (a) Basis. The provisions of § 423.301 through § 423.343, including requirements to... annual basis, as elected by the plansponsor under guidance specified by CMS, unless CMS determines...

  16. Query Methods in Information Retrieval--Criteria for Selection and Application.

    ERIC Educational Resources Information Center

    Goldenson, A.F.; Cardwell, D.W.

    This report studies and classifies, according to distinct user-oriented features, the various computer-aided systems developed for storage and retrieval of information in a wide range of fields. Such features are described and evaluated to determine the nature of characteristics that have a strong bearing on the relative success of various query…

  17. Graph-Based Weakly-Supervised Methods for Information Extraction & Integration

    ERIC Educational Resources Information Center

    Talukdar, Partha Pratim

    2010-01-01

    The variety and complexity of potentially-related data resources available for querying--webpages, databases, data warehouses--has been growing ever more rapidly. There is a growing need to pose integrative queries "across" multiple such sources, exploiting foreign keys and other means of interlinking data to merge information from diverse…

  18. Pathfinding in the Research Forest: The Pearl Harvesting Method for Effective Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert

    2006-01-01

    Knowledge of empirical research has become important for everyone involved in education and special education. Policy, practice, and informed reporting rely on locating and understanding unfiltered, original source material. Although access to vast amounts of research has been greatly facilitated by online databases, such as ERIC and PsychInfo,…

  19. An Informal Reading Readiness Inventory: A Diagnostic Method of Predicting First Grade Reading Achievement.

    ERIC Educational Resources Information Center

    Anderson, Carolyn C.; Koenke, Karl

    A study was undertaken to create and validate a diagnostic, task-based Informal Reading Readiness Inventory (IRRI). IRRI subtests were created to reflect four areas found to be important in reading readiness: awareness of self and media, language experience, reasoning, and phonics. Prereading curriculum components formed the basis for test item…

  20. Study and Proposal for the Improvement of Military Technical Information Transfer Methods. Final Report.

    ERIC Educational Resources Information Center

    Shriver, Edgar L.; Hart, Fred L.

    Concepts currently used in conveying technical information about the operation and maintenance of equipment in the U.S. Army were investigated. The objective was to develop a more cost effective maintenance program by reducing personnel costs through a more effective software link between the hardware and maintenance personnel. The study…

  1. Communication and Research Skills in the Information Systems Curriculum: A Method of Assessment

    ERIC Educational Resources Information Center

    Lazarony, Paul J.; Driscoll, Donna A.

    2010-01-01

    Assessment of learning goals has become the norm in business programs in higher education across the country. This paper offers a methodology for the assessment of both communication skills and research skills within a curriculum of the Bachelor of Science in Information Systems Program. Program level learning goals assessed in this paper are: (1)…

  2. A Hierarchy Fuzzy MCDM Method for Studying Electronic Marketing Strategies in the Information Service Industry.

    ERIC Educational Resources Information Center

    Tang, Michael T.; Tzeng, Gwo-Hshiung

    In this paper, the impacts of Electronic Commerce (EC) on the international marketing strategies of information service industries are studied. In seeking to blend humanistic concerns in this research with technological development by addressing challenges for deterministic attitudes, the paper examines critical environmental factors relevant to…

  3. 77 FR 34124 - 2011 Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... Transp., Inc. v. STB, 568 F.3d 236 (DC Cir. 2009), and vacated in part on reh'g, CSX Transp., Inc. v. STB, 584 F.3d 1076 (DC Cir. 2009). In Annual Submission of Tax Information for Use in the Revenue...

  4. Electronic and Courier Methods of Information Dissemination: A Test of Accuracy.

    ERIC Educational Resources Information Center

    DeWine, Sue; And Others

    As part of a larger endeavor to evaluate the impact of communication technology on organizations, this study assesses the accuracy of information diffusion via electronic-mail and courier-mail systems in two large organizations which have implemented electronic-mail systems in the last three years. Data were obtained through the use of…

  5. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  6. [Application of information technology in orthodontics. 3. Practical method for computer aided measurements of orthodontic models].

    PubMed

    Reinhardt, H; Haffner, T; Ifert, F; Malsch, J; Schneider, P

    1989-09-01

    A practical method for computer-aided measurement of orthodontic models is introduced here. Measurement values are acquired using an incremental encoder combined with the traditional sliding caliper. The advantages of this method lead to more efficient model measurements. PMID:2636504

  7. A Straightforward Method for Advance Estimation of User Charges for Information in Numeric Databases.

    ERIC Educational Resources Information Center

    Jarvelin, Kalervo

    1986-01-01

    Describes a method for advance estimation of user charges for queries in relational data model-based numeric databases when charges are based on data retrieved. Use of this approach is demonstrated by sample queries to an imaginary marketing database. The principles and methods of this approach and its relevance are discussed. (MBR)

  8. An Information Retrieval Model Based on Vector Space Method by Supervised Learning.

    ERIC Educational Resources Information Center

    Tai, Xiaoying; Ren, Fuji; Kita, Kenji

    2002-01-01

    Proposes a method to improve retrieval performance of the vector space model by using users' relevance feedback. Discusses the use of singular value decomposition and the latent semantic indexing model, and reports the results of two experiments that show the effectiveness of the proposed method. (Author/LRW)
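
    As background for the SVD and latent semantic indexing components mentioned above, here is a minimal sketch of ranking documents in a truncated latent space; it is illustrative only and omits the supervised relevance-feedback learning that the paper adds.

      import numpy as np

      def lsi_rank(term_doc, query_vec, k=2):
          """Rank documents against a query in a k-dimensional latent semantic space
          obtained from a truncated SVD of the term-document matrix."""
          u, s, vt = np.linalg.svd(np.asarray(term_doc, dtype=float), full_matrices=False)
          uk, sk, vtk = u[:, :k], s[:k], vt[:k, :]
          docs_k = (np.diag(sk) @ vtk).T                        # each row: a document in latent space
          query_k = np.asarray(query_vec, dtype=float) @ uk     # query projected the same way
          sims = docs_k @ query_k / (np.linalg.norm(docs_k, axis=1) * np.linalg.norm(query_k) + 1e-12)
          return np.argsort(-sims)                              # document indices, best match first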

  9. Incremental Validity and Informant Effect from a Multi-Method Perspective: Assessing Relations between Parental Acceptance and Children's Behavioral Problems.

    PubMed

    Izquierdo-Sotorrío, Eva; Holgado-Tello, Francisco P; Carrasco, Miguel Á

    2016-01-01

    This study examines the relationships between perceived parental acceptance and children's behavioral problems (externalizing and internalizing) from a multi-informant perspective. Using mothers, fathers, and children as sources of information, we explore the informant effect and incremental validity. The sample was composed of 681 participants (227 children, 227 fathers, and 227 mothers). Children's (40% boys) ages ranged from 9 to 17 years (M = 12.52, SD = 1.81). Parents and children completed both the Parental Acceptance Rejection/Control Questionnaire (PARQ/Control) and the check list of the Achenbach System of Empirically Based Assessment (ASEBA). Statistical analyses were based on the correlated uniqueness multitrait-multimethod matrix (model MTMM) by structural equations and different hierarchical regression analyses. Results showed a significant informant effect and a different incremental validity related to which combination of sources was considered. A multi-informant perspective rather than a single one increased the predictive value. Our results suggest that mother-father or child-father combinations seem to be the best way to optimize the multi-informant method in order to predict children's behavioral problems based on perceived parental acceptance. PMID:27242582

  10. Groundwater Potential Assessment Using Geographic Information Systems and AHP Method (Case Study: Baft City, Kerman, Iran)

    NASA Astrophysics Data System (ADS)

    Zeinolabedini, M.; Esmaeily, A.

    2015-12-01

    The purpose of the present study is to use Geographical Information Systems (GIS) to determine the areas with the best groundwater potential in Baft city. To achieve this objective, parameters such as precipitation, slope, faults, vegetation, land cover and lithology were used. Because these parameters contribute with different weights, the Analytic Hierarchy Process (AHP) was used to weight them. After developing the information layers in GIS and weighting each of them, a model was developed, and the final map of groundwater potential was calculated with this model. Applying the model, four classes of area were distinguished: high, average and low potential, and no potential. The results indicated that 0.74, 41.23 and 45.63 percent of the area had high, average and low potential, respectively, while 12.38% of the area had no potential. The results can be useful for groundwater resource management plans and for preventing excessive exploitation.
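
    The AHP weighting step can be sketched as follows: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The comparison values and criterion ordering below are illustrative, not those elicited in the study.

      import numpy as np

      # Saaty's random consistency indices for matrix orders 1..6
      RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

      def ahp_weights(pairwise):
          """Return (weights, consistency_ratio) for a reciprocal pairwise
          comparison matrix, using the principal right eigenvector."""
          a = np.asarray(pairwise, dtype=float)
          n = a.shape[0]
          eigvals, eigvecs = np.linalg.eig(a)
          i = int(np.argmax(eigvals.real))
          w = np.abs(eigvecs[:, i].real)
          w = w / w.sum()                                           # normalised criterion weights
          ci = (eigvals[i].real - n) / (n - 1) if n > 1 else 0.0    # consistency index
          cr = ci / RI[n] if RI.get(n, 0.0) > 0 else 0.0            # consistency ratio
          return w, cr

      # Illustrative 3-criterion comparison (e.g. precipitation vs. slope vs. lithology)
      weights, cr = ahp_weights([[1, 3, 5],
                                 [1/3, 1, 2],
                                 [1/5, 1/2, 1]])

    A consistency ratio below about 0.1 is conventionally taken to indicate acceptably consistent judgments before the weights are applied to the GIS layers.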

  11. Method and apparatus for optimizing a train trip using signal information

    DOEpatents

    Kumar, Ajith Kuttannair; Daum, Wolfgang; Otsubo, Tom; Hershey, John Erik; Hess, Gerald James

    2014-06-10

    A system is provided for operating a railway network including a first railway vehicle during a trip along track segments. The system includes a first element for determining travel parameters of the first railway vehicle, a second element for determining travel parameters of a second railway vehicle relative to the track segments to be traversed by the first vehicle during the trip, a processor for receiving information from the first and the second elements and for determining a relationship between occupation of a track segment by the second vehicle and later occupation of the same track segment by the first vehicle and an algorithm embodied within the processor having access to the information to create a trip plan that determines a speed trajectory for the first vehicle. The speed trajectory is responsive to the relationship and further in accordance with one or more operational criteria for the first vehicle.

  12. The cognitive interview method of conducting police interviews: eliciting extensive information and promoting therapeutic jurisprudence.

    PubMed

    Fisher, Ronald P; Geiselman, R Edward

    2010-01-01

    Police officers receive little or no training to conduct interviews with cooperative witnesses, and as a result they conduct interviews poorly, eliciting less information than is available and providing little support to help victims overcome psychological problems that may have arisen from the crime. We analyze the components of a typical police interview that limit the amount of information witnesses communicate and that militate against victims' overcoming psychological problems. We then describe an alternative interviewing protocol, the Cognitive Interview, which enhances witness recollection and also likely contributes to victims' well-being. The component elements of the Cognitive Interview are described, with emphasis on those elements that likely promote better witness recollection and also help support victims' psychological health. PMID:20875685

  13. A theory-based newsletter nutrition education program reduces nutritional risk and improves dietary intake for congregate meal participants.

    PubMed

    Francis, Sarah L; MacNab, Lindsay; Shelley, Mack

    2014-01-01

    At-risk older adults need community-based nutrition programs that improve nutritional status and practices. This 6-month study assessed the impact of the traditional Chef Charles (CC) program (Control) compared to a theory-based CC program (Treatment) on nutritional risk (NR), dietary intakes, self-efficacy (SE), food security (FS), and program satisfaction for congregate meal participants. Participants were mostly educated, single, "food secure" White females. The NR change for the treatment group was significantly higher (P = 0.042) than for the control group. No differences were noted in SE or FS change or in program satisfaction between groups. The overall distribution of FS classification levels changed significantly (P < .001) from pre to post. Over half (n = 46, 76.7%) reported making dietary changes and the majority (n = 52, 86.7%) rated CC as good to excellent. Results suggest the theory-based CC program (treatment) is more effective in reducing NR and improving dietary practices than the traditional CC program (control). PMID:24827061

  14. Parents of children with eating disorders: developing theory-based health communication messages to promote caregiver well-being.

    PubMed

    Patel, Sheetal; Shafer, Autumn; Brown, Jane; Bulik, Cynthia; Zucker, Nancy

    2014-01-01

    Parents of children with eating disorders experience extreme emotional burden because of the intensity and duration of the recovery process. While parental involvement in a child's eating disorder treatment improves outcomes, parents often neglect their own well-being, which can impede their child's recovery. This study extends the research on caregivers and on health theory in practice by conducting formative research to develop a theory-based communication intervention encouraging parents to engage in adaptive coping and self-care behaviors. The Transactional Model of Stress and Coping and the Transtheoretical Model guided qualitative assessments of the determinants of parents' coping behaviors. Three focus groups with 19 parents of children with eating disorders and 19 semi-structured interviews with experts specializing in eating disorders were conducted. Findings indicate that parents and experts see parents' need for permission to take time for themselves as the main barrier to self-care. The main motivator for parents to engage in coping behaviors is awareness of a connection between self-care and their child's health outcomes. Participant evaluation of six potential messages for main themes and effectiveness revealed that theory-based elements, such as certain processes of change within the Transtheoretical Model, were important to changing health behavior. PMID:24380433

  15. Testing a social cognitive theory-based model of indoor tanning: implications for skin cancer prevention messages.

    PubMed

    Noar, Seth M; Myrick, Jessica Gall; Zeitany, Alexandra; Kelley, Dannielle; Morales-Pico, Brenda; Thomas, Nancy E

    2015-01-01

    The lack of a theory-based understanding of indoor tanning is a major impediment to the development of effective messages to prevent or reduce this behavior. This study applied the Comprehensive Indoor Tanning Expectations (CITE) scale in an analysis of indoor tanning behavior among sorority women (total N = 775). Confirmatory factor analyses indicated that CITE positive and negative expectations were robust, multidimensional factors and that a hierarchical structure fit the data well. Social cognitive theory-based structural equation models demonstrated that appearance-oriented variables were significantly associated with outcome expectations. Outcome expectations were, in turn, significantly associated with temptations to tan, intention to tan indoors, and indoor tanning behavior. The implications of these findings for the development of messages to prevent and reduce indoor tanning behavior are discussed in two domains: (a) messages that attempt to change broader societal perceptions about tan skin, and (b) messages that focus more narrowly on indoor tanning-challenging positive expectations, enhancing negative expectations, and encouraging substitution of sunless tanning products. PMID:25470441

  16. General theory based on fluctuational electrodynamics for van der Waals interactions in colloidal systems

    SciTech Connect

    Yannopapas, Vassilios

    2007-12-15

    A rigorous theory for the determination of the van der Waals interactions in colloidal systems is presented. The method is based on fluctuational electrodynamics and a multiple-scattering method which provides the electromagnetic Green's tensor. In particular, expressions for the Green's tensor are presented for arbitrary, finite collections of colloidal particles, for infinitely periodic or defected crystals, as well as for finite slabs of crystals. The presented formalism allows for ab initio calculations of the van der Waals interactions in colloidal systems since it takes fully into account retardation, many-body, multipolar, and near-field effects.

  17. A mutual-information-based mining method for marine abnormal association rules

    NASA Astrophysics Data System (ADS)

    Cunjin, Xue; Wanjiao, Song; Lijuan, Qin; Qing, Dong; Xiaoyang, Wen

    2015-03-01

    Long time series of remote sensing images are a key source of data for exploring large-scale marine abnormal association patterns, but they pose significant challenges for traditional approaches to spatiotemporal analysis. This paper proposes a mutual-information-based quantitative association rule-mining algorithm (MIQarma) to address these challenges. MIQarma comprises three key steps. First, MIQarma calculates the asymmetrical mutual information between items with one scan of the database and extracts pair-wise related items according to the user-specified information threshold. Second, a linking-pruning-generating recursive loop generates (k+1)-dimensional candidate association rules from k-dimensional rules on the basis of the user-specified minimum support threshold, and this step is repeated until no more candidate association rules are generated. Finally, strong association rules are generated according to the user-specified minimum evaluation indicators. To demonstrate the feasibility and efficiency of MIQarma, we present two case studies: one considers performance analysis and the other identifies marine abnormal association relationships.
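
    A minimal sketch of the first, pair-extraction step described above, assuming binary item columns and using the standard (symmetric) mutual information rather than the paper's asymmetrical variant; the data matrix and threshold are illustrative.

      # Compute mutual information between item columns in one pass over the
      # data and keep pairs above a user-specified information threshold.
      import numpy as np
      from itertools import combinations

      def mutual_information(x, y):
          """Mutual information (bits) between two binary 0/1 arrays."""
          mi = 0.0
          for a in (0, 1):
              for b in (0, 1):
                  p_ab = np.mean((x == a) & (y == b))
                  p_a, p_b = np.mean(x == a), np.mean(y == b)
                  if p_ab > 0:
                      mi += p_ab * np.log2(p_ab / (p_a * p_b))
          return mi

      # Rows: observations; columns: abnormal items (e.g., SST, chlorophyll anomalies).
      data = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 1], [1, 0, 1], [0, 1, 0]])
      threshold = 0.1
      related = [(i, j, round(mutual_information(data[:, i], data[:, j]), 3))
                 for i, j in combinations(range(data.shape[1]), 2)
                 if mutual_information(data[:, i], data[:, j]) >= threshold]
      print(related)         # pair-wise related items for rule generation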

  18. Method for the evaluation of structure-activity relationship information associated with coordinated activity cliffs.

    PubMed

    Dimova, Dilyana; Stumpfe, Dagmar; Bajorath, Jürgen

    2014-08-14

    Activity cliffs are generally defined as pairs of active compounds having a large difference in potency. Although this definition of activity cliffs focuses on compound pairs, the vast majority of cliffs are formed in a coordinated manner. This means that multiple highly and weakly potent compounds form series of activity cliffs, which often overlap. In activity cliff networks, coordinated cliffs emerge as disjoint activity cliff clusters. Recently, we have identified all cliff clusters from current bioactive compounds and analyzed their topologies. For structure-activity relationship (SAR) analysis, activity cliff clusters are of high interest, since they contain more SAR information than cliffs that are individually considered. For medicinal chemistry applications, a key question becomes how to best extract SAR information from activity cliff clusters. This represents a challenging problem, given the complexity of many activity cliff configurations. Herein we introduce a generally applicable methodology to organize activity cliff clusters on the basis of structural relationships, prioritize clusters, and systematically extract SAR information from them. PMID:25014781

  19. A Study towards Building An Optimal Graph Theory Based Model For The Design of Tourism Website

    NASA Astrophysics Data System (ADS)

    Panigrahi, Goutam; Das, Anirban; Basu, Kajla

    2010-10-01

    An effective tourism website is key to attracting tourists from different parts of the world. Here we identify the factors that improve the effectiveness of a website by considering it as a graph, where web pages, including the homepage, are the nodes and hyperlinks are the edges between the nodes. In this model, the design constraints for building a tourism website are taken into consideration. Our objectives are to build a framework for an effective tourism website that provides an adequate level of information and service, and to enable users to reach the desired page with minimal loading time. An information hierarchy specifying the upper limit on the outgoing links of a page is also proposed; following this hierarchy, a web developer can prepare an effective tourism website. Loading time depends on page size and network traffic; we assume network traffic to be uniform, so loading time is directly proportional to page size. The approach quantifies the link structure of a tourism website, and we also propose a page size distribution pattern for a tourism website.
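
    A minimal sketch of the website-as-graph idea, assuming uniform network traffic so that a page's loading time is proportional to its size; the page names, sizes, links, and the seconds-per-KB constant are hypothetical.

      # Pages are nodes, hyperlinks are directed edges; Dijkstra finds the
      # navigation path with the smallest cumulative loading time.
      import heapq

      pages = {"home": 120, "hotels": 300, "tours": 250, "booking": 180}  # KB
      links = {"home": ["hotels", "tours"], "hotels": ["booking"],
               "tours": ["booking"], "booking": []}

      def min_loading_time(start, goal, k=0.01):
          """Minimum total loading time (k seconds per KB) from start to goal."""
          queue, seen = [(k * pages[start], start)], set()
          while queue:
              cost, page = heapq.heappop(queue)
              if page == goal:
                  return cost
              if page in seen:
                  continue
              seen.add(page)
              for nxt in links[page]:
                  heapq.heappush(queue, (cost + k * pages[nxt], nxt))
          return float("inf")

      print(min_loading_time("home", "booking"))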

  20. Evaluation of non-animal methods for assessing skin sensitisation hazard: A Bayesian Value-of-Information analysis.

    PubMed

    Leontaridou, Maria; Gabbert, Silke; Van Ierland, Ekko C; Worth, Andrew P; Landsiedel, Robert

    2016-07-01

    This paper offers a Bayesian Value-of-Information (VOI) analysis for guiding the development of non-animal testing strategies, balancing information gains from testing with the expected social gains and costs from the adoption of regulatory decisions. Testing is assumed to have value, if, and only if, the information revealed from testing triggers a welfare-improving decision on the use (or non-use) of a substance. As an illustration, our VOI model is applied to a set of five individual non-animal prediction methods used for skin sensitisation hazard assessment, seven battery combinations of these methods, and 236 sequential 2-test and 3-test strategies. Their expected values are quantified and compared to the expected value of the local lymph node assay (LLNA) as the animal method. We find that battery and sequential combinations of non-animal prediction methods reveal a significantly higher expected value than the LLNA. This holds for the entire range of prior beliefs. Furthermore, our results illustrate that the testing strategy with the highest expected value does not necessarily have to follow the order of key events in the sensitisation adverse outcome pathway (AOP). PMID:27494625
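
    A minimal sketch of the value-of-information comparison described above, assuming a binary hazard state, a single test characterized by sensitivity and specificity, and illustrative payoffs for each decision/state pair; none of the numbers come from the paper.

      # Expected value of deciding after observing a test result, versus
      # deciding on the prior alone; the difference is the test's value.
      def best_decision_value(p_hazard, payoff):
          return max(p_hazard * payoff[(d, "hazard")]
                     + (1 - p_hazard) * payoff[(d, "no_hazard")]
                     for d in ("restrict", "allow"))

      def expected_value_with_test(prior, sens, spec, payoff):
          p_pos = sens * prior + (1 - spec) * (1 - prior)
          p_neg = 1 - p_pos
          post_pos = sens * prior / p_pos if p_pos else 0.0
          post_neg = (1 - sens) * prior / p_neg if p_neg else 0.0
          return (p_pos * best_decision_value(post_pos, payoff)
                  + p_neg * best_decision_value(post_neg, payoff))

      payoff = {("restrict", "hazard"): 10, ("restrict", "no_hazard"): -2,
                ("allow", "hazard"): -20, ("allow", "no_hazard"): 5}
      prior = 0.3
      print(expected_value_with_test(prior, sens=0.8, spec=0.7, payoff=payoff)
            - best_decision_value(prior, payoff))   # expected value of testing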

  1. A sub-domain based regularization method with prior information for human thorax imaging using electrical impedance tomography

    NASA Astrophysics Data System (ADS)

    In Kang, Suk; Khambampati, Anil Kumar; Jeon, Min Ho; Kim, Bong Seok; Kim, Kyung Youn

    2016-02-01

    Electrical impedance tomography (EIT) is a non-invasive imaging technique that can be used as a bedside monitoring tool for human thorax imaging. EIT has high temporal resolution but suffers from poor spatial resolution due to the ill-posedness of the inverse problem. Regularization methods are often used as a penalty term in the cost function to stabilize sudden changes in resistivity. In human thorax monitoring, conventional Tikhonov-type regularization smooths the reconstructed image between the heart and the lungs, making it difficult to distinguish their exact boundaries. Structural information about the object obtained beforehand can be incorporated into the regularization method to improve the spatial resolution and to help create clear and distinct boundaries between the objects. However, the boundary of the heart changes rapidly over the cardiac cycle, so no information about its exact boundary is available. Therefore, to improve the spatial resolution for human thorax monitoring during the cardiac cycle, a sub-domain based regularization method is proposed in this paper, assuming the lungs and part of the background region are known. In the proposed method, the regularization matrix is modified anisotropically to include sub-domains as prior information, and the regularization parameter is assigned different weights for each sub-domain. Numerical simulations and phantom experiments for 2D human thorax monitoring are performed to evaluate the performance of the proposed regularization method. The results show better reconstruction performance with the proposed regularization method.
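
    A minimal sketch of the sub-domain weighting idea, assuming a linearized Tikhonov-type update with a diagonal regularization matrix whose entries depend on a pixel's sub-domain label; the Jacobian, residual, labels, and weights are all toy values, not the paper's formulation.

      # Sub-domain-dependent regularization weights: known background/lung
      # pixels are smoothed strongly, the uncertain heart region only weakly.
      import numpy as np

      rng = np.random.default_rng(0)
      n_pix = 6
      J = rng.normal(size=(10, n_pix))        # sensitivity (Jacobian) matrix
      dV = rng.normal(size=10)                # boundary-voltage residual

      labels = np.array([0, 0, 1, 1, 0, 2])   # 0: background, 1: lungs, 2: heart
      alpha = {0: 1.0, 1: 1.0, 2: 0.05}       # weaker penalty where the boundary moves
      L = np.diag([alpha[k] for k in labels]) # anisotropic regularization matrix

      d_rho = np.linalg.solve(J.T @ J + L.T @ L, J.T @ dV)  # resistivity update
      print(d_rho)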

  2. Increasing condom use in heterosexual men: development of a theory-based interactive digital intervention.

    PubMed

    Webster, R; Michie, S; Estcourt, C; Gerressu, M; Bailey, J V

    2016-09-01

    Increasing condom use to prevent sexually transmitted infections is a key public health goal. Interventions are more likely to be effective if they are theory- and evidence-based. The Behaviour Change Wheel (BCW) provides a framework for intervention development. To provide an example of how the BCW was used to develop an intervention to increase condom use in heterosexual men (the MenSS website), the steps of the BCW intervention development process were followed, incorporating evidence from the research literature and views of experts and the target population. Capability (e.g. knowledge) and motivation (e.g. beliefs about pleasure) were identified as important targets of the intervention. We devised ways to address each intervention target, including selecting interactive features and behaviour change techniques. The BCW provides a useful framework for integrating sources of evidence to inform intervention content and deciding which influences on behaviour to target. PMID:27528531

  3. Who Would Do That? A Theory-Based Analysis of Narratives of Sources of Family Ostracism.

    PubMed

    Poulsen, Joan R; Carmon, Anna F

    2015-01-01

    There are many benefits derived from families, but not all family members are loving and accepting. Family members may act as sources of ostracism (people or groups who ostracize another person/group). We suggest sources engage in family ostracism for extended periods, their motives fit with prior theoretical models, and trait-level forgiveness may help understand source behavior. We analyzed data from 63 narratives and questionnaires to investigate the motives, power dynamics, and psychological correlates of sources of family ostracism. We found sources of ostracism are often of equal status to the targets of ostracism, and termination often occurs informally or is prompted by major changes in the family (e.g., birth, move). Also, sources of ostracism are often targets themselves suggesting family ostracism may be reciprocal in nature. Our findings support existing theory, but suggest ostracism in families has unique dynamics not captured in laboratory designs. PMID:26267127

  4. Empirical studies on informal patient payments for health care services: a systematic and critical review of research methods and instruments

    PubMed Central

    2010-01-01

    Background Empirical evidence demonstrates that informal patient payments are an important feature of many health care systems. However, the study of these payments is a challenging task because of their potentially illegal and sensitive nature. The aim of this paper is to provide a systematic review and analysis of key methodological difficulties in measuring informal patient payments. Methods The systematic review was based on the following eligibility criteria: English language publications that reported on empirical studies measuring informal patient payments. There were no limitations with regard to the year of publication. The content of the publications was analysed qualitatively and the results were organised in the form of tables. Data sources were Econlit, Econpapers, Medline, PubMed, ScienceDirect, SocINDEX. Results Informal payments for health care services are most often investigated in studies involving patients or the general public, but providers and officials are also sample units in some studies. The majority of the studies apply a single mode of data collection that involves either face-to-face interviews or group discussions. One of the main methodological difficulties reported in the publications concerns the inability of some respondents to distinguish between official and unofficial payments. Another complication is associated with the refusal of some respondents to answer questions on informal patient payments. We do not exclude the possibility that we have missed studies reported in non-English-language journals as well as very recent studies that are not yet published. Conclusions Given the recent evidence from research on survey methods, a self-administered questionnaire during a face-to-face interview could be a suitable mode of collecting sensitive data, such as data on informal patient payments. PMID:20849658

  5. New Term Weighting Formulas for the Vector Space Method in Information Retrieval

    SciTech Connect

    Chisholm, E.; Kolda, T.G.

    1999-03-01

    The goal in information retrieval is to enable users to automatically and accurately find data relevant to their queries. One possible approach to this problem is to use the vector space model, which models documents and queries as vectors in the term space. The components of the vectors are determined by the term weighting scheme, a function of the frequencies of the terms in the document or query as well as throughout the collection. We discuss popular term weighting schemes and present several new schemes that offer improved performance.
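
    A minimal sketch of one classical member of the term-weighting family the report builds on (log tf-idf with cosine normalization); the toy collection is illustrative, and the report's new schemes are not reproduced here.

      # Weight = (1 + log tf) * log(N / df), normalized to unit length.
      import math
      from collections import Counter

      docs = [["vector", "space", "model"],
              ["term", "weighting", "vector"],
              ["information", "retrieval", "model"]]

      N = len(docs)
      df = Counter(t for d in docs for t in set(d))   # document frequencies

      def weight_vector(doc):
          tf = Counter(doc)
          w = {t: (1 + math.log(f)) * math.log(N / df[t]) for t, f in tf.items()}
          norm = math.sqrt(sum(v * v for v in w.values())) or 1.0
          return {t: v / norm for t, v in w.items()}

      print(weight_vector(docs[0]))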

  6. [Exchange of medical imaging and data information in radiotherapy: needs, methods and current limits].

    PubMed

    Manens, J P

    1997-01-01

    Extension of the image network within radiotherapy departments provides the technical infrastructure made necessary by the rapid evolution of diagnostic and treatment techniques in radiotherapy. The system is aimed at managing the whole set of data (textual data and images) needed for the planning and control of treatments. The radiotherapy network addresses two objectives: managing the information necessary for treatment planning (target volume definition, planning dosimetry) and controlling all parameters involved during the patient's treatment at the treatment unit. The major challenge is to improve the quality of treatment. Multimodal imaging is a major advance as it allows the use of new dosimetry and simulation techniques. The need for standards to exchange medical imaging information is now recognized by all the institutions and by a majority of users and manufacturers. It is widely accepted that the lack of a standard has been one of the fundamental obstacles to the deployment of operational "Picture Archiving and Communication Systems". The International Organization for Standardization Open Systems Interconnection model is the standard reference model used to describe network protocols. The network is based on Ethernet and the TCP/IP protocol, which provide the means to interconnect imaging devices, workstations dedicated to specific image processing, and machines used in radiotherapy. The network uses Ethernet cabled on twisted pair (10BaseT) or optical fibres in a star-shaped physical layout. Dicom V3.0 supports the fundamental network interactions: transfer of images (computed tomography, magnetic resonance imaging), query and retrieval of images, printing on network-attached cameras, support of HIS/RIS-related interfacing, and image management. The supplement to the Dicom standard, Dicom RT, specifies five data objects, known in Dicom as Information Object Definitions, relevant to radiotherapy. Dicom RT objects can provide a means for

  7. Designing Health Websites Based on Users’ Web-Based Information-Seeking Behaviors: A Mixed-Method Observational Study

    PubMed Central

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon

    2016-01-01

    Background Laypeople increasingly use the Internet as a source of health information, but finding and discovering the right information remains problematic. These issues are partially due to the mismatch between the design of consumer health websites and the needs of health information seekers, particularly the lack of support for “exploring” health information. Objective The aim of this research was to create a design for consumer health websites by supporting different health information–seeking behaviors. We created a website called Better Health Explorer with the new design. Through the evaluation of this new design, we derive design implications for future implementations. Methods Better Health Explorer was designed using a user-centered approach. The design was implemented and assessed through a laboratory-based observational study. Participants tried to use Better Health Explorer and another live health website. Both websites contained the same content. A mixed-method approach was adopted to analyze multiple types of data collected in the experiment, including screen recordings, activity logs, Web browsing histories, and audiotaped interviews. Results Overall, 31 participants took part in the observational study. Our new design showed a positive result for improving the experience of health information seeking, by providing a wide range of information and an engaging environment. The results showed better knowledge acquisition, a higher number of page reads, and more query reformulations in both focused and exploratory search tasks. In addition, participants spent more time to discover health information with our design in exploratory search tasks, indicating higher engagement with the website. Finally, we identify 4 design considerations for designing consumer health websites and health information–seeking apps: (1) providing a dynamic information scope; (2) supporting serendipity; (3) considering trust implications; and (4) enhancing interactivity

  8. A simple method for estimating basin-scale groundwater discharge by vegetation in the basin and range province of Arizona using remote sensing information and geographic information systems

    USGS Publications Warehouse

    Tillman, F.D.; Callegary, J.B.; Nagler, P.L.; Glenn, E.P.

    2012-01-01

    Groundwater is a vital water resource in the arid to semi-arid southwestern United States. Accurate accounting of inflows to and outflows from the groundwater system is necessary to effectively manage this shared resource, including the important outflow component of groundwater discharge by vegetation. A simple method for estimating basin-scale groundwater discharge by vegetation is presented that uses remote sensing data from satellites, geographic information systems (GIS) land cover and stream location information, and a regression equation developed within the Southern Arizona study area relating the Enhanced Vegetation Index from the MODIS sensors on the Terra satellite to measured evapotranspiration. Results computed for 16-day composited satellite passes over the study area during the 2000 through 2007 time period demonstrate a sinusoidal pattern of annual groundwater discharge by vegetation with median values ranging from around 0.3 mm per day in the cooler winter months to around 1.5 mm per day during summer. Maximum estimated annual volume of groundwater discharge by vegetation was between 1.4 and 1.9 billion m3 per year with an annual average of 1.6 billion m3. A simplified accounting of the contribution of precipitation to vegetation greenness was developed whereby monthly precipitation data were subtracted from computed vegetation discharge values, resulting in estimates of minimum groundwater discharge by vegetation. Basin-scale estimates of minimum and maximum groundwater discharge by vegetation produced by this simple method are useful bounding values for groundwater budgets and groundwater flow models, and the method may be applicable to other areas with similar vegetation types.
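
    A minimal sketch of the accounting described above, with a purely hypothetical linear EVI-to-evapotranspiration regression (the study's fitted coefficients are not reproduced here); subtracting monthly precipitation gives the lower bound on groundwater discharge by vegetation.

      # Upper bound: all vegetation water use attributed to groundwater.
      # Lower bound: the precipitation-supported portion is removed first.
      def et_from_evi(evi, a=2.5, b=0.1):
          """Hypothetical regression: evapotranspiration (mm/day) from EVI."""
          return max(a * evi + b, 0.0)

      def groundwater_discharge_bounds(evi, precip_mm, days=30):
          et_total = et_from_evi(evi) * days              # mm over the period
          return {"max_mm": et_total,
                  "min_mm": max(et_total - precip_mm, 0.0)}

      print(groundwater_discharge_bounds(evi=0.45, precip_mm=20))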

  9. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
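
    A minimal sketch of the replica idea, assuming a NumPy array as the file payload and a hypothetical "sample_rate" entry in the semantic information; the patented system operates on parallel file system objects rather than in-memory arrays.

      # Generate replicas at different resolutions: fewer data elements
      # (subsampling) and fewer bits per value (reduced precision).
      import numpy as np

      def make_replicas(data, semantic_info):
          step = semantic_info.get("sample_rate", 4)
          return {"full": data,
                  "subsampled": data[::step],
                  "low_precision": data.astype(np.float16)}

      field = np.linspace(0.0, 1.0, 16, dtype=np.float64)
      for name, r in make_replicas(field, {"sample_rate": 4}).items():
          print(name, r.dtype, r.size)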

  10. Methods for semi-automated indexing for high precision information retrieval

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  11. Finding a needle in a haystack: toward a psychologically informed method for aviation security screening.

    PubMed

    Ormerod, Thomas C; Dando, Coral J

    2015-02-01

    Current aviation security systems identify behavioral indicators of deception to assess risks to flights, but they lack a strong psychological basis or empirical validation. We present a new method that tests the veracity of passenger accounts. In an in vivo double-blind randomized-control trial conducted in international airports, security agents detected 66% of deceptive passengers using the veracity test method compared with less than 5% using behavioral indicator recognition. As well as revealing advantages of veracity testing over behavioral indicator identification, the study provides the highest levels to date of deception detection in a realistic setting where the known base rate of deceptive individuals is low. PMID:25365531

  12. Extracting land use information from the earth resources technology satellite data by conventional interpretation methods

    NASA Technical Reports Server (NTRS)

    Vegas, P. L.

    1974-01-01

    A procedure for obtaining land use data from satellite imagery by the use of conventional interpretation methods is presented. The satellite is described briefly, and the advantages of various scales and multispectral scanner bands are discussed. Methods for obtaining satellite imagery and the sources of this imagery are given. Equipment used in the study is described, and samples of land use maps derived from satellite imagery are included together with the land use classification system used. Accuracy percentages are cited and are compared to those of a previous experiment using small scale aerial photography.

  13. A simple ligation-based method to increase the information density in sequencing reactions used to deconvolute nucleic acid selections

    PubMed Central

    Childs-Disney, Jessica L.; Disney, Matthew D.

    2008-01-01

    Herein, a method is described to increase the information density of sequencing experiments used to deconvolute nucleic acid selections. The method is facile and should be applicable to any selection experiment. A critical feature of this method is the use of biotinylated primers to amplify and encode a BamHI restriction site on both ends of a PCR product. After amplification, the PCR reaction is captured onto streptavidin resin, washed, and digested directly on the resin. Resin-based digestion affords clean product that is devoid of partially digested products and unincorporated PCR primers. The product's complementary ends are annealed and ligated together with T4 DNA ligase. Analysis of ligation products shows formation of concatemers of different length and little detectable monomer. Sequencing results produced data that routinely contained three to four copies of the library. This method allows for more efficient formulation of structure-activity relationships since multiple active sequences are identified from a single clone. PMID:18065718

  14. Retrieval practice is an efficient method of enhancing the retention of anatomy and physiology information.

    PubMed

    Dobson, John L

    2013-06-01

    Although a great deal of empirical evidence has indicated that retrieval practice is an effective means of promoting learning and memory, very few studies have investigated the strategy in the context of an actual class. The primary purpose of this study was to determine if a series of very brief retrieval quizzes could significantly improve the retention of previously tested information throughout an anatomy and physiology course. A second purpose was to determine if there were any significant differences between expanding and uniform patterns of retrieval that followed a standardized initial retrieval delay. Anatomy and physiology students were assigned to either a control group or groups that were repeatedly prompted to retrieve a subset of previously tested course information via a series of quizzes that were administered on either an expanding or a uniform schedule. Each retrieval group completed a total of 10 retrieval quizzes, and the series of quizzes required (only) a total of 2 h to complete. Final retention of the exam subset material was assessed during the last week of the semester. There were no significant differences between the expanding and uniform retrieval groups, but both retained an average of 41% more of the subset material than did the control group (ANOVA, F = 129.8, P = 0.00, ηp(2) = 0.36). In conclusion, retrieval practice is a highly efficient and effective strategy for enhancing the retention of anatomy and physiology material. PMID:23728136

  15. Method and apparatus for optimizing a train trip using signal information

    DOEpatents

    Kumar, Ajith Kuttannair; Daum, Wolfgang; Otsubo, Tom; Hershey, John Erik; Hess, Gerald James

    2013-02-05

    One embodiment of the invention includes a system for operating a railway network comprising a first railway vehicle (400) during a trip along track segments (401/412/420). The system comprises a first element (65) for determining travel parameters of the first railway vehicle (400), a second element (65) for determining travel parameters of a second railway vehicle (418) relative to the track segments to be traversed by the first vehicle during the trip, a processor (62) for receiving information from the first (65) and the second (65) elements and for determining a relationship between occupation of a track segment (401/412/420) by the second vehicle (418) and later occupation of the same track segment by the first vehicle (400) and an algorithm embodied within the processor (62) having access to the information to create a trip plan that determines a speed trajectory for the first vehicle (400), wherein the speed trajectory is responsive to the relationship and further in accordance with one or more operational criteria for the first vehicle (400).

  16. The development of systematic quality control method using laboratory information system and unity program.

    PubMed

    Min, Won-Ki; Lee, Woochang; Park, Hyosoon

    2002-01-01

    Quality control (QC) processes are performed to detect and correct errors in the laboratory; systematic errors are repeated and affect all subsequent laboratory processes, which makes it necessary for all laboratories to detect and correct errors effectively and efficiently. We developed an on-line quality assurance system for the detection and correction of systematic error and linked it to Unity Plus/Pro (Bio-Rad Laboratories, Irvine, USA), a commercially available quality management system. The laboratory information system, based on the client-server paradigm, was developed using an NCR3600 (NCR, West Columbia, USA) as the server; the server database was Oracle 7.2 (Oracle, Belmont, USA) and the development tool was Powerbuilder (Powersoft, Burlington, UK). Each QC material is registered, receives its own identification number, and is tested in the same way as a patient sample. The resulting QC data are entered into the Unity Plus/Pro program by an in-house data entry program or by manual input. With the implementation of the in-house laboratory information system (LIS) and its link to Unity Plus/Pro, we could apply Westgard's multi-rules for a higher error detection rate, resulting in more systematic and precise quality assurance of laboratory results, complementary to conventional external quality assessment. PMID:12755272
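
    A minimal sketch of two widely used Westgard multi-rules (1-3s and 2-2s) applied to a stream of QC results; the control mean, SD, and values are illustrative and not taken from the described system.

      # Flag runs that violate the 1-3s rule (one point beyond 3 SD) or the
      # 2-2s rule (two consecutive points beyond 2 SD on the same side).
      def westgard_violations(values, mean, sd):
          z = [(v - mean) / sd for v in values]
          flags = []
          for i, zi in enumerate(z):
              if abs(zi) > 3:
                  flags.append((i, "1-3s"))
              if i > 0 and ((z[i - 1] > 2 and zi > 2) or
                            (z[i - 1] < -2 and zi < -2)):
                  flags.append((i, "2-2s"))
          return flags

      print(westgard_violations([101, 104, 108, 93, 110], mean=100, sd=2.5))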

  17. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in the simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as the transition of system functionality due to modifications in the system architecture, system environment, and system components. A dynamic core model is developed; the term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems that undergo structure transitions induced by the environment; the term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poetic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight into processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution: any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment. PMID:19623488

  18. Novel Method for Calculating a Nonsubjective Informative Prior for a Bayesian Model in Toxicology Screening: A Theoretical Framework.

    PubMed

    Woldegebriel, Michael

    2015-11-17

    In toxicology screening (forensic, food-safety), due to several analytical errors (e.g., retention time shift, lack of repeatability in m/z scans, etc.), the ability to confidently identify/confirm a compound remains a challenge. Due to these uncertainties, a probabilistic approach is currently preferred. However, if a probabilistic approach is followed, the only statistical method that is capable of estimating the probability of whether the compound of interest (COI) is present/absent in a given sample is Bayesian statistics. Bayes' theorem can combine prior information (prior probability) with data (likelihood) to give an optimal probability (posterior probability) reflecting the presence/absence of the COI. In this work, a novel method for calculating an informative prior probability for a Bayesian model in targeted toxicology screening is introduced. In contrast to earlier proposals making use of literature citation rates and the prior knowledge of the analyst, this method presents a thorough and nonsubjective approach. The formulation approaches the probability calculation as a clustering and random draw problem that incorporates few analytical method parameters meticulously estimated to reflect sensitivity and specificity of the system. The practicality of the method has been demonstrated and validated using real data and simulated analytical techniques. PMID:26482700
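
    A minimal sketch of the Bayesian combination described above: a prior probability that the compound of interest is present is updated with the likelihood of the observed evidence under each hypothesis. The numbers are illustrative and unrelated to the paper's clustering-based prior.

      # Posterior = P(evidence | present) * prior / P(evidence).
      def posterior_presence(prior, lik_present, lik_absent):
          num = lik_present * prior
          den = num + lik_absent * (1 - prior)
          return num / den if den else 0.0

      # Informative prior of 0.2; evidence six times more likely if the compound is present.
      print(posterior_presence(prior=0.2, lik_present=0.6, lik_absent=0.1))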

  19. Methods, systems, and apparatus for storage, transfer and/or control of information via matter wave dynamics

    NASA Technical Reports Server (NTRS)

    Vestergaard Hau, Lene (Inventor)

    2012-01-01

    Methods, systems and apparatus for generating atomic traps, and for storing, controlling and transferring information between first and second spatially separated phase-coherent objects, or using a single phase-coherent object. For plural objects, both phase-coherent objects have a macroscopic occupation of a particular quantum state by identical bosons or identical BCS-paired fermions. The information may be optical information, and the phase-coherent object(s) may be Bose-Einstein condensates, superfluids, or superconductors. The information is stored in the first phase-coherent object at a first storage time and recovered from the second phase-coherent object, or the same first phase-coherent object, at a second revival time. In one example, an integrated silicon wafer-based optical buffer includes an electrolytic atom source to provide the phase-coherent object(s), a nanoscale atomic trap for the phase-coherent object(s), and semiconductor-based optical sources to cool the phase-coherent object(s) and provide coupling fields for storage and transfer of optical information.

  20. Mixed Methods Analysis and Information Visualization: Graphical Display for Effective Communication of Research Results

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Dickinson, Wendy B.

    2008-01-01

    In this paper, we introduce various graphical methods that can be used to represent data in mixed research. First, we present a broad taxonomy of visual representation. Next, we use this taxonomy to provide an overview of visual techniques for quantitative data display and qualitative data display. Then, we propose what we call "crossover" visual…