Science.gov

Sample records for information theory-based methods

  1. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

    2016-06-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
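
    The record above describes selecting cross sections so that the chosen subset carries maximum information content with minimum redundancy. As a rough illustration of that idea (not the paper's actual algorithm, which also handles bridges and follows its own discretization choices), the sketch below greedily adds the candidate site that most increases the joint entropy of the already selected, discretized water-level series; the bin width, the toy data, and the function names are assumptions made for this example.

        import numpy as np

        def discretize(series, bin_width=0.1):
            """Quantize water-level series into integer bins (bin width is an assumed value)."""
            return np.floor(series / bin_width).astype(int)

        def joint_entropy(columns):
            """Joint Shannon entropy (bits) of several discrete series, one symbol per time step."""
            stacked = np.stack(columns, axis=1)
            _, counts = np.unique(stacked, axis=0, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def greedy_select(levels, k):
            """Greedily pick k cross sections whose levels jointly carry the most information;
            a site that is redundant with the current set adds little joint entropy, so
            redundancy is implicitly penalized."""
            chosen = []
            for _ in range(k):
                best, best_h = None, -np.inf
                for j in range(levels.shape[1]):
                    if j in chosen:
                        continue
                    h = joint_entropy([levels[:, c] for c in chosen + [j]])
                    if h > best_h:
                        best, best_h = j, h
                chosen.append(best)
            return chosen

        # toy example: 200 time steps of simulated levels at 30 candidate cross sections
        rng = np.random.default_rng(0)
        levels = discretize(np.cumsum(rng.normal(size=(200, 30)), axis=0))
        print(greedy_select(levels, k=5))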

  2. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. In addition, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Single-parameter or ensemble (multi-criteria) ranking options are also provided. Consequently, this software is suitable for tasks such as dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
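
    Since the abstract names information gain, gain ratio, and symmetrical uncertainty computed after equal-interval discretization, a small stand-alone sketch of two of those rankings is given below. It is not IMMAN code; the bin count, the toy descriptor matrix, and the function names are assumptions made for illustration.

        import numpy as np

        def equal_interval_discretize(x, n_bins=10):
            """Equal-interval (equal-width) discretization of a continuous descriptor."""
            edges = np.linspace(x.min(), x.max(), n_bins + 1)
            return np.digitize(x, edges[1:-1])

        def entropy(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def information_gain(feature, classes, n_bins=10):
            """IG(C; F) = H(C) - H(C|F) for one discretized feature."""
            f = equal_interval_discretize(feature, n_bins)
            h_conditional = sum((f == v).mean() * entropy(classes[f == v]) for v in np.unique(f))
            return entropy(classes) - h_conditional

        def symmetrical_uncertainty(feature, classes, n_bins=10):
            """SU = 2 * IG / (H(F) + H(C)), the normalized variant mentioned in the abstract."""
            f = equal_interval_discretize(feature, n_bins)
            return 2.0 * information_gain(feature, classes, n_bins) / (entropy(f) + entropy(classes))

        # rank a toy descriptor matrix X (rows = molecules, columns = descriptors)
        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 5))
        y = (X[:, 2] + 0.3 * rng.normal(size=100) > 0).astype(int)  # class driven by descriptor 2
        ranking = sorted(range(X.shape[1]), key=lambda j: -information_gain(X[:, j], y))
        print(ranking)  # descriptor 2 should come first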

  3. Correlation theory-based signal processing method for CMF signals

    NASA Astrophysics Data System (ADS)

    Shen, Yan-lin; Tu, Ya-qing

    2016-06-01

    Signal processing precision of Coriolis mass flowmeter (CMF) signals directly affects the measurement accuracy of Coriolis mass flowmeters. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, which comprises a correlation theory-based frequency estimation method and a phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral-period sampling on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation. For frequency estimation it outperforms the adaptive notch filter, discrete Fourier transform, and autocorrelation methods, and for phase difference estimation it outperforms the data extension-based correlation, Hilbert transform, quadrature delay estimator, and discrete Fourier transform methods, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
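
    To make the correlation-based idea concrete, the sketch below estimates the phase difference between two equal-frequency sinusoids from their zero-lag cross-correlation and their powers. It is a generic textbook estimator, not the paper's formulation: in particular it does not include the paper's correction for non-integral-period sampling, and the sampling rate, frequency, and noise level in the toy example are assumptions.

        import numpy as np

        def phase_difference(x1, x2):
            """For x1 = A1*sin(w t) and x2 = A2*sin(w t + phi):
            E[x1*x2] = A1*A2*cos(phi)/2 and E[xi^2] = Ai^2/2, so phi can be recovered
            from the zero-lag cross-correlation normalized by the amplitude estimates."""
            x1 = x1 - x1.mean()
            x2 = x2 - x2.mean()
            r12 = np.mean(x1 * x2)                # cross-correlation at lag 0
            a1 = np.sqrt(2.0 * np.mean(x1 * x1))  # amplitude estimate of x1
            a2 = np.sqrt(2.0 * np.mean(x2 * x2))  # amplitude estimate of x2
            return np.arccos(np.clip(2.0 * r12 / (a1 * a2), -1.0, 1.0))

        # toy Coriolis-like signals: same frequency, small phase shift, additive noise
        fs, f0, phi_true = 5000.0, 95.0, 0.02
        t = np.arange(0, 0.5, 1.0 / fs)
        rng = np.random.default_rng(2)
        x1 = np.sin(2 * np.pi * f0 * t) + 0.01 * rng.normal(size=t.size)
        x2 = np.sin(2 * np.pi * f0 * t + phi_true) + 0.01 * rng.normal(size=t.size)
        print(phase_difference(x1, x2), "rad, true value", phi_true)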

  4. Evaluation of the performance of information theory-based methods and cross-correlation to estimate the functional connectivity in cortical networks.

    PubMed

    Garofalo, Matteo; Nieus, Thierry; Massobrio, Paolo; Martinoia, Sergio

    2009-01-01

    Functional connectivity of in vitro neuronal networks was estimated by applying different statistical algorithms to data collected by Micro-Electrode Arrays (MEAs). First, we tested these "connectivity methods" on neuronal network models of increasing complexity and evaluated their performance in terms of ROC (Receiver Operating Characteristic) curves and PPC (Positive Precision Curve), a newly defined complementary method developed specifically for the identification of functional links. Then, the algorithms that better estimated the actual connectivity of the network models were used to extract functional connectivity from cultured cortical networks coupled to MEAs. Among the proposed approaches, Transfer Entropy and Joint-Entropy showed the best results, suggesting these methods as good candidates for extracting functional links in actual neuronal networks from multi-site recordings. PMID:19652720
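
    Transfer entropy, one of the best-performing measures in this record, can be written for history length 1 as TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. The sketch below estimates it by plain counting on binary spike trains; the history length, binning, and toy data are assumptions, and the original study's exact estimator and link-detection thresholding are not reproduced.

        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y):
            """Transfer entropy TE(X -> Y), history length 1, for discrete series
            (e.g. binned spike counts), estimated from empirical frequencies."""
            x, y = np.asarray(x), np.asarray(y)
            n = len(y) - 1
            triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
            pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
            pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
            singles_y = Counter(y[:-1])                     # y_t
            te = 0.0
            for (y1, y0, x0), c in triples.items():
                p_joint = c / n
                p_cond_full = c / pairs_yx[(y0, x0)]
                p_cond_past = pairs_yy[(y1, y0)] / singles_y[y0]
                te += p_joint * np.log2(p_cond_full / p_cond_past)
            return te

        # toy example: neuron Y tends to fire one time bin after neuron X
        rng = np.random.default_rng(3)
        x = (rng.random(5000) < 0.2).astype(int)
        y = np.roll(x, 1) | (rng.random(5000) < 0.05).astype(int)
        print("TE(X->Y) =", transfer_entropy(x, y))   # should be clearly positive
        print("TE(Y->X) =", transfer_entropy(y, x))   # should be near zero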

  5. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.

  6. Kinetic theory based new upwind methods for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, S. M.

    1986-01-01

    Two new upwind methods, called the Kinetic Numerical Method (KNM) and the Kinetic Flux Vector Splitting (KFVS) method, for the solution of the Euler equations are presented. Both of these methods can be regarded as suitable moments of an upwind scheme for the solution of the Boltzmann equation, provided the distribution function is Maxwellian. This moment-method strategy leads to a unification of the Riemann approach and the pseudo-particle approach used earlier in the development of upwind methods for the Euler equations. A very important aspect of the moment-method strategy is that the new upwind methods satisfy the entropy condition because of the Boltzmann H-Theorem, and it suggests a possible way of extending the Total Variation Diminishing (TVD) principle within the framework of the H-Theorem. The ability of these methods to obtain accurate, wiggle-free solutions is demonstrated by applying them to two test problems.

  7. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A.; Pachter, Ruth; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAM-B3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAM-B3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  8. Density functional theory based generalized effective fragment potential method.

    PubMed

    Nguyen, Kiet A; Pachter, Ruth; Day, Paul N

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAM-B3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAM-B3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes. PMID:24985612

  9. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  10. A new theory-based social classification in Japan and its validation using historically collected information.

    PubMed

    Hiyoshi, Ayako; Fukuda, Yoshiharu; Shipley, Martin J; Bartley, Mel; Brunner, Eric J

    2013-06-01

    Studies of health inequalities in Japan have increased since the millennium. However, there remains a lack of an accepted theory-based classification to measure occupation-related social position for Japan. This study attempts to derive such a classification based on the National Statistics Socio-economic Classification in the UK. Using routinely collected data from the nationally representative Comprehensive Survey of the Living Conditions of People on Health and Welfare, the Japanese Socioeconomic Classification was derived using two variables - occupational group and employment status. Validation analyses were conducted using household income, home ownership, self-rated good or poor health, and Kessler 6 psychological distress (n ≈ 36,000). After adjustment for age, marital status, and area (prefecture), one step lower social class was associated with a 16% lower mean income (p < 0.001) and a risk ratio of 0.93 (p < 0.001) for home ownership. The probability of good health showed a trend in men and women (risk ratio 0.94 and 0.93, respectively, for one step lower social class, p < 0.001). The trend for poor health was significant in women (odds ratio 1.12, p < 0.001) but not in men. Kessler 6 psychological distress showed significant trends in men (risk ratio 1.03, p = 0.044) and in women (1.05, p = 0.004). We propose the Japanese Socioeconomic Classification, derived from basic occupational and employment status information, as a meaningful, theory-based and standard classification system suitable for monitoring occupation-related health inequalities in Japan. PMID:23631782

  11. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  12. Spectral analysis comparisons of Fourier-theory-based methods and minimum variance (Capon) methods

    NASA Astrophysics Data System (ADS)

    Garbanzo-Salas, Marcial; Hocking, Wayne. K.

    2015-09-01

    In recent years, adaptive (data-dependent) methods have been introduced into many areas where Fourier spectral analysis has traditionally been used. Although the data-dependent methods are often advanced as being superior to Fourier methods, they do require some finesse in choosing the order of the relevant filters. In performing comparisons, we have found some concerns about the mappings, particularly in cases involving many spectral lines or even continuous spectral signals. Using numerical simulations, several comparisons between Fourier transform procedures and the minimum variance method (MVM) have been performed. For multiple-frequency signals, the MVM resolves most of the frequency content only for filters that have more degrees of freedom than the number of distinct spectral lines in the signal. In the case of Gaussian spectral approximation, the MVM will always underestimate the width and can misplace the location of spectral lines in some circumstances. Large filters can be used to improve results with multiple-frequency signals, but are computationally inefficient. Significant biases can occur when using the MVM to study spectral information or echo power from the atmosphere. Artifacts and artificial narrowing of turbulent layers are examples of such impacts.
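
    For readers unfamiliar with the minimum variance method discussed above, a compact sketch is given below: the Capon estimate is P(f) = 1 / (e(f)ᴴ R⁻¹ e(f)), where R is the order-by-order sample autocorrelation matrix and e(f) the steering vector, with the filter order playing the role discussed in the abstract. The order, the diagonal regularization, and the toy two-tone signal are assumptions for illustration only.

        import numpy as np

        def capon_spectrum(x, order, freqs, fs):
            """Minimum variance (Capon) spectral estimate P(f) = 1 / (e^H R^-1 e)."""
            x = np.asarray(x, dtype=float) - np.mean(x)
            n = len(x)
            # biased sample autocorrelation, arranged as a Toeplitz matrix R
            r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order)])
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            Rinv = np.linalg.inv(R + 1e-10 * np.eye(order))   # small diagonal load for stability
            spectrum = []
            for f in freqs:
                e = np.exp(2j * np.pi * f / fs * np.arange(order))   # steering vector
                spectrum.append(1.0 / np.real(np.conj(e) @ Rinv @ e))
            return np.array(spectrum)

        # two closely spaced tones in noise; the filter order must exceed the number of lines
        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        rng = np.random.default_rng(4)
        x = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 112 * t) + 0.5 * rng.normal(size=t.size)
        freqs = np.linspace(50, 150, 201)
        mvm = capon_spectrum(x, order=30, freqs=freqs, fs=fs)
        print("strongest MVM peak near", freqs[np.argmax(mvm)], "Hz")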

  13. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. The objectives of this work were (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures…

  14. Multivariate drought index: An information theory based approach for integrated drought assessment

    NASA Astrophysics Data System (ADS)

    Rajsekhar, Deepthi; Singh, Vijay. P.; Mishra, Ashok. K.

    2015-07-01

    Most of the existing drought indices are based on a single variable (e.g., precipitation) or a combination of two variables (e.g., precipitation and streamflow). This may not be sufficient for reliable quantification of the existing drought condition. A region might be experiencing only a single type of drought at times, but multiple drought types affecting a region simultaneously are quite common too. For a comprehensive representation, it is better to consider all the variables that lead to different physical forms of drought, such as meteorological, hydrological, and agricultural droughts. Therefore, we propose to develop a multivariate drought index (MDI) that utilizes information from hydroclimatic variables, including precipitation, runoff, evapotranspiration and soil moisture as indicator variables, thus accounting for all the physical forms of drought. Entropy theory was utilized to develop the proposed index, which led to the smallest set of features maximally preserving the information of the input data set. MDI was then compared with the Palmer drought severity index (PDSI) for all climate regions within Texas for the time period 1950-2012, with particular attention to the two major drought occurrences in Texas, namely the droughts of 1950-1957 and 2010-2011. The proposed MDI was found to represent drought conditions well, due to its multivariate, multiscalar, and nonlinear properties. To help the user choose the right time scale for further analysis, entropy maps of MDI at different time scales were used as a guideline. The MDI time scale that has the highest entropy value may be chosen, since a higher entropy indicates a higher information content.
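
    The last point of the abstract (picking the MDI time scale with the highest entropy) lends itself to a small illustration. The sketch below aggregates a monthly index over several time scales and reports the Shannon entropy of each aggregated series; the bin count, the scales, and the synthetic index are assumptions, and the actual MDI construction from multiple hydroclimatic variables is not reproduced here.

        import numpy as np

        def shannon_entropy(series, n_bins=10):
            """Shannon entropy (bits) of a continuous series after equal-width binning."""
            counts, _ = np.histogram(series, bins=n_bins)
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        def entropy_by_time_scale(monthly_index, scales=(1, 3, 6, 12)):
            """Aggregate a monthly drought-index series over several time scales and
            report the entropy of each; a higher entropy indicates more information content."""
            result = {}
            for s in scales:
                n = len(monthly_index) // s * s
                aggregated = monthly_index[:n].reshape(-1, s).mean(axis=1)
                result[s] = shannon_entropy(aggregated)
            return result

        # toy monthly index for 1950-2012: seasonal cycle plus noise
        rng = np.random.default_rng(5)
        months = np.arange(63 * 12)
        index = np.sin(2 * np.pi * months / 12) + 0.8 * rng.normal(size=months.size)
        print(entropy_by_time_scale(index))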

  15. An information theory based framework for the measurement of population health.

    PubMed

    Nesson, Erik T; Robinson, Joshua J

    2015-04-01

    This paper proposes a new framework for the measurement of population health and the ranking of the health of different geographies. Since population health is a latent variable, studies which measure and rank the health of different geographies must aggregate observable health attributes into one summary measure. We show that the methods used in nearly all the literature to date implicitly assume that all attributes are infinitely substitutable. Our method, based on the measurement of multidimensional welfare and inequality, minimizes the entropic distance between the summary measure of population health and the distribution of the underlying attributes. This summary function coincides with the constant elasticity of substitution and Cobb-Douglas production functions and naturally allows different assumptions regarding attribute substitutability or complementarity. To compare methodologies, we examine a well-known ranking of the population health of U.S. states, America's Health Rankings. We find that states' rankings are somewhat sensitive to changes in the weight given to each attribute, but very sensitive to changes in aggregation methodology. Our results have broad implications for well-known health rankings such as the 2000 World Health Report, as well as other measurements of population and individual health levels and the measurement and decomposition of health inequality. PMID:25792258
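
    The aggregation functions named in this abstract (constant elasticity of substitution and Cobb-Douglas) are easy to state explicitly. The sketch below evaluates a weighted CES index of normalized health attributes, with the Cobb-Douglas case recovered in the limit; the attribute values, weights, and elasticity parameters are illustrative assumptions, not figures from the paper.

        import numpy as np

        def ces_aggregate(attrs, weights, beta):
            """CES aggregator (sum_k w_k * x_k^beta)^(1/beta) of normalized attributes.
            beta = 1 is the fully substitutable linear index; beta -> 0 recovers the
            Cobb-Douglas (weighted geometric mean) case."""
            attrs = np.asarray(attrs, dtype=float)
            weights = np.asarray(weights, dtype=float)
            if abs(beta) < 1e-9:
                return float(np.exp(np.sum(weights * np.log(attrs))))
            return float(np.sum(weights * attrs ** beta) ** (1.0 / beta))

        # toy "state" with three normalized attributes (higher = healthier) and fixed weights
        attrs = [0.9, 0.4, 0.7]
        weights = [0.4, 0.3, 0.3]
        for beta in (1.0, 0.0, -1.0):
            print("beta =", beta, "->", round(ces_aggregate(attrs, weights, beta), 3))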

  16. A second-order accurate kinetic-theory-based method for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, Suresh M.

    1986-01-01

    An upwind method for the numerical solution of the Euler equations is presented. This method, called the kinetic numerical method (KNM), is based on the fact that the Euler equations are moments of the Boltzmann equation of the kinetic theory of gases when the distribution function is Maxwellian. The KNM consists of two phases, the convection phase and the collision phase. The method is unconditionally stable and explicit. It is highly vectorizable and can be easily made total variation diminishing for the distribution function by a suitable choice of the interpolation strategy. The method is applied to a one-dimensional shock-propagation problem and to a two-dimensional shock-reflection problem.

  17. A hybrid computing approach to accelerating the multiple scattering theory based ab initio methods

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Stocks, G. Malcolm

    2014-03-01

    The multiple scattering theory method, also known as the Korringa-Kohn-Rostoker (KKR) method, is considered an elegant approach to ab initio electronic structure calculations for solids. Its convenience in accessing the one-electron Green function has led to the development of the locally self-consistent multiple scattering (LSMS) method, a linear-scaling ab initio method that allows for the electronic structure calculation of complex structures requiring tens of thousands of atoms in the unit cell. It is one of the few applications that have demonstrated petascale computing capability. In this presentation, we discuss our recent efforts in developing a hybrid computing approach for accelerating the full-potential electronic structure calculation. Specifically, in the framework of our existing LSMS code in FORTRAN 90/95, we exploit the many-core resources of GPGPU accelerators by implementing the compute-intensive functions (for the calculation of multiple scattering matrices and the single-site solutions) in CUDA, and move the computational tasks to the GPGPUs when they are available. We explain in detail our approach to the CUDA programming and the code structure, and show the speed-up of the new hybrid code by comparing its performance on CPU/GPGPU with that on CPU only. The work was supported in part by the Center for Defect Physics, a DOE-BES Energy Frontier Research Center.

  18. Analytic Gradient for Density Functional Theory Based on the Fragment Molecular Orbital Method.

    PubMed

    Brorsen, Kurt R; Zahariev, Federico; Nakata, Hiroya; Fedorov, Dmitri G; Gordon, Mark S

    2014-12-01

    The equations for the response terms for the fragment molecular orbital (FMO) method interfaced with the density functional theory (DFT) gradient are derived and implemented. Compared to the previous FMO-DFT gradient, which lacks response terms, the FMO-DFT analytic gradient has improved accuracy for a variety of functionals when compared to numerical gradients. The FMO-DFT gradient agrees with the fully ab initio DFT gradient in which no fragmentation is performed, while reducing the nonlinear scaling associated with standard DFT. Solving for the response terms requires the solution of the coupled perturbed Kohn-Sham (CPKS) equations, where the CPKS equations are solved through a decoupled Z-vector procedure called the self-consistent Z-vector method. FMO-DFT is a nonvariational method, and the FMO-DFT gradient is unique compared to standard DFT gradients in that it requires terms from both DFT and time-dependent density functional theory (TDDFT). PMID:26583213

  19. Second order Møller-Plesset perturbation theory based upon the fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Fedorov, Dmitri G.; Kitaura, Kazuo

    2004-08-01

    The fragment molecular orbital (FMO) method was combined with the second order Møller-Plesset (MP2) perturbation theory. The accuracy of the method using the 6-31G* basis set was tested on (H2O)n, n=16,32,64; α-helices and β-strands of alanine n-mers, n=10,20,40; as well as on (H2O)n, n=16,32,64 using the 6-31++G** basis set. Relative to the regular MP2 results that could be afforded, the FMO2-MP2 error in the correlation energy did not exceed 0.003 a.u., the error in the correlation energy gradient did not exceed 0.00005 a.u./bohr, and the error in the correlation contribution to the dipole moment did not exceed 0.03 debye. An approximation reducing the computational load based on fragment separation was introduced and tested. The FMO2-MP2 method demonstrated nearly linear scaling and drastically reduced the memory requirements of the regular MP2, making possible calculations with several thousand basis functions using small Pentium clusters. As an example, (H2O)64 with the 6-31++G** basis set (1920 basis functions) can be run in 1 Gbyte of RAM; it took 136 s on a 40-node Pentium 4 cluster.

  20. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and the accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on the gas-kinetic theory is presented. The flux construction strategy may shed some light on the possible modification of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for a non-strictly hyperbolic system.

  1. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes

    SciTech Connect

    Escudero, Daniel; Thiel, Walter

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF₆ complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO₄⁻, Cr(CO)₆, [Fe(CN)₆]⁴⁻, four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  2. Battling the challenges of training nurses to use information systems through theory-based training material design.

    PubMed

    Galani, Malatsi; Yu, Ping; Paas, Fred; Chandler, Paul

    2014-01-01

    The attempts to train nurses to effectively use information systems have had mixed results. One problem is that training materials are not adequately designed to guide trainees to gradually learn to use a system without experiencing a heavy cognitive load. This is because training design often does not take into consideration a learner's cognitive ability to absorb new information in a short training period. Given the high cost and difficulty of organising training in healthcare organisations, there is an urgent need for information system trainers to be aware of how cognitive overload or information overload affect a trainee's capability to acquire new knowledge and skills, and what instructional techniques can be used to facilitate effective learning. This paper introduces the concept of cognitive load and how it affects nurses when learning to use a new health information system. This is followed by the relevant strategies for instructional design, underpinned by the principles of cognitive load theory, which may be helpful for the development of effective instructional materials and activities for training nurses to use information systems. PMID:25087524

  3. Nonlinear gyrokinetic theory based on a new method and computation of the guiding-center orbit in tokamaks

    SciTech Connect

    Xu, Yingfeng; Dai, Zongliang; Wang, Shaojie

    2014-04-15

    The nonlinear gyrokinetic theory in the tokamak configuration based on a two-step transform is developed; in the first step, we transform the magnetic potential perturbation to the Hamiltonian part, and in the second step, we transform away the gyroangle-dependent part of the perturbed Hamiltonian. Then the I-transform method is used to decouple the perturbation part of the motion from the unperturbed motion. The application of the I-transform method to the computation of the guiding-center orbit and the guiding-center distribution function in tokamaks is presented. It is demonstrated that the I-transform orbit computation, which involves integrating only along the unperturbed orbit, agrees with the conventional method, which integrates along the full orbit. A numerical code based on the I-transform method is developed, and two numerical examples are given to verify the new method.

  4. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety.

    PubMed

    Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith

    2013-01-01

    Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the Promoting Action on Research Implementation in Health Services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts. PMID:23498710

  5. Fuzzy theory based control method for an in-pipe robot to move in variable resistance environment

    NASA Astrophysics Data System (ADS)

    Li, Te; Ma, Shugen; Li, Bin; Wang, Minghui; Wang, Yuechao

    2015-11-01

    Most existing screw-drive in-pipe robots cannot actively adjust their maximum traction capacity, which limits their adaptability to the wide range of variable environment resistance, especially in curved pipes. To solve this problem, a screw-drive in-pipe robot based on an adaptive linkage mechanism is proposed. The differential property of the adaptive linkage mechanism allows the robot to move without motion interference in straight and variously curved pipes by self-adaptively adjusting the inclining angles of the rollers. The maximum traction capacity of the robot can be changed by actively adjusting the inclining angles of the rollers. To improve adaptability to variable resistance, a torque control method based on a fuzzy controller is proposed. For variable environment resistance, the proposed control method can not only ensure sufficient traction force but also limit the output torque to a feasible region. In the simulations, the robot with the proposed control method is compared to the robot with fixed roller inclining angles. The results show that the combination of the torque control method and the proposed robot achieves better adaptability to variable resistance in straight and curved pipes.

  6. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Gordon, Mark S.; Kitaura, Kazuo; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  7. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Zahariev, Federico; Schmidt, Michael W; Kitaura, Kazuo; Gordon, Mark S; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented. PMID:25833559

  8. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Kitaura, Kazuo; Gordon, Mark S.; Nakamura, Shinichiro

    2015-03-01

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  9. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  10. NbIT--a new information theory-based analysis of allosteric mechanisms reveals residues that underlie function in the leucine transporter LeuT.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2014-05-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale, in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery has proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i) channels for long-distance information sharing between functional sites, and (ii) coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is also shown to differentiate residues involved primarily in stabilizing the functional sites from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005
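
    The core quantity behind an analysis of this kind is the mutual information between the configurational fluctuations of different parts of the molecule. As a much-reduced stand-in for the NbIT machinery, the sketch below estimates the mutual information between two torsion-angle time series from a trajectory using a 2-D histogram; the bin count and the synthetic "trajectories" are assumptions, and none of the NbIT channel or coordinator measures are implemented.

        import numpy as np

        def mutual_information(a, b, n_bins=24):
            """Histogram estimate of the mutual information (bits) between two angle series."""
            joint, _, _ = np.histogram2d(a, b, bins=n_bins)
            p = joint / joint.sum()
            px = p.sum(axis=1, keepdims=True)
            py = p.sum(axis=0, keepdims=True)
            nz = p > 0
            return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

        # toy trajectories: torsion b partially tracks torsion a, torsion c is independent
        rng = np.random.default_rng(7)
        a = rng.uniform(-np.pi, np.pi, size=20000)
        b = a + 0.5 * rng.normal(size=a.size)
        c = rng.uniform(-np.pi, np.pi, size=a.size)
        print("coupled pair  :", round(mutual_information(a, b), 3), "bits")
        print("uncoupled pair:", round(mutual_information(a, c), 3), "bits")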

  11. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.

  12. Methods of Organizational Information Security

    NASA Astrophysics Data System (ADS)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the organizational level. Some of the principal problems are identified, and a first group of relevant dimensions is presented for efficient management of information security. The study is based on a literature review using some of the most relevant articles on this theme, international reports, and the principal standards for information security management. From these readings, we identified methods oriented toward risk management, certification standards, and good practices for information security. Some of the standards are oriented toward certification of the product or system, and others toward the business processes. There are also studies proposing frameworks that suggest the integration of different approaches, building on standards focused on technologies and processes and taking into consideration the organizational and human environment of the organizations. In our perspective, the biggest contribution to the security of information is the development of an information security method for an organization in a conflicting environment. Such a method should provide information security against the possible dimensions of attack that threats could exploit through the vulnerabilities of organizational assets. It should also support the concepts of "network-centric warfare", "information superiority" and "information warfare" developed especially in the last decade, where information is seen simultaneously as a weapon and as a target.

  13. Derivation of a measure of systolic blood pressure mutability: a novel information theory-based metric from ambulatory blood pressure tests.

    PubMed

    Contreras, Danitza J; Vogel, Eugenio E; Saravia, Gonzalo; Stockins, Benjamin

    2016-03-01

    We provide ambulatory blood pressure (BP) exams with tools based on information theory to quantify fluctuations, thus increasing the capture of dynamic test components. Data from 515 ambulatory 24-hour BP exams were considered. Average age was 54 years, 54% were women, and 53% were under BP treatment. The average systolic pressure (SP) was 127 ± 8 mm Hg. A data compressor (wlzip) designed to recognize meaningful information is invoked to measure mutability, which is a form of dynamical variability. For patients with the same average SP, different mutability values are obtained, reflecting differences in dynamical variability. In unadjusted linear regression models, mutability had a low association with mean systolic BP (R² = 0.056; p < 0.000001) but a larger association with the SP deviation (R² = 0.761; p < 0.001). Wlzip allows detecting levels of variability in SP that could be hazardous. This new indicator can be easily added to 24-hour BP monitors, improving the information available for diagnosis. PMID:26965751
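
    The mutability idea above rests on how well a compressor designed to recognize repeated structure can shrink the recorded series. The wlzip compressor itself is not a standard public tool, so the sketch below substitutes zlib purely to illustrate the principle: two systolic-pressure profiles with the same mean but different dynamics compress differently. The sampling interval, rounding precision, and synthetic profiles are assumptions, and the resulting numbers are not comparable to the paper's metric.

        import zlib
        import numpy as np

        def compression_variability(readings, precision=0):
            """Crude proxy for mutability: ratio of compressed to raw size of the reading
            sequence; more repetition (less variability) compresses better."""
            text = ",".join(f"{r:.{precision}f}" for r in readings).encode()
            return len(zlib.compress(text, level=9)) / len(text)

        # two synthetic 24-hour SP profiles (readings every 15 minutes) with the same mean
        rng = np.random.default_rng(6)
        steady = 127 + rng.normal(0, 2, size=96)
        labile = 127 + rng.normal(0, 12, size=96)
        print("steady profile:", round(compression_variability(steady), 3))
        print("labile profile:", round(compression_variability(labile), 3))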

  14. Information storage media and method

    SciTech Connect

    Miller, S.D.; Endres, G.W.

    1999-09-28

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  15. Information storage media and method

    DOEpatents

    Miller, Steven D.; Endres, George W.

    1999-01-01

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  16. Levels of Reconstruction as Complementarity in Mixed Methods Research: A Social Theory-Based Conceptual Framework for Integrating Qualitative and Quantitative Research

    PubMed Central

    Carroll, Linda J.; Rothe, J. Peter

    2010-01-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions. PMID:20948937

  17. High-resolution wave-theory-based ultrasound reflection imaging using the split-step Fourier and globally optimized Fourier finite-difference methods

    SciTech Connect

    Huang, Lianjie

    2013-10-29

    Methods for enhancing ultrasonic reflection imaging are taught utilizing a split-step Fourier propagator in which the reconstruction is based on recursive inward continuation of ultrasonic wavefields in the frequency-space and frequency-wave number domains. The inward continuation within each extrapolation interval consists of two steps. In the first step, a phase-shift term is applied to the data in the frequency-wave number domain for propagation in a reference medium. The second step consists of applying another phase-shift term to data in the frequency-space domain to approximately compensate for ultrasonic scattering effects of heterogeneities within the tissue being imaged (e.g., breast tissue). Results from various data inputs indicate that the method provides significant improvements in both image quality and resolution.

  18. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying system exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  19. Discovering the Optimal Route for Alane Synthesis on Ti-doped Al Surfaces Using Density Functional Theory Based Kinetic Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Karim, Altaf; Muckerman, James T.

    2010-03-01

    Issues such as catalytic dissociation of hydrogen and the mobility of alane species on Ti-doped Al surfaces are major challenges in the synthesis of aluminum hydride. Our recently developed modeling framework (DFT-based KMC simulation) enabled us to study the steady-state conditions of dissociative adsorption of hydrogen, its diffusion, and its reaction with Al adatoms leading to the formation of alane species on Ti-doped Al surfaces. Our studies show that the doping of Ti atoms in the top layer of Al surfaces significantly reduces the mobility of alane species. On the other hand, the doping of Ti atoms beneath the top layer of Al surfaces enhances the mobility of alane species. The arrangement of dopant Ti atoms in different layers not only affects the diffusion barriers of alane species but it also affects hydrogen dissociation barriers when Ti-Ti pairs are arranged in different ways in the top layer. Using our theoretical methods, we identified a few configurations of dopant Ti atoms having lower barriers for alane diffusion and hydrogen dissociation. Further, we discovered the optimal values of Ti concentration, temperature, and pressure under which the rate of alane formation is maximized.

  20. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  1. Theory-Based Evaluation: Reflections Ten Years On. Theory-Based Evaluation: Past, Present, and Future

    ERIC Educational Resources Information Center

    Rogers, Patricia J.; Weiss, Carol H.

    2007-01-01

    This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly…

  2. Information technology equipment cooling method

    DOEpatents

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  3. Multiple outcome measures and mixed methods for evaluating the effectiveness of theory-based behaviour-change interventions: a case study targeting health professionals' adoption of a national suicide prevention guideline.

    PubMed

    Hanbury, A; Wallace, L M; Clark, M

    2011-05-01

    Interest in behaviour-change interventions targeting health professionals' adoption of clinical guidelines is growing. Recommendations have been made for interventions to have a theoretical base, to explore the local context, and to use mixed and multiple methods of evaluation to establish intervention effectiveness. This article presents a case study of a behaviour-change intervention delivered to community mental health professionals in one Primary Care Trust, aimed at raising adherence to a national suicide prevention guideline. A discussion is provided of how the theory base was selected, how the local context was explored, and how the intervention was developed and delivered. Time series analysis, mediational analysis and qualitative process evaluation were used to evaluate and explore intervention effectiveness. The time series analysis revealed that the intervention was not effective at increasing adherence to the guideline. The mediational analysis indicates that the intervention failed to successfully target the key barrier to adoption of the guidance, and the qualitative process evaluation identified certain intervention components that were well received by the health professionals, as well as weaknesses in the delivery of the intervention. It is recommended that future research should seek to further develop the evidence base for linking specific intervention strategies to specific behavioural barriers, explore the potential of theories that take into account broader social and organisational factors that influence health professionals' practice, and focus on the process of data synthesis for identifying key factors to target with tailored interventions. Multiple and mixed evaluation techniques are recommended not only to explore whether an intervention is effective but also why it is or is not effective. PMID:21491337

  4. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from content to behavior outcomes. This article describes the development process of a diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied an information-motivation-behavioral skills (IMB) model, and interventions targeted 3 determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed in each session to achieve the behavior objectives. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol to develop a theory-based self-management program was beneficial in that it provided a systematic guide to developing theory-based, behavior outcome-focused health education programs. PMID:26062288

  5. Information theoretic methods for image processing algorithm optimization

    NASA Astrophysics Data System (ADS)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex and optimal results barely achievable through manual calibration; thus an automated approach is a must. We discuss an information theory based metric for evaluating an algorithm's adaptive characteristics (an "adaptivity criterion"), using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it measures physical "information restoration" rather than perceived image quality, it helps reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used to assess the whole imaging system (sensor plus post-processing).
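
    The record above does not reproduce the paper's actual "adaptivity criterion", so the following is only a generic sketch of an information-restoration measure of the kind described: the mutual information, estimated from a joint histogram, between a clean reference image and a filter's output. All function names, parameter values, and the toy "filters" are illustrative assumptions, not the authors' method.

    ```python
    # Illustrative sketch only: the paper does not give its exact "adaptivity criterion",
    # so this shows one generic information-restoration measure -- the mutual information
    # between a clean reference image and a filtered noisy image, estimated from a joint
    # histogram. Function and variable names are hypothetical.
    import numpy as np

    def mutual_information(ref, out, bins=64):
        """Estimate I(ref; out) in bits from a joint histogram of two images."""
        joint, _, _ = np.histogram2d(ref.ravel(), out.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        clean = rng.normal(size=(128, 128))
        noisy = clean + 0.8 * rng.normal(size=clean.shape)
        # A "stronger" filter here is simply heavier smoothing along one axis (toy stand-in).
        weak = 0.5 * (noisy + np.roll(noisy, 1, axis=0))
        strong = sum(np.roll(noisy, k, axis=0) for k in range(5)) / 5.0
        for name, img in [("noisy", noisy), ("weak filter", weak), ("strong filter", strong)]:
            print(f"{name:13s} I(clean; output) = {mutual_information(clean, img):.3f} bits")
    ```

    In an automated calibration loop of the sort the abstract advocates, a metric like this would serve as the cost function to be maximized over the filter's tunable parameters.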

  6. Theory-Based Programme Development and Evaluation in Physiotherapy

    PubMed Central

    Kay, Theresa; Klinck, Beth

    2008-01-01

    ABSTRACT Purpose: Programme evaluation has been defined as “the systematic process of collecting credible information for timely decision making about a particular program.” Where possible, findings are used to develop, revise, and improve programmes. Theory-based programme development and evaluation provides a comprehensive approach to programme evaluation. Summary of key points: In order to obtain meaningful information from evaluation activities, relevant programme components need to be understood. Theory-based programme development and evaluation starts with a comprehensive description of the programme. A useful tool to describe a programme is the Sidani and Braden Model of Program Theory, consisting of six programme components: problem definition, critical inputs, mediating factors, expected outcomes, extraneous factors, and implementation issues. Articulation of these key components may guide physiotherapy programme implementation and delivery and assist in the development of key evaluation questions and methodologies. Using this approach leads to a better understanding of client needs, programme processes, and programme outcomes and can help to identify barriers to and enablers of successful implementation. Two specific examples, representing public and private sectors, will illustrate the application of this approach to clinical practice. Conclusions: Theory-based programme development helps clinicians, administrators, and researchers develop an understanding of who benefits the most from which types of programmes and facilitates the implementation of processes to improve programmes. PMID:20145741

  7. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN]; Elmore, Mark Thomas [Oak Ridge, TN]; Reed, Joel Wesley [Knoxville, TN]; Treadwell, Jim N; Samatova, Nagiza Faridovna [Oak Ridge, TN]

    2008-01-01

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.
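
    As a rough illustration of the similarity-linked tree display this patent describes (and not the patented algorithm itself), the sketch below links each document to its most similar predecessor using bag-of-words cosine similarity; the documents and the greedy linking rule are invented for the example.

    ```python
    # Minimal sketch, not the patented algorithm: build a similarity-linked tree of
    # documents from bag-of-words cosine similarity. All documents are made-up examples.
    import math
    from collections import Counter

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    docs = {
        "d1": "solar power grid storage",
        "d2": "wind power grid integration",
        "d3": "protein folding simulation",
        "d4": "grid scale battery storage",
    }
    vecs = {k: Counter(v.split()) for k, v in docs.items()}

    # Greedy tree: attach each new node to the most similar node already in the tree.
    order = list(docs)
    edges = []
    for i, d in enumerate(order[1:], start=1):
        parent = max(order[:i], key=lambda p: cosine(vecs[p], vecs[d]))
        edges.append((parent, d, round(cosine(vecs[parent], vecs[d]), 2)))

    for parent, child, sim in edges:
        print(f"{parent} -> {child} (similarity {sim})")
    ```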

  8. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  9. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  10. Control Theory based Shape Design for the Incompressible Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Cowles, G.; Martinelli, L.

    2003-12-01

    A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.
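
    The key cost argument in this abstract, that one extra adjoint solve yields the entire gradient however many design parameters there are, can be illustrated on a toy linear-algebra problem rather than the authors' Navier-Stokes solver. In the hypothetical sketch below, a cost J(p) = c^T u(p) is constrained by A(p) u = b, and the single adjoint solve A^T lambda = c gives dJ/dp_i = -lambda^T (dA/dp_i) u for every parameter at once.

    ```python
    # Toy analogue of adjoint-based gradients (not the paper's flow solver): for a cost
    # J(p) = c^T u(p) constrained by A(p) u = b, one adjoint solve A^T lam = c gives the
    # whole gradient dJ/dp_i = -lam^T (dA/dp_i) u, regardless of how many parameters p has.
    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 6, 4                     # state size, number of design parameters
    A0 = np.eye(n) * 3.0
    dA = [rng.normal(size=(n, n)) * 0.1 for _ in range(m)]   # dA/dp_i (constant here)
    b, c = rng.normal(size=n), rng.normal(size=n)

    def assemble(p):
        return A0 + sum(pi * dAi for pi, dAi in zip(p, dA))

    def cost(p):
        return c @ np.linalg.solve(assemble(p), b)

    p = rng.normal(size=m)
    u = np.linalg.solve(assemble(p), b)         # one "flow" solve
    lam = np.linalg.solve(assemble(p).T, c)     # one adjoint solve
    grad_adjoint = np.array([-lam @ dAi @ u for dAi in dA])

    # Finite differences need one extra solve per parameter and only approximate the gradient.
    eps = 1e-6
    grad_fd = np.array([(cost(p + eps * e) - cost(p)) / eps for e in np.eye(m)])
    print(np.allclose(grad_adjoint, grad_fd, atol=1e-4))
    ```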

  11. Research Investigation of Information Access Methods

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived satisfaction ratings are used to determine each user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  12. Scenistic Methods for Training: Applications and Practice

    ERIC Educational Resources Information Center

    Lyons, Paul R.

    2011-01-01

    Purpose: This paper aims to complement an earlier article (2010) in "Journal of European Industrial Training" in which the description and theory bases of scenistic methods were presented. This paper also offers a description of scenistic methods and information on theory bases. However, the main thrust of this paper is to describe, give suggested…

  13. Theory-based telehealth and patient empowerment.

    PubMed

    Suter, Paula; Suter, W Newton; Johnston, Donna

    2011-04-01

    Health care technology holds great potential to improve the quality of health care delivery. One effective technology is remote patient monitoring, whereby patient data, such as vital signs or symptom reports, are captured from home monitoring devices and transmitted to health care professionals for review. The use of remote patient monitoring, often referred to as telehealth, has been widely adopted by health care providers, particularly home care agencies. Most agencies have invested in telehealth to facilitate the early identification of disease exacerbation, particularly for patients with chronic diseases such as heart failure and diabetes. This technology has been successfully harnessed by agencies to reduce rehospitalization rates through remote data interpretation and the provision of timely interventions. We propose that the use of telehealth by home care agencies and other health care providers be expanded to empower patients and promote disease self-management with resultant improved health care outcomes. This article describes how remote monitoring, in combination with the application of salient adult learning and cognitive behavioral theories and applied to telehealth care delivery and practice, can promote improved patient self-efficacy with disease management. We present theories applicable for improving health-related behaviors and illustrate how theory-based practices can be implemented in the field of home care. Home care teams that deliver theory-based telehealth function as valuable partners to physicians and hospitals in an integrated health care delivery system. PMID:21241182

  14. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Ingber, Donald E. (Inventor); Huang, Sui (Inventor); Eichler, Gabriel (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.

  15. Information Work Analysis: An Approach to Research on Information Interactions and Information Behaviour in Context

    ERIC Educational Resources Information Center

    Huvila, Isto

    2008-01-01

    Introduction: A work roles and role theory-based approach to conceptualising human information activity, denoted information work analysis, is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…

  16. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  17. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

    2014-09-01

    Due to the sparse multipath property of the channel, a channel estimation method based on a partial superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of the orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves the spectral efficiency but also reduces the influence on the information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
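
    The sparse-recovery step can be illustrated with a generic orthogonal matching pursuit reconstruction, shown below under simplifying assumptions: the measurement matrix is random rather than built from the paper's superimposed training sequence, and the channel length, tap count, and noise level are made up for the example.

    ```python
    # Minimal compressed-sensing sketch (not the paper's exact scheme): recover a sparse
    # channel impulse response from a small number of linear measurements with orthogonal
    # matching pursuit. The measurement matrix is random here; in the paper it is built
    # from the superimposed training sequence.
    import numpy as np

    def omp(Phi, y, sparsity):
        """Orthogonal matching pursuit: greedy sparse recovery of h from y = Phi @ h."""
        residual, support = y.copy(), []
        h_hat = np.zeros(Phi.shape[1], dtype=complex)
        for _ in range(sparsity):
            support.append(int(np.argmax(np.abs(Phi.conj().T @ residual))))
            coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coeffs
        h_hat[support] = coeffs
        return h_hat

    rng = np.random.default_rng(3)
    L, M, K = 64, 20, 3                      # channel length, measurements, nonzero taps
    h = np.zeros(L, dtype=complex)
    h[rng.choice(L, K, replace=False)] = rng.normal(size=K) + 1j * rng.normal(size=K)
    Phi = (rng.normal(size=(M, L)) + 1j * rng.normal(size=(M, L))) / np.sqrt(2 * M)
    y = Phi @ h + 0.01 * (rng.normal(size=M) + 1j * rng.normal(size=M))

    h_hat = omp(Phi, y, K)
    print("relative error:", np.linalg.norm(h_hat - h) / np.linalg.norm(h))
    ```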

  18. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  19. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems

    SciTech Connect

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-14

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  20. Advanced Feedback Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1985-01-01

    In this study, automatic feedback techniques are applied to Boolean query statements in online information retrieval to generate improved query statements based on information contained in previously retrieved documents. Feedback operations are carried out using conventional Boolean logic and extended logic. Experimental output is included to…

  1. Theory-based categorization under speeded conditions

    PubMed Central

    Luhmann, Christian C.; Ahn, Woo-Kyoung; Palmeri, Thomas J.

    2009-01-01

    It is widely accepted that similarity influences rapid categorization, whereas theories can influence only more leisurely category judgments. In contrast, we argue that it is not the type of knowledge used that determines categorization speed, but rather the complexity of the categorization processes. In two experiments, participants learned four categories of items, each consisting of three causally related features. Participants gave more weight to cause features than to effect features, even under speeded response conditions. Furthermore, the time required to make judgments was equivalent, regardless of whether participants were using causal knowledge or base-rate information. We argue that both causal knowledge and base-rate information, once precompiled during learning, can be used at roughly the same speeds during categorization, thus demonstrating an important parallel between these two types of knowledge. PMID:17128608

  2. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing the quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms. PMID:24283669

  3. Fast multiple alignment of ungapped DNA sequences using information theory and a relaxation method.

    PubMed

    Schneider, Thomas D; Mastronarde, David N

    1996-12-01

    An information theory based multiple alignment ("Malign") method was used to align the DNA binding sequences of the OxyR and Fis proteins, whose sequence conservation is so spread out that it is difficult to identify the sites. In the algorithm described here, the information content of the sequences is used as a unique global criterion for the quality of the alignment. The algorithm uses look-up tables to avoid recalculating computationally expensive functions such as the logarithm. Because there are no arbitrary constants and because the results are reported in absolute units (bits), the best alignment can be chosen without ambiguity. Starting from randomly selected alignments, a hill-climbing algorithm can track through the immense space of n^s combinations, where s is the number of sequences and n is the number of positions possible for each sequence. Instead of producing a single alignment, the algorithm is fast enough that one can afford to use many start points and to classify the solutions. Good convergence is indicated by the presence of a single well-populated solution class having higher information content than other classes. The existence of several distinct classes for the Fis protein indicates that those binding sites have self-similar features. PMID:19953199
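
    The scoring idea can be sketched on toy data: for DNA, a gapless alignment window's information content is the sum over columns of 2 - H(column) bits, and a hill-climber perturbs the per-sequence positions to increase it. The sequences, window width, and the simple single-offset move below are illustrative assumptions; this is not the Malign program.

    ```python
    # Toy sketch of the scoring idea (not the Malign program itself): for DNA, the
    # information content of a gapless alignment window is the sum over columns of
    # 2 - H(column) bits. A hill-climber perturbs per-sequence offsets to increase it.
    import math, random

    def info_content(seqs, offsets, width):
        total = 0.0
        for col in range(width):
            counts = {}
            for s, off in zip(seqs, offsets):
                base = s[off + col]
                counts[base] = counts.get(base, 0) + 1
            n = len(seqs)
            h = -sum(c / n * math.log2(c / n) for c in counts.values())
            total += 2.0 - h                      # 2 bits is the maximum for DNA
        return total

    random.seed(0)
    seqs = ["ACGTACGTTTGACTGACT", "GGACGTACGTTTGAAAAC", "TTTTACGTACGTTTGACA"]
    width = 8
    offsets = [random.randrange(len(s) - width + 1) for s in seqs]
    best = info_content(seqs, offsets, width)

    for _ in range(2000):                          # simple hill climbing on the offsets
        i = random.randrange(len(seqs))
        trial = offsets[:]
        trial[i] = random.randrange(len(seqs[i]) - width + 1)
        score = info_content(seqs, trial, width)
        if score > best:
            offsets, best = trial, score

    print("best offsets:", offsets, "information content: %.2f bits" % best)
    ```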

  4. Governance Methods Used in Externalizing Information Technology

    ERIC Educational Resources Information Center

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  5. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website

    PubMed Central

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O’Riordan, Tim; White, Peter; Yardley, Lucy

    2016-01-01

    Background According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. Objective We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Methods Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative ‘think aloud’ study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. Results The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients’ stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants’ experiences of using the website. Conclusions We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials

  6. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Methods of providing information. 1640.6 Section 1640.6 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD PERIODIC PARTICIPANT STATEMENTS § 1640.6 Methods of providing information. The TSP will furnish the information described in...

  7. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 3 2012-01-01 2012-01-01 false Methods of providing information. 1640.6 Section 1640.6 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD PERIODIC PARTICIPANT STATEMENTS § 1640.6 Methods of providing information. The TSP will furnish the information described in...

  8. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multi dimensional scale to measure twelve step program affiliation. The process included interviewing fourteen addicted persons while in twelve step focused treatment about specific pros (things they like or would miss out on by not being…

  9. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  10. Discourse and Practice in Information Literacy and Information Seeking: Gaps and Opportunities

    ERIC Educational Resources Information Center

    Julien, H.; Williamson, K.

    2010-01-01

    Introduction: This paper argues for increased research consideration of the conceptual overlap between information seeking and information literacy, and for scholarly attention to theory-based empirical research that has potential value to practitioners. Method: The paper reviews information seeking and information literacy research, and…

  11. Application of geo-information science methods in ecotourism exploitation

    NASA Astrophysics Data System (ADS)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    Application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resources reconnaissance, data management, environment monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data, and the application of 3S methods is helpful to sustainable development in tourism. Various assignments are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, handling of mass data, tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  12. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    ERIC Educational Resources Information Center

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  13. Axiomatic Evaluation Method and Content Structure for Information Appliances

    ERIC Educational Resources Information Center

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information needs to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  14. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  15. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Methods of disseminating information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 2905.101 Methods of...

  16. Improving breast cancer control among Latinas: evaluation of a theory-based educational program.

    PubMed

    Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A

    1998-10-01

    The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of square [SS] = 17.0, F = 6.58, p < .01), have less medically recognized knowledge (SS = 128.8, F = 39.24, p < .001), greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384

  17. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.

  18. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  19. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  20. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  1. 48 CFR 5.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Methods of disseminating... ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 5.101 Methods of disseminating... various methods of satisfying the requirements of 5.207(c). For example, the contracting officer may...

  2. The use of density functional theory-based reactivity descriptors in molecular similarity calculations

    NASA Astrophysics Data System (ADS)

    Boon, Greet; De Proft, Frank; Langenaeker, Wilfried; Geerlings, Paul

    1998-10-01

    Molecular similarity is studied via density functional theory-based similarity indices using a numerical integration method. Complementary to the existing similarity indices, we introduce a reactivity-related similarity index based on the local softness. After a study of some test systems, a series of peptide isosteres is studied in view of their importance in pharmacology. The whole of the present work illustrates the importance of the study of molecular similarity based on both shape and reactivity.
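
    The abstract does not give its similarity expressions, so the sketch below only shows a generic Carbo-type index, Z_AB / sqrt(Z_AA * Z_BB), evaluated by straightforward numerical integration on a grid; the one-dimensional Gaussians merely stand in for density- or local-softness-like functions and are not the paper's data.

    ```python
    # Hedged sketch: a Carbo-type similarity index Z_AB / sqrt(Z_AA * Z_BB) evaluated by
    # simple numerical integration on a grid. The 1-D Gaussians below merely stand in for
    # density- or local-softness-like functions; this is not the paper's actual workflow.
    import numpy as np

    def overlap(fa, fb, dx):
        return np.sum(fa * fb) * dx            # rectangle-rule integration

    def similarity(fa, fb, dx):
        return overlap(fa, fb, dx) / np.sqrt(overlap(fa, fa, dx) * overlap(fb, fb, dx))

    x, dx = np.linspace(-6, 6, 2001, retstep=True)
    gauss = lambda mu, s: np.exp(-(x - mu) ** 2 / (2 * s ** 2))
    print(similarity(gauss(0.0, 1.0), gauss(0.3, 1.1), dx))   # near 1 for similar profiles
    print(similarity(gauss(0.0, 1.0), gauss(3.0, 0.5), dx))   # smaller for dissimilar ones
    ```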

  3. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronics products centers on information processing in knowledge-intensive engineering, so product design innovation is essentially knowledge and information processing innovation. Based on an analysis of the role of mechatronics product design knowledge and its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.

  4. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge of modern mechatronics products centers on information processing in knowledge-intensive engineering, so product design innovation is essentially knowledge and information processing innovation. Based on an analysis of the role of mechatronics product design knowledge and its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
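
    A minimal sketch of what such an XML product-information model could look like is given below, built with Python's standard library; every element and attribute name (functions, structures, mappings, and so on) is invented for illustration rather than taken from the paper.

    ```python
    # Minimal sketch of an XML product-information model of the kind the abstract
    # describes; all element and attribute names here are invented for illustration.
    import xml.etree.ElementTree as ET

    product = ET.Element("product", name="parallel_friction_roller")

    functions = ET.SubElement(product, "functions")
    ET.SubElement(functions, "function", id="F1", desc="transmit torque")
    ET.SubElement(functions, "function", id="F2", desc="reduce slip")

    structures = ET.SubElement(product, "structures")
    ET.SubElement(structures, "structure", id="S1", desc="drive roller")
    ET.SubElement(structures, "structure", id="S2", desc="friction lining")

    mappings = ET.SubElement(product, "mappings")
    ET.SubElement(mappings, "map", function="F1", structure="S1")
    ET.SubElement(mappings, "map", function="F2", structure="S2")

    ET.indent(product)                      # pretty-print (Python 3.9+)
    print(ET.tostring(product, encoding="unicode"))
    ```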

  5. Self-informant Agreement for Personality and Evaluative Person Descriptors: Comparing Methods for Creating Informant Measures

    PubMed Central

    Simms, Leonard J.; Zelazny, Kerry; Yam, Wern How; Gros, Daniel F.

    2011-01-01

    Little attention typically is paid to the way self-report measures are translated for use in self-informant agreement studies. We studied two possible methods for creating informant measures: (a) the traditional method in which self-report items were translated from the first- to the third-person and (b) an alternative meta-perceptual method in which informants were directed to rate their perception of the targets’ self-perception. We hypothesized that the latter method would yield stronger self-informant agreement for evaluative personality dimensions measured by indirect item markers. We studied these methods in a sample of 303 undergraduate friendship dyads. Results revealed mean-level differences between methods, similar self-informant agreement across methods, stronger agreement for Big Five dimensions than for evaluative dimensions, and incremental validity for meta-perceptual informant rating methods. Limited power reduced the interpretability of several sparse acquaintanceship effects. We conclude that traditional informant methods are appropriate for most personality traits, but meta-perceptual methods may be more appropriate when personality questionnaire items reflect indirect indicators of the trait being measured, which is particularly likely for evaluative traits. PMID:21541262

  6. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection

    PubMed Central

    Aas, I. H. Monrad

    2014-01-01

    Introduction: Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. Methods: A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Results: Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview – unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants – as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Conclusions: Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these

  7. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Methods of disseminating information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION... dissemination of information concerning procurement actions. The Division of Acquisition Management...

  8. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  9. Information theory in living systems, methods, applications, and challenges.

    PubMed

    Gatenby, Robert A; Frieden, B Roy

    2007-02-01

    Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium. This is despite constant buffeting by thermodynamic forces that, if unopposed, will inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information that permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow and utilization is critical for understanding first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatic and dynamical systems methods to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher, Shannon, and Kullback-Leibler information measures are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analysis of the fitness contributions from specific segments of genetic information, as well as of the optimization process in which fitness is constrained by the substrate cost of its storage and utilization. Recent IT applications have recognized the possible role of nontraditional information storage structures, including lipids and ion gradients, as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease. PMID:17083004
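
    As a minimal example of the Shannon-style measures the review mentions, the snippet below computes the entropy, in bits per symbol, of a string of monomers; the DNA fragments are made up for illustration.

    ```python
    # Minimal example of the kind of measure the review mentions: Shannon entropy, in
    # bits per symbol, of a string of monomers (here a made-up DNA fragment).
    import math
    from collections import Counter

    def shannon_entropy(seq: str) -> float:
        counts = Counter(seq)
        n = len(seq)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    print(shannon_entropy("ACGTACGTACGT"))   # 2.0 bits: all four bases equally frequent
    print(shannon_entropy("AAAAAAAAAAAT"))   # ~0.41 bits: highly biased composition
    ```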

  10. An Extraction Method of an Informative DOM Node from a Web Page by Using Layout Information

    NASA Astrophysics Data System (ADS)

    Tsuruta, Masanobu; Masuyama, Shigeru

    We propose a method for extracting the informative DOM node from a Web page as preprocessing for Web content mining. Our proposed method, LM, uses layout data of DOM nodes generated by a generic Web browser, and its learning set consists of hundreds of Web pages together with annotations of the informative DOM nodes of those pages. Our method does not require large-scale crawling of the whole Web site to which the target Web page belongs. We design LM so that it uses the information in the learning set more efficiently than the existing method that uses the same learning set. In experiments, we evaluate methods obtained by combining an informative-DOM-node extraction method (either the proposed method or an existing one) with the existing noise elimination methods: Heur, which removes advertisements and link lists using heuristics, and CE, which removes DOM nodes that also occur in other Web pages of the same Web site as the target page. Experimental results show that 1) LM outperforms the other methods for extracting the informative DOM node, and 2) the combination method (LM, {CE(10), Heur}) based on LM (precision: 0.755, recall: 0.826, F-measure: 0.746) outperforms the other combination methods.

  11. A Method of Integrated Description of Design Information for Reusability

    NASA Astrophysics Data System (ADS)

    Tsumaya, Akira; Nagae, Masao; Wakamatsu, Hidefumi; Shirase, Keiichi; Arai, Eiji

    Much of product design is executed concurrently these days. For such concurrent design, a method that can share and reuse various kinds of design information among designers is needed. However, complete understanding of design information among designers has been a difficult issue. In this paper, a design process model that makes use of designers' intention is proposed, along with a method to combine design process information and design object information. We introduce how to describe designers' intention by providing several databases: the Keyword Database consists of ontological data related to design objects and activities, and designers select suitable keyword(s) from it and explain the reasons and ideas behind their design activities in descriptions that use those keyword(s). We also developed an integrated design information management system architecture that uses this method of integrated description with designers' intention. The system realizes connections between information related to the design process and information related to the design object through designers' intention, so that designers can communicate with each other and understand how others make decisions in design. Designers can also reuse both design process information and design object information through the database management sub-system.

  12. A Method to Separate Stochastic and Deterministic Information from Electrocardiograms

    NASA Astrophysics Data System (ADS)

    Gutiérrez, R. M.; Sandoval, L. A.

    2005-01-01

    In this work we present a new idea to develop a method to separate stochastic and deterministic information contained in an electrocardiogram, ECG, which may provide new sources of information with diagnostic purposes. We assume that the ECG has information corresponding to many different processes related with the cardiac activity as well as contamination from different sources related with the measurement procedure and the nature of the observed system itself. The method starts with the application of an improved archetypal analysis to separate the mentioned stochastic and deterministic information. From the stochastic point of view we analyze Renyi entropies, and with respect to the deterministic perspective we calculate the autocorrelation function and the corresponding correlation time. We show that healthy and pathologic information may be stochastic and/or deterministic, can be identified by different measures and located in different parts of the ECG.
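
    The two quantities named in the abstract can be sketched on a synthetic signal rather than a real ECG: a Renyi entropy of the amplitude distribution and a correlation time read off the autocorrelation function (here taken as the first lag falling below 1/e). The bin count, order alpha, and test signal are all assumptions of the example, not the authors' procedure.

    ```python
    # Sketch of the two measures named in the abstract, applied to a synthetic signal
    # rather than a real ECG: a Renyi entropy of the amplitude distribution and a
    # correlation time taken from the autocorrelation function.
    import numpy as np

    def renyi_entropy(x, alpha=2.0, bins=32):
        p, _ = np.histogram(x, bins=bins)
        p = p[p > 0] / p.sum()
        if alpha == 1.0:
            return float(-np.sum(p * np.log2(p)))            # Shannon limit
        return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

    def correlation_time(x):
        x = x - x.mean()
        acf = np.correlate(x, x, mode="full")[len(x) - 1:]
        acf = acf / acf[0]
        below = np.where(acf < 1.0 / np.e)[0]
        return int(below[0]) if below.size else len(x)

    rng = np.random.default_rng(5)
    t = np.arange(2000)
    signal = np.sin(2 * np.pi * t / 100) + 0.5 * rng.normal(size=t.size)
    print("Renyi entropy (alpha=2):", round(renyi_entropy(signal), 3), "bits")
    print("correlation time:", correlation_time(signal), "samples")
    ```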

  13. A theory-based approach to thermal field-flow fractionation of polyacrylates.

    PubMed

    Runyon, J Ray; Williams, S Kim Ratanathanawongs

    2011-09-28

    A theory-based approach is presented for the development of thermal field-flow fractionation (ThFFF) of polyacrylates. The use of ThFFF for polymer analysis has been limited by an incomplete understanding of the thermal diffusion which plays an important role in retention and separation. Hence, a tedious trial-and-error approach to method development has been the normal practice when analyzing new materials. In this work, thermal diffusion theories based on temperature dependent osmotic pressure gradient and polymer-solvent interaction parameters were used to estimate thermal diffusion coefficients (D(T)) and retention times (t(r)) for different polymer-solvent pairs. These calculations identified methyl ethyl ketone as a solvent that would cause significant retention of poly(n-butyl acrylate) (PBA) and poly(methyl acrylate) (PMA). Experiments confirmed retention of these two polymers that have not been previously analyzed by ThFFF. Theoretical and experimental D(T)s and t(r)s for PBA, PMA, and polystyrene in different solvents agreed to within 20% and demonstrate the feasibility of this theory-based approach. PMID:21872869

  14. Adaptive windowed range-constrained Otsu method using local information

    NASA Astrophysics Data System (ADS)

    Zheng, Jia; Zhang, Dinghua; Huang, Kuidong; Sun, Yuanxi; Tang, Shaojie

    2016-01-01

    An adaptive windowed range-constrained Otsu method using local information is proposed to improve the performance of image segmentation. First, the reasons why traditional thresholding methods do not perform well in the segmentation of complicated images are analyzed, and the influences of global and local thresholding on image segmentation are compared. Second, we propose two methods that adaptively change the size of the local window according to local information, and their characteristics are analyzed; specifically, the number of edge pixels in the local window of the binarized variance image is used to adaptively change the local window size. Finally, the superiority of the proposed method over other methods, such as the range-constrained Otsu method, the active contour model, the double Otsu method, Bradley's method, and distance-regularized level set evolution, is demonstrated. Experiments validate that the proposed method keeps more details and achieves a much more satisfactory area overlap measure compared with the other conventional methods.
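
    The adaptive window sizing and range constraints of the proposed method are not reproduced here, but the underlying Otsu step, choosing the threshold that maximizes the between-class variance of the pixels in a window, can be sketched as follows on synthetic data.

    ```python
    # Minimal Otsu threshold (maximize between-class variance) that could be applied to
    # the pixels of any local window; the adaptive window-sizing and range constraints
    # described in the abstract are not reproduced here.
    import numpy as np

    def otsu_threshold(pixels, levels=256):
        hist, _ = np.histogram(pixels, bins=levels, range=(0, levels))
        p = hist / hist.sum()
        omega = np.cumsum(p)                       # class-0 probability up to each level
        mu = np.cumsum(p * np.arange(levels))      # cumulative mean
        mu_t = mu[-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
        sigma_b[~np.isfinite(sigma_b)] = 0.0
        return int(np.argmax(sigma_b))

    rng = np.random.default_rng(7)
    window = np.concatenate([rng.normal(60, 10, 500), rng.normal(180, 15, 500)])
    window = np.clip(window, 0, 255)
    print("Otsu threshold for this window:", otsu_threshold(window))   # between the two modes
    ```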

  15. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  16. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  17. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    This paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them, and we apply it to quantify the similarity of different stock markets. We report the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the two stock markets differs across time periods and that their similarity became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, and the resulting phylogenetic trees show that the method can distinguish the markets of different areas. These results show that satisfactory information can be extracted from financial markets with this method, which can be applied not only to physiologic time series but also to financial time series.
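
    The paper's specific information-categorization distance is not given in this record, so the sketch below only illustrates the general workflow it describes, using a plain correlation distance on synthetic return series and a nearest-neighbour grouping in place of the full phylogenetic tree; the index names and data are invented.

    ```python
    # Sketch of the general workflow described (synthetic data, with a simple
    # correlation-based distance standing in for the paper's information-categorization
    # distance): compute pairwise distances between return series and group the closest.
    import numpy as np

    rng = np.random.default_rng(11)
    common_us = rng.normal(size=500)
    common_cn = rng.normal(size=500)
    series = {
        "SP500": common_us + 0.3 * rng.normal(size=500),
        "DJIA": common_us + 0.3 * rng.normal(size=500),
        "SSE": common_cn + 0.3 * rng.normal(size=500),
        "SZSE": common_cn + 0.3 * rng.normal(size=500),
    }

    names = list(series)
    returns = np.array([series[k] for k in names])
    corr = np.corrcoef(returns)
    dist = np.sqrt(2.0 * (1.0 - corr))            # correlation distance

    for i, a in enumerate(names):
        j = min((j for j in range(len(names)) if j != i), key=lambda j: dist[i, j])
        print(f"{a:6s} is closest to {names[j]:6s} (distance {dist[i, j]:.2f})")
    ```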

  18. Formative research to develop theory-based messages for a Western Australian child drowning prevention television campaign: study protocol

    PubMed Central

    Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine

    2016-01-01

    Introduction Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621

  19. Determination of nuclear level densities from experimental information

    SciTech Connect

    Cole, B.J.; Davidson, N.J.; Miller, H.G.

    1994-10-01

    A novel information theory based method for determining the density of states from prior information is presented. The energy dependence of the density of states is determined from the observed number of states per energy interval, and model calculations suggest that the method is sufficiently reliable to calculate the thermal properties of nuclei over a reasonable temperature range.

  20. Game theory based band selection for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; He, Zhenyu; Huang, Fengchen

    2015-12-01

    This paper proposes a new evaluation criterion for band selection in hyperspectral imagery. The criterion combines information content with class separability, while the correlation between bands is used as a constraint. Game theory is then introduced into the band selection process to coordinate the potential conflict between these two criteria when searching for the optimal band combination. Experimental results on AVIRIS hyperspectral data show that the proposed method is effective.
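
    A minimal sketch of how the two criteria and the correlation constraint could interact is given below, with a greedy search standing in for the paper's game-theoretic coordination; the entropy score, Fisher-ratio separability and correlation threshold are illustrative assumptions rather than the authors' formulation.

        # Illustrative sketch: greedy search replaces the game-theoretic coordination.
        import numpy as np

        def band_entropy(band, bins=64):
            counts, _ = np.histogram(band, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        def fisher_separability(band, labels):
            classes = np.unique(labels)
            means = [band[labels == c].mean() for c in classes]
            variances = [band[labels == c].var() + 1e-12 for c in classes]
            return np.var(means) / np.mean(variances)

        def select_bands(cube, labels, k=10, corr_limit=0.9):
            """cube: (pixels, bands) reflectance matrix; labels: per-pixel class ids."""
            scores = [band_entropy(cube[:, b]) + fisher_separability(cube[:, b], labels)
                      for b in range(cube.shape[1])]
            selected = []
            for b in np.argsort(scores)[::-1]:               # best-scoring bands first
                if all(abs(np.corrcoef(cube[:, b], cube[:, s])[0, 1]) < corr_limit
                       for s in selected):
                    selected.append(int(b))
                if len(selected) == k:
                    break
            return selected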

  1. Foreign Language Methods and an Information Processing Model of Memory.

    ERIC Educational Resources Information Center

    Willebrand, Julia

    The major approaches to language teaching (audiolingual method, generative grammar, Community Language Learning and Silent Way) are investigated to discover whether or not they are compatible in structure with an information-processing model of memory (IPM). The model of memory used was described by Roberta Klatzky in "Human Memory: Structures and…

  2. Entropy theory based multi-criteria resampling of rain gauge networks for hydrological modelling - A case study of humid area in southern China

    NASA Astrophysics Data System (ADS)

    Xu, Hongliang; Xu, Chong-Yu; Sælthun, Nils Roar; Xu, Youpeng; Zhou, Bin; Chen, Hua

    2015-06-01

    Rain gauge networks provide estimates of areal average rainfall, its spatial variability and point rainfall at the catchment scale, and they supply the most important input for hydrological models. It is therefore desirable to design optimal rain gauge networks that use a minimal number of gauges while still providing reliable estimates of both the areal mean rainfall and its spatial-temporal variability. Based on a dense network of 185 rain gauges in the Xiangjiang River Basin, southern China, this study used an entropy theory based multi-criteria method that simultaneously considers the information derived from the rainfall series, minimizes the bias of the areal mean rainfall, and minimizes the information overlap between gauges to resample the network at different gauge densities. The optimal networks were examined using two hydrological models: the lumped Xinanjiang model and the distributed SWAT model. The results indicate that the performance of the lumped model is stable across the different optimal networks, while the performance of the distributed model keeps improving as the number of rain gauges increases. The results reveal that the entropy theory based multi-criteria strategy provides an optimal design of rain gauge networks, which is of vital importance in regional hydrological studies and water resources management.
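
    The following minimal sketch shows one way such an entropy-based multi-criteria resampling could be set up: at each step the gauge that adds the most entropy, overlaps least with already-selected gauges and keeps the areal-mean bias small is added; the weights, binning and greedy search are illustrative assumptions, not the study's exact scheme.

        # Illustrative sketch: weights, binning and greedy search are assumptions.
        import numpy as np

        def discretize(x, bins=10):
            return np.digitize(x, np.histogram_bin_edges(x, bins=bins))

        def entropy(x):
            _, counts = np.unique(x, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def mutual_info(x, y):
            _, counts = np.unique(np.column_stack([x, y]), axis=0, return_counts=True)
            p = counts / counts.sum()
            joint = -np.sum(p * np.log2(p))
            return entropy(x) + entropy(y) - joint

        def select_gauges(rain, k=20, w_red=1.0, w_bias=10.0):
            """rain: (time, gauges) array; returns indices of the selected gauges."""
            full_mean = rain.mean(axis=1)
            coded = [discretize(rain[:, g]) for g in range(rain.shape[1])]
            selected = []
            while len(selected) < k:
                best, best_score = None, -np.inf
                for g in range(rain.shape[1]):
                    if g in selected:
                        continue
                    bias = np.abs(rain[:, selected + [g]].mean(axis=1) - full_mean).mean()
                    red = (np.mean([mutual_info(coded[g], coded[s]) for s in selected])
                           if selected else 0.0)
                    score = entropy(coded[g]) - w_red * red - w_bias * bias
                    if score > best_score:
                        best, best_score = g, score
                selected.append(best)
            return selected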

  3. Evaluation methods for retrieving information from interferograms of biomedical objects

    NASA Astrophysics Data System (ADS)

    Podbielska, Halina; Rottenkolber, Matthias

    1996-04-01

    Interferograms in the form of fringe patterns can be produced in two-beam interferometers, holographic or speckle interferometers, in setups realizing moire techniques, or in deflectometers. Optical metrology based on the principle of interference can serve as a testing tool in biomedical research. By analyzing the fringe pattern images, information about the shape or mechanical behavior of the object under study can be retrieved. Here, some of the techniques for creating fringe pattern images are presented along with methods of analysis; both intensity-based analysis and phase-measurement methods are discussed. Applications of interferometric methods, especially in experimental orthopedics, endoscopy and ophthalmology, are pointed out.
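
    As an example of the phase-measurement analysis mentioned above, the following minimal sketch implements the standard four-step phase-shifting calculation, which recovers the wrapped phase from four fringe images taken with pi/2 phase steps; the synthetic tilted wavefront is only for demonstration.

        # Illustrative sketch: the synthetic wavefront is made up for demonstration.
        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase (radians) from four pi/2-shifted fringe images."""
            return np.arctan2(i4 - i2, i1 - i3)

        x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
        phi_true = 6 * np.pi * x                                   # tilted object wavefront
        frames = [1 + np.cos(phi_true + k * np.pi / 2) for k in range(4)]
        phi_wrapped = four_step_phase(*frames)                     # equals phi_true wrapped to (-pi, pi]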

  4. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, Management Information Systems (MIS) have been developed without formal methods. With such informal methods, the MIS is developed throughout its lifecycle without any models, which causes problems such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled with automata and set theory. However, it is very difficult to generate the automata of the system to be developed right from the start. Model-driven development, in contrast, can flexibly accommodate changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. An experiment showed that the method reduced development effort by more than 30%.

  5. An Organizational Model to Distinguish between and Integrate Research and Evaluation Activities in a Theory Based Evaluation

    ERIC Educational Resources Information Center

    Sample McMeeking, Laura B.; Basile, Carole; Cobb, R. Brian

    2012-01-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal there are few examples of its…

  6. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
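
    A minimal sketch of the FOSM step is given below: the input (log-conductivity) covariance is propagated through the model sensitivities to obtain the head covariance, and candidate locations are ranked by their contribution to the total head variance; the sensitivity matrix is a random placeholder rather than MODFLOW-2000 output.

        # Illustrative sketch: the sensitivity matrix is a placeholder, not MODFLOW output.
        import numpy as np

        n_params, n_obs = 5, 3
        rng = np.random.default_rng(1)
        J = rng.normal(size=(n_obs, n_params))        # d(head)/d(log K) sensitivities
        cov_in = 0.25 * np.eye(n_params)              # prior covariance of log K

        cov_out = J @ cov_in @ J.T                    # FOSM output covariance of heads
        head_variance = np.diag(cov_out)

        # Per-parameter contribution to total head variance (valid for a diagonal cov_in)
        contrib = np.diag(cov_in) * (J ** 2).sum(axis=0)
        next_sample = int(np.argmax(contrib))         # most informative location to sample next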

  7. Hybrid methods to represent incomplete and uncertain information

    SciTech Connect

    Joslyn, C.

    1996-12-31

    Decision making is cast in the semiotic context of perception, decision, and action loops. Towards the goal of properly grounding hybrid representations of information and uncertainty from this semiotic perspective, we consider the roles of and relations among the mathematical components of General Information Theory (GIT), particularly among fuzzy sets, possibility theory, probability theory, and random sets. We do so by using a clear distinction between the syntactic, mathematical formalism and the semantic domains of application of each of these fields, placing the emphasis on available measurement and action methods appropriate for each formalism, to which and from which the decision-making process flows.

  8. Enhancing subsurface information from the fusion of multiple geophysical methods

    NASA Astrophysics Data System (ADS)

    Jafargandomi, A.; Binley, A.

    2011-12-01

    Characterization of hydrologic systems is a key element in understanding and predicting their behaviour. Geophysical methods, especially electrical methods (e.g., electrical resistivity tomography (ERT), induced polarization (IP) and electromagnetics (EM)), are becoming popular for this purpose due to their non-invasive nature, high sensitivity to hydrological parameters and speed of measurement. However, each geophysical method on its own provides only limited information about some of the subsurface parameters. Therefore, in order to achieve a comprehensive picture of the hydrologic system, fusion of multiple geophysical data sets can be beneficial. Although a number of fusion approaches have been proposed in the literature, an aspect that has generally been overlooked is the assessment of the information content of each measurement approach; such an assessment provides useful insight for the design of future surveys. We develop a fusion strategy based on the capability of multiple geophysical methods to provide enough resolution to identify subsurface material parameters and structure. We apply a Bayesian framework to analyse the information in multiple geophysical data sets: the data sets are fed into a Markov chain Monte Carlo (McMC) inversion algorithm and the information content of the post-inversion result (the posterior probability distribution) is quantified. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical data sets. In this strategy, information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. We apply the fusion tool to one of the target sites of the EU FP7 project ModelProbe, which aims to develop technologies and tools for soil contamination assessment and site characterization. The target site is located close to Trecate (Novara - NW Italy). At this
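
    The following minimal sketch illustrates how Shannon's measure could quantify the information gained from different data-set combinations: the entropy of posterior samples is compared with that of the prior, and a larger drop means more information; the samples are simulated stand-ins for McMC inversion output, with made-up parameter names and spreads.

        # Illustrative sketch: samples, parameter names and spreads are assumptions.
        import numpy as np

        def sample_entropy(samples, bins=30):
            counts, _ = np.histogram(samples, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        rng = np.random.default_rng(4)
        prior = rng.uniform(1.0, 3.0, 20000)            # prior on, e.g., log-resistivity
        post_ert = rng.normal(2.0, 0.30, 20000)         # posterior using ERT alone (simulated)
        post_ert_em = rng.normal(2.0, 0.15, 20000)      # posterior using ERT + EM (simulated)

        for name, s in [("prior", prior), ("ERT", post_ert), ("ERT+EM", post_ert_em)]:
            print(name, sample_entropy(s))              # lower entropy = more information gained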

  9. Identifying informative subsets of the Gene Ontology with information bottleneck methods

    PubMed Central

    Jin, Bo; Lu, Xinghua

    2010-01-01

    Motivation: The Gene Ontology (GO) is a controlled vocabulary designed to represent the biological concepts pertaining to gene products. This study investigates the methods for identifying informative subsets of GO terms in an automatic and objective fashion. This task in turn requires addressing the following issues: how to represent the semantic context of GO terms, what metrics are suitable for measuring the semantic differences between terms, how to identify an informative subset that retains as much as possible of the original semantic information of GO. Results: We represented the semantic context of a GO term using the word-usage-profile associated with the term, which enables one to measure the semantic differences between terms based on the differences in their semantic contexts. We further employed the information bottleneck methods to automatically identify subsets of GO terms that retain as much as possible of the semantic information in an annotation database. The automatically retrieved informative subsets align well with an expert-picked GO slim subset, cover important concepts and proteins, and enhance literature-based GO annotation. Availability: http://carcweb.musc.edu/TextminingProjects/ Contact: xinghua@pitt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20702400

  10. Methods of obtaining meaningful information from disperse media holograms

    NASA Astrophysics Data System (ADS)

    Dyomin, Victor V.

    1997-05-01

    Nondestructive testing of the microstructure parameters of disperse media, both aerosols and water suspensions, is relevant to biology, medicine, and environmental control. Among the optical methods for investigating and diagnosing light-scattering media, the holographic method plays a special role. A hologram of the scattering volume allows the optical wave field to be reproduced and information on the parameters of the microparticles (size, shape, and spatial position) to be obtained; usually this is done by analyzing the particle images reconstructed from the hologram. On the basis of calculated and experimental results, the characteristics of holographic methods are analyzed in this paper. These estimates demonstrate that the methods can be used to investigate media in biomedical science and clinical practice. Many micro-organisms and other living particles are transparent or semitransparent; in this case the reconstructed image of a particle shows, in addition to its cross section, a spot formed by light focusing within the particle. This observation allowed us to propose a method for determining the refractive index of transparent and semitransparent microparticles, which in turn can help identify the particle type. The development of this method is presented. The size distribution of the particles can be measured simultaneously with the reconstruction of the scattered optical field from the hologram; in this case a small-angle optical meter (for example, a focusing lens) can be placed just behind the illuminated hologram. The reconstructed field is composed of the initial field and its conjugate, and each of these components, as well as the interference between them, can carry additional information about the medium. The possibility of extracting this information is also discussed.

  11. Hybrid methods for multisource information fusion and decision support

    NASA Astrophysics Data System (ADS)

    Braun, Jerome J.; Glina, Yan

    2006-04-01

    This paper presents the progress of an ongoing research effort in multisource information fusion for biodefense decision support. The effort concentrates on FLASH (Fusion, Learning, Adaptive Super-Hybrid), a novel machine-intelligence hybrid-of-hybrids decision support architecture that we proposed. The highlights of FLASH discussed in the paper include its cognitive-processing orientation and its hybrid nature, which combines heterogeneous multiclassifier machine learning and approximate reasoning paradigms. Selected specifics of the FLASH internals, such as its feature selection techniques, supervised learning, clustering, recognition and reasoning methods, and their integration, are discussed. Results to date are presented, including background-type determination and bioattack detection computational experiments using data obtained with a multisensor fusion testbed we have also developed. The processing of imprecise information originating from sources other than sensors is considered. Finally, the paper discusses the applicability of FLASH and its methods to complex battlespace management problems such as course-of-action decision support.

  12. Application of information theory methods to food web reconstruction

    USGS Publications Warehouse

    Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.

    2007-01-01

    In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, or whether this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other and that the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be useful beyond model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points, which approaches the length of ecological time series from real systems.
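
    A minimal sketch of the core quantity is given below: the mutual information between two discretized abundance time series; the synthetic consumer-resource pair, series length and bin count are illustrative, and the paper's model and significance testing are not reproduced.

        # Illustrative sketch: synthetic series and binning; no significance test.
        import numpy as np

        def mutual_information(x, y, bins=8):
            joint, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = joint / joint.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            nz = pxy > 0
            return np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz]))

        rng = np.random.default_rng(2)
        resource = rng.lognormal(size=400)                       # ~400 points, as in the abstract
        consumer = 0.6 * resource + rng.lognormal(size=400)
        print(mutual_information(consumer, resource))            # larger MI suggests a trophic link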

  13. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.

  14. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms exhibit a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated by changes in the environment will guide different levels of immune response, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through their own danger signals and then triggers immune responses of self-regulation, so that population diversity can be maintained. Experimental results show that the algorithm has advantages in solution quality and population diversity. Compared with the influential optimization algorithms CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions that meet the required accuracies within the specified number of function evaluations.

  15. Acoustic emission source location and damage detection in a metallic structure using a graph-theory-based geodesic approach

    NASA Astrophysics Data System (ADS)

    Gangadharan, R.; Prasanna, G.; Bhat, M. R.; Murthy, C. R. L.; Gopalakrishnan, S.

    2009-11-01

    A geodesic-based approach using Lamb waves is proposed to locate the acoustic emission (AE) source and damage in an isotropic metallic structure. In the case of the AE (passive) technique, the elastic waves take the shortest path from the source to the sensor array distributed in the structure. The geodesics are computed on the meshed surface of the structure using graph theory based on Dijkstra's algorithm. By virtually propagating the waves in reverse from the sensors along the geodesic paths and locating the first intersection point of these waves, one can obtain the AE source location. The same approach is extended to the detection of damage in a structure. The wave response matrix of the given sensor configuration is obtained experimentally for the healthy and the damaged structure. The healthy and damaged response matrices are compared, and their difference gives information about the reflection of waves from the damage. These waves are back-propagated from the sensors and the above method is used to locate the damage by finding the point where the geodesics intersect. In this work, the geodesic approach is shown to be suitable for obtaining a practicable source location solution in a more general set-up on any arbitrary surface containing finite discontinuities. Experiments were conducted on aluminum specimens of simple and complex geometry to validate this new method.
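
    A minimal sketch of the geodesic ingredient is given below: the meshed surface is treated as a weighted graph, Dijkstra's algorithm gives geodesic distances from each sensor, and the source is taken as the node whose distance differences best match the measured arrival-time differences; this least-squares matching is a stand-in for the paper's intersection of back-propagated wavefronts, and the mesh, sensors and times are illustrative.

        # Illustrative sketch: least-squares matching replaces wavefront intersection;
        # mesh, sensors, arrival times and wave speed are made up.
        import itertools
        import numpy as np
        import networkx as nx

        def mesh_graph(nodes, edges):
            g = nx.Graph()
            for i, j in edges:
                g.add_edge(i, j, weight=float(np.linalg.norm(nodes[i] - nodes[j])))
            return g

        def locate_source(g, sensors, arrival_times, wave_speed):
            dist = {s: nx.single_source_dijkstra_path_length(g, s, weight="weight")
                    for s in sensors}
            best, best_err = None, np.inf
            for node in g.nodes:
                err = sum(((dist[a][node] - dist[b][node]) / wave_speed
                           - (arrival_times[a] - arrival_times[b])) ** 2
                          for a, b in itertools.combinations(sensors, 2))
                if err < best_err:
                    best, best_err = node, err
            return best

        nodes = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [2.0, 1.0, 0.5]])
        edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
        g = mesh_graph(nodes, edges)
        print(locate_source(g, sensors=[0, 3], arrival_times={0: 0.8e-3, 3: 0.4e-3}, wave_speed=5000.0))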

  16. Extending the Li&Ma method to include PSF information

    NASA Astrophysics Data System (ADS)

    Nievas-Rosillo, M.; Contreras, J. L.

    2016-02-01

    The so-called Li & Ma formula is still the most frequently used method for estimating the significance of observations carried out by Imaging Atmospheric Cherenkov Telescopes. In this work a straightforward extension of the method for point sources that profits from the good imaging capabilities of current instruments is proposed. It is based on a likelihood ratio under the assumption of a well-known PSF and a smooth background. Its performance is tested with Monte Carlo simulations based on real observations, and its sensitivity is compared to standard methods which do not incorporate PSF information. The gain in significance that can be attributed to the inclusion of the PSF is around 10% and can be boosted if a background model is assumed or a finer binning is used.
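
    For reference, the following minimal sketch computes the baseline being extended, the Li & Ma (1983, Eq. 17) significance from on-source and off-source counts and the exposure ratio alpha; the PSF-weighted likelihood ratio proposed in the paper is not reproduced, and the counts are illustrative.

        # Illustrative sketch: only the classical Li & Ma significance; counts are made up.
        import numpy as np

        def li_ma_significance(n_on, n_off, alpha):
            total = n_on + n_off
            term_on = n_on * np.log((1 + alpha) / alpha * (n_on / total))
            term_off = n_off * np.log((1 + alpha) * (n_off / total))
            return np.sqrt(2.0 * (term_on + term_off))

        print(li_ma_significance(n_on=130, n_off=500, alpha=0.2))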

  17. Method to find community structures based on information centrality

    NASA Astrophysics Data System (ADS)

    Fortunato, Santo; Latora, Vito; Marchiori, Massimo

    2004-11-01

    Community structures are an important feature of many social, biological, and technological networks. Here we study a variation on the method for detecting such communities proposed by Girvan and Newman and based on the idea of using centrality measures to define the community boundaries [M. Girvan and M. E. J. Newman, Proc. Natl. Acad. Sci. U.S.A. 99, 7821 (2002)]. We develop a hierarchical clustering algorithm that consists of iteratively finding and removing the edge with the highest information centrality. We test the algorithm on computer-generated and real-world networks whose community structure is already known or has been studied by means of other methods. We show that our algorithm, although it runs to completion in a time O(n^4), is very effective, especially when the communities are very mixed and hardly detectable by the other methods.
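
    A minimal sketch of the removal loop is given below, assuming the information centrality of an edge is measured as the relative drop in global network efficiency when that edge is removed; community extraction from the resulting dendrogram is omitted, and the karate-club graph is only a convenient test network.

        # Illustrative sketch: efficiency-drop centrality and a small test graph;
        # the dendrogram/community extraction step is omitted.
        import networkx as nx

        def edge_information_centrality(g):
            base = nx.global_efficiency(g)
            scores = {}
            for u, v in list(g.edges):
                g.remove_edge(u, v)
                scores[(u, v)] = (base - nx.global_efficiency(g)) / base
                g.add_edge(u, v)
            return scores

        def remove_edges_by_information_centrality(g, steps):
            g = g.copy()
            removed = []
            for _ in range(steps):
                scores = edge_information_centrality(g)
                edge = max(scores, key=scores.get)
                g.remove_edge(*edge)
                removed.append(edge)
            return removed

        print(remove_edges_by_information_centrality(nx.karate_club_graph(), steps=5))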

  18. Improved prediction of tacrolimus concentrations early after kidney transplantation using theory-based pharmacokinetic modelling

    PubMed Central

    Størset, Elisabet; Holford, Nick; Hennig, Stefanie; Bergmann, Troels K; Bergan, Stein; Bremer, Sara; Åsberg, Anders; Midtvedt, Karsten; Staatz, Christine E

    2014-01-01

    Aims The aim was to develop a theory-based population pharmacokinetic model of tacrolimus in adult kidney transplant recipients and to externally evaluate this model and two previous empirical models. Methods Data were obtained from 242 patients with 3100 tacrolimus whole blood concentrations. External evaluation was performed by examining model predictive performance using Bayesian forecasting. Results Pharmacokinetic disposition parameters were estimated based on tacrolimus plasma concentrations, predicted from whole blood concentrations, haematocrit and literature values for tacrolimus binding to red blood cells. Disposition parameters were allometrically scaled to fat free mass. Tacrolimus whole blood clearance/bioavailability standardized to haematocrit of 45% and fat free mass of 60 kg was estimated to be 16.1 l h−1 [95% CI 12.6, 18.0 l h−1]. Tacrolimus clearance was 30% higher (95% CI 13, 46%) and bioavailability 18% lower (95% CI 2, 29%) in CYP3A5 expressers compared with non-expressers. An Emax model described decreasing tacrolimus bioavailability with increasing prednisolone dose. The theory-based model was superior to the empirical models during external evaluation displaying a median prediction error of −1.2% (95% CI −3.0, 0.1%). Based on simulation, Bayesian forecasting led to 65% (95% CI 62, 68%) of patients achieving a tacrolimus average steady-state concentration within a suggested acceptable range. Conclusion A theory-based population pharmacokinetic model was superior to two empirical models for prediction of tacrolimus concentrations and seemed suitable for Bayesian prediction of tacrolimus doses early after kidney transplantation. PMID:25279405

  19. Emotion identification method using RGB information of human face

    NASA Astrophysics Data System (ADS)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single-person households has increased drastically due to the growth of the aging society and the diversity of lifestyles, so building spaces need to evolve. The Biofied Building we propose can help address this situation: it supports interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior; one important piece of information is thermal comfort, which we assume can be estimated from the human face. There is much research on face color analysis, but little of it has been conducted in real situations; in other words, the existing methods were not tested under disturbances such as room lamps. In this study, Kinect was used with face tracking, and room lamps and task lamps were used to verify that our method is applicable to real situations. Two rooms at 22 and 28 degrees C were prepared, and we showed that the transition of thermal comfort caused by changing temperature can be observed from the human face. Thus, distinguishing the 22 and 28 degrees C conditions from face color was shown to be possible.

  20. Dissemination of a theory-based online bone health program: Two intervention approaches.

    PubMed

    Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa

    2015-06-01

    With the increasing nationwide emphasis on eHealth, there has been a rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding the methodologies to develop and implement effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during the implementation processes. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer term (12-month) Bone Power Plus program, including the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). This study used a small-group approach (32 intervention groups), and to effectively manage those groups, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study showed a promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668

  1. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  2. Dissolved oxygen prediction using a possibility-theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, U. T.; Valeo, C.

    2015-11-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility-theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predict low DO events in the Bow River. Model output and a defuzzification technique is used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.

  3. Dissolved oxygen prediction using a possibility theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, Usman T.; Valeo, Caterina

    2016-06-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model performance is compared with a fuzzy neural network with crisp inputs, as well as with a traditional neural network. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.

  4. A method to stabilize linear systems using eigenvalue gradient information

    NASA Technical Reports Server (NTRS)

    Wieseman, C. D.

    1985-01-01

    Formal optimization methods and eigenvalue gradient information are used to develop a stabilizing control law for a closed-loop linear system that is initially unstable. The method was originally formulated using direct, constrained optimization methods with the constraints being the real parts of the eigenvalues. However, because of problems in achieving stabilizing control laws, the problem was reformulated to be solved differently. The method described uses the Davidon-Fletcher-Powell minimization technique to solve an indirect, constrained minimization problem in which the performance index is the Kreisselmeier-Steinhauser function of the real parts of all the eigenvalues. The method is applied successfully to two different problems: the determination of a fourth-order control law that stabilizes a single-input single-output active flutter suppression system, and the determination of a second-order control law for a multi-input multi-output lateral-directional flight control system. Various sets of design variables and initial starting points were chosen to show the robustness of the method.
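
    A minimal sketch of the reformulated objective is given below: the real parts of the closed-loop eigenvalues are aggregated with a Kreisselmeier-Steinhauser function and minimized over the feedback gains; BFGS stands in for the Davidon-Fletcher-Powell method, the plant is an illustrative unstable second-order system, and the small gain penalty is only there to keep the sketch bounded.

        # Illustrative sketch: BFGS replaces DFP; plant and penalty are assumptions.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import logsumexp

        A = np.array([[0.0, 1.0], [2.0, -0.5]])            # unstable open-loop plant (illustrative)
        B = np.array([[0.0], [1.0]])

        def ks_objective(k, rho=50.0):
            K = k.reshape(1, 2)
            re = np.real(np.linalg.eigvals(A - B @ K))     # real parts of closed-loop eigenvalues
            ks = logsumexp(rho * re) / rho                 # KS function: smooth max of the real parts
            return ks + 1e-3 * np.sum(k ** 2)              # small penalty keeps the gains bounded

        res = minimize(ks_objective, x0=np.zeros(2), method="BFGS")
        print(res.x, np.real(np.linalg.eigvals(A - B @ res.x.reshape(1, 2))))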

  5. Information bias in health research: definition, pitfalls, and adjustment methods

    PubMed Central

    Althubaiti, Alaa

    2016-01-01

    As with other fields, medical sciences are subject to different sources of bias. While understanding sources of bias is a key element for drawing valid conclusions, bias in health research continues to be a very sensitive issue that can affect the focus and outcome of investigations. Information bias, otherwise known as misclassification, is one of the most common sources of bias that affects the validity of health research. It originates from the approach that is utilized to obtain or confirm study measurements. This paper seeks to raise awareness of information bias in observational and experimental research study designs as well as to enrich discussions concerning bias problems. Specifying the types of bias can be essential to limit its effects, and the use of adjustment methods might serve to improve clinical evaluation and health care practice. PMID:27217764

  6. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  7. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  8. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  9. a Task-Oriented Disaster Information Correlation Method

    NASA Astrophysics Data System (ADS)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historical data, case data, simulated data, and disaster products. However, efficient data management and service have become increasingly difficult for current systems due to the variety of tasks and the heterogeneity of the data. For emergency task-oriented applications, data searches primarily rely on human experience with simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service, with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.

  10. A rooftop extraction method using color feature, height map information and road information

    NASA Astrophysics Data System (ADS)

    Xiang, Yongzhou; Sun, Ying; Li, Chao

    2012-11-01

    This paper presents a new method for rooftop extraction that integrates color features, height map, and road information in a level set based segmentation framework. The proposed method consists of two steps: rooftop detection and rooftop segmentation. The first step requires the user to provide a few example rooftops from which the color distribution of rooftop pixels is estimated. For better robustness, we obtain superpixels of the input satellite image, and then classify each superpixel as rooftop or non-rooftop based on its color features. Using the height map, we can remove those detected rooftop candidates with small height values. Level set based segmentation of each detected rooftop is then performed based on color and height information, by incorporating a shape-prior term that allows the evolving contour to take on the desired rectangle shape. This requires performing rectangle fitting to the evolving contour, which can be guided by the road information to improve the fitting accuracy. The performance of the proposed method has been evaluated on a satellite image of 1 km×1 km in area, with a resolution of one meter per pixel. The method achieves detection rate of 88.0% and false alarm rate of 9.5%. The average Dice's coefficient over 433 detected rooftops is 73.4%. These results demonstrate that by integrating the height map in rooftop detection and by incorporating road information and rectangle fitting in a level set based segmentation framework, the proposed method provides an effective and useful tool for rooftop extraction from satellite images.

  11. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2012-01-01

    On the basis of the User Datagram Protocol (UDP), we make improvements and design a welding robot network communication protocol (WRNCP), which works at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, a broadcast push-type transmission method (Broadcast Push Model, BPM) is designed to improve the efficiency and stability of video transmission, and a network information transmission system is designed for real-time control of the welding robot network.

  12. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2011-12-01

    On the basis of the User Datagram Protocol (UDP), we make improvements and design a welding robot network communication protocol (WRNCP), which works at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, a broadcast push-type transmission method (Broadcast Push Model, BPM) is designed to improve the efficiency and stability of video transmission, and a network information transmission system is designed for real-time control of the welding robot network.

  13. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  14. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  15. Urban drainage control applying rational method and geographic information technologies

    NASA Astrophysics Data System (ADS)

    Aldalur, Beatriz; Campo, Alicia; Fernández, Sandra

    2013-09-01

    The objective of this study is to develop a method for controlling urban drainage in the town of Ingeniero White, motivated by the problems arising from floods, waterlogging and the combination of southeasterly winds and high tides. The rational method was applied to the urban watersheds, together with Geographic Information Technology (GIT) tools. A Geographic Information System was developed on the basis of 28 panchromatic aerial photographs from 2005, georeferenced with control points measured with Global Positioning Systems (basin area: 6 km2). Flow rates of basins and sub-basins were calculated, and it was verified that the existing open channels have a low slope, carry permanent water, and generate stagnation favoured by the presence of trash. It is proposed that an existing channel be used to evacuate the flow at the outlet of the storm drains. The proposed solution is complemented by three pumping stations: one on a channel draining rainwater, which will allow excess water to be drained from the low-lying area where Ingeniero White is located, and two others that will drain the excess water from the port area.
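
    A minimal sketch of the rational-method calculation underlying the flow-rate estimates is given below, in metric units; the runoff coefficient, design intensity and sub-basin area are illustrative values, not those of the Ingeniero White study.

        # Illustrative sketch: coefficient, intensity and area are made-up values.
        def rational_peak_flow(c, intensity_mm_h, area_km2):
            """Peak discharge Q = C * i * A in m^3/s, with i in mm/h and A in km^2."""
            return c * intensity_mm_h * area_km2 / 3.6

        print(rational_peak_flow(c=0.85, intensity_mm_h=60.0, area_km2=0.8))   # about 11.3 m^3/s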

  16. [Spectral discrimination method information divergence combined with gradient angle].

    PubMed

    Zhang, Xiu-bao; Yuan, Yan; Jing, Juan-juan; Sun, Cheng-ming; Wang, Qian

    2011-03-01

    The present paper proposes a spectral discrimination method combining spectral information divergence with the spectral gradient angle, SID x tan(SGA(pi/2)), which overcomes the shortcoming of existing methods that cannot take the whole spectral shape and local characteristics into account simultaneously. Using simulated spectra as input data and following the interferogram acquisition principle and spectrum recovery algorithm of the temporally and spatially modulated Fourier transform imaging spectrometer (TSMFTIS), we simulated the recovery of distorted spectra by the TSMFTIS at different maximum mixing ratios and quantified the difference between the recovered spectra and the true spectrum with different spectral discrimination methods. The experimental results show that SID x tan(SGA(pi/2)) can not only identify the similarity of the whole spectral shapes but also distinguish local differences in the spectral characteristics. A comparative study among the different discrimination methods validates that SID x tan(SGA(pi/2)) significantly improves the discriminatory ability. PMID:21595255
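
    A minimal sketch of the two ingredients being combined is given below: spectral information divergence between unit-sum-normalized spectra and the spectral gradient angle between their first differences, multiplied as SID x tan(SGA); the exact angle scaling behind the abstract's tan(SGA(pi/2)) notation and the normalization details of the original method may differ, and the spectra are synthetic.

        # Illustrative sketch: plain tan(SGA) scaling and synthetic spectra are assumptions.
        import numpy as np

        def sid(x, y, eps=1e-12):
            p = x / x.sum() + eps
            q = y / y.sum() + eps
            return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

        def sga(x, y, eps=1e-12):
            dx, dy = np.diff(x), np.diff(y)
            cosang = np.dot(dx, dy) / (np.linalg.norm(dx) * np.linalg.norm(dy) + eps)
            return np.arccos(np.clip(cosang, -1.0, 1.0))

        def sid_tan_sga(x, y):
            return sid(x, y) * np.tan(sga(x, y))

        reference = np.linspace(0.2, 0.8, 100)
        recovered = reference + 0.01 * np.sin(np.linspace(0, 20, 100))
        print(sid_tan_sga(reference, recovered))     # small value = spectra are similar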

  17. A diffusive information preservation method for small Knudsen number flows

    NASA Astrophysics Data System (ADS)

    Fei, Fei; Fan, Jing

    2013-06-01

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3 to 10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they allow a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  18. A diffusive information preservation method for small Knudsen number flows

    SciTech Connect

    Fei, Fei; Fan, Jing

    2013-06-15

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3 to 10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they allow a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  19. Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

    PubMed Central

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.

    2014-01-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159

  20. Towards a theory-based positive youth development programme.

    PubMed

    Brink, Andrea Jw; Wissing, Marié P

    2013-01-01

    The aim of this study was to develop and describe an intervention programme for young adolescents, guided by the Positive Youth Development Intervention (PYDI) model, which provides a perspective on the facilitation of development in a more positive trajectory. The key concepts and processes suggested by the PYDI model were further analysed and broadened using existing literature for operationalisation and application purposes. Self-regulation is the central process effectuating developmental change, within the contexts of: a) the navigation of stressors; and b) the formulation and effective pursuit of relevant personal goals. Self-regulation, together with a developmental perspective, provided guidelines regarding the relevant skills and knowledge. These are facilitating: a) identity development; b) formulation of goals congruent with the latter; c) decision-making skills; d) coping skills; e) regulation of affect and cognition; and f) socialisation skills. The relevant content areas and the manner of the facilitation of these are indicated. The theory-based programme can be implemented and its effect empirically evaluated. Levels of hope, problem-solving efficacy and social efficacy may serve as, inter alia, indicators of developmental change. PMID:25860303

  1. System and Method for RFID-Enabled Information Collection

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Kennedy, Timothy F. (Inventor); Ngo, Phong H. (Inventor); Byerly, Diane (Inventor)

    2016-01-01

    Methods, apparatuses and systems for radio frequency identification (RFID)-enabled information collection are disclosed, including an enclosure, a collector coupled to the enclosure, an interrogator, a processor, and one or more RFID field sensors, each having an individual identification, disposed within the enclosure. In operation, the interrogator transmits an incident signal to the collector, causing the collector to generate an electromagnetic field within the enclosure. The electromagnetic field is affected by one or more influences. RFID sensors respond to the electromagnetic field by transmitting reflected signals containing the individual identifications of the responding RFID sensors to the interrogator. The interrogator receives the reflected signals, measures one or more returned signal strength indications ("RSSI") of the reflected signals and sends the RSSI measurements and identification of the responding RFID sensors to the processor to determine one or more facts about the influences. Other embodiments are also described.

  2. Caveats: numerical requirements in graph theory based quantitation of tissue architecture.

    PubMed

    Sudbø, J; Marcelpoil, R; Reith, A

    2000-01-01

    Graph theory based methods represent one approach to an objective and reproducible structural analysis of tissue architecture. These methods explore neighborhood relations between a number of objects (e.g., cells), and they therefore carry certain requirements as to the number of objects to be included in the analysis. However, the question of how many objects are required to achieve reproducible values in repeated computations of proposed structural features has previously not been addressed specifically. After digitising HE stained slides and storing them as grey level images, cell nuclei were segmented and their geometrical centres of gravity were computed, serving as the basis for construction of the Voronoi diagram (VD) and its subgraphs. Variations in repeated computations of structural features derived from these graphs were related to the number of cell nuclei included in the analysis. We demonstrate a large variation in the values of the structural features from one computation to another in one and the same section when only a limited number of cells (100-500) are included in the analysis; this variation decreased as the number of cells analyzed increased. The exact number of cells required to achieve reproducible values differs significantly between tissues, but not between separate cases of similar lesions. There are no significant differences between normal and malignantly changed tissues in oral mucosa with respect to how many cells must be included. For graph theory based analysis of tissue architecture, care must be taken to include an adequate number of objects; for some of the structural features we have tested, more than 3000 cells. PMID:11310642
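
    A minimal sketch of the first step of such an analysis is given below, using simulated nuclear centroids: the Voronoi diagram is built from the centroids and a simple structural feature (the mean area of the bounded cells) is recomputed for increasing numbers of nuclei to show how its value stabilizes; segmentation of HE-stained images and the paper's specific structural features are not reproduced.

        # Illustrative sketch: simulated centroids and a simple feature, not the paper's features.
        import numpy as np
        from scipy.spatial import ConvexHull, Voronoi

        def mean_voronoi_cell_area(points):
            vor = Voronoi(points)
            areas = []
            for region_idx in vor.point_region:
                region = vor.regions[region_idx]
                if len(region) == 0 or -1 in region:                   # skip unbounded cells
                    continue
                areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D "volume" is the area
            return float(np.mean(areas))

        rng = np.random.default_rng(3)
        for n in (100, 500, 3000):                                     # feature stabilizes as n grows
            pts = rng.uniform(0, 1000, size=(n, 2))
            print(n, mean_voronoi_cell_area(pts))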

  3. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2009-09-01

    In the construction of DEMs, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modelling method based on the theory of surfaces. At present, HASM is used only for scattered-point interpolation, so this paper attempts to construct a DEM from characteristic terrain information, namely stream lines and scattered points, using the HASM method. The procedure consists of the following steps. First, a TIN (Triangulated Irregular Network) is generated from the scattered points. Second, each segment of the stream lines is oriented to represent the flow direction, and a tree data structure (with parent, children and brothers) is used to represent all stream-line segments. A segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlap or repetition, from the uppermost reaches to the lowermost reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start point and end point of each WCF line are estimated by searching the TIN. Third, given a cell size, a 2-D matrix for the study region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line. Fourth, all the valued cells traversed by the stream lines and those from the scattered points are gathered as known scattered sampling points, and HASM is then used to construct the final DEM. A case study on a typical plateau landform of China, the KongTong gully of the Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data were manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines
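
    A rough sketch of the data-preparation steps described above, under simplifying assumptions: elevations are linearly interpolated along a hypothetical WCF polyline, the densified line is merged with scattered points, and a surface is fitted. scipy's griddata stands in for HASM here; it is not the authors' solver, and all coordinates are made up.

```python
# Sketch of steps 3-4: densify a WCF line, merge with scattered points, fit a surface.
# griddata is only a stand-in for HASM.
import numpy as np
from scipy.interpolate import griddata

def densify_wcf(line_xyz, step=10.0):
    """Linearly interpolate elevation along a WCF polyline given as (x, y, z) vertices."""
    line_xyz = np.asarray(line_xyz, dtype=float)
    out = []
    for p, q in zip(line_xyz[:-1], line_xyz[1:]):
        seg_len = np.hypot(*(q[:2] - p[:2]))
        n = max(int(seg_len // step), 1)
        t = np.linspace(0.0, 1.0, n, endpoint=False)[:, None]
        out.append(p + t * (q - p))
    out.append(line_xyz[-1:])
    return np.vstack(out)

# hypothetical inputs: scattered elevation points and one WCF line
scattered = np.array([[0, 0, 120.0], [900, 50, 95.0], [450, 800, 140.0], [50, 900, 150.0]])
wcf = densify_wcf([[100, 100, 130.0], [500, 400, 110.0], [850, 850, 90.0]])

known = np.vstack([scattered, wcf])
gx, gy = np.meshgrid(np.arange(0, 1000, 25.0), np.arange(0, 1000, 25.0))
dem = griddata(known[:, :2], known[:, 2], (gx, gy), method='linear')
print(dem.shape)
```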

  4. Constructing DEM from characteristic terrain information using HASM method

    NASA Astrophysics Data System (ADS)

    Song, Dunjiang; Yue, Tianxiang; Du, Zhengping; Wang, Qingguo

    2010-11-01

    In the construction of DEMs, terrain features (e.g. valleys or stream lines, ridges, peaks, saddle points) are important for improving DEM accuracy and have many applications in hydrology, precision agriculture, military trajectory planning, etc. HASM (High Accuracy Surface Modeling) is a surface modelling method based on the theory of surfaces. At present, HASM is used only for scattered-point interpolation, so this paper attempts to construct a DEM from characteristic terrain information, namely stream lines and scattered points, using the HASM method. The procedure consists of the following steps. First, a TIN (Triangulated Irregular Network) is generated from the scattered points. Second, each segment of the stream lines is oriented to represent the flow direction, and a tree data structure (with parent, children and brothers) is used to represent all stream-line segments. A segment is a curve that does not intersect other segments. A Water Course Flow (WCF) line is a set of segments connected piecewise, without overlap or repetition, from the uppermost reaches to the lowermost reaches. From the stream lines' tree data structure, all possible WCF lines are enumerated, and the start point and end point of each WCF line are estimated by searching the TIN. Third, given a cell size, a 2-D matrix for the study region is built, and the cells traversed by the stream lines are assigned values by linear interpolation along each WCF line. Fourth, all the valued cells traversed by the stream lines and those from the scattered points are gathered as known scattered sampling points, and HASM is then used to construct the final DEM. A case study on a typical plateau landform of China, the KongTong gully of the Dongzhi Plateau, Qingyang, Gansu province, is presented. The original data were manually vectorized from scanned 1:10,000 maps and include scattered points, stream lines

  5. A cloud theory-based particle swarm optimization for multiple decision maker vehicle routing problems with fuzzy random time windows

    NASA Astrophysics Data System (ADS)

    Ma, Yanfang; Xu, Jiuping

    2015-06-01

    This article puts forward a cloud theory-based particle swarm optimization (CTPSO) algorithm for solving a variant of the vehicle routing problem, namely the multiple decision maker vehicle routing problem with fuzzy random time windows (MDVRPFRTW). A new mathematical model is developed for the proposed problem, in which fuzzy random theory is used to describe the time windows and bi-level programming is applied to describe the relationship between the multiple decision makers. To solve the problem, the CTPSO modifies the initialization, inertia weight and particle updates of the basic particle swarm optimization (PSO) to overcome its shortcomings. Parameter tests and results analysis are presented to highlight the performance of the optimization method, and comparison of the algorithm with the basic PSO and the genetic algorithm demonstrates its efficiency.
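
    One common way cloud-model ideas are folded into PSO is to draw the inertia weight from a normal cloud generator (Ex, En, He). The sketch below illustrates that flavour on a toy objective; it is not the paper's CTPSO, which also adapts initialization and particle updates to the bi-level MDVRPFRTW, and all constants here are assumptions.

```python
# Generic sketch: PSO whose inertia weight is a "cloud drop" from a normal cloud generator.
import numpy as np

rng = np.random.default_rng(1)

def cloud_inertia(ex=0.7, en=0.1, he=0.02):
    """Normal cloud generator: sample En' ~ N(En, He), then w ~ N(Ex, |En'|)."""
    en_prime = rng.normal(en, he)
    return rng.normal(ex, abs(en_prime))

def sphere(x):                      # toy objective standing in for the routing cost
    return float(np.sum(x ** 2))

dim, n_particles = 5, 20
pos = rng.uniform(-10, 10, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([sphere(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(200):
    w = cloud_inertia()
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([sphere(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(sphere(gbest))
```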

  6. Indigenous Knowledge and Culturally Responsive Methods in Information Research

    ERIC Educational Resources Information Center

    Becvar, Katherine; Srinivasan, Ramesh

    2009-01-01

    Research and professional practice in librarianship has increasingly turned to community-focused information services (CIS), which allow people to participate in creating and sharing information about themselves and their communities. These information services have a great potential to empower and engage marginalized communities; however, in this…

  7. Agent-based method for distributed clustering of textual information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Reed, Joel W [Knoxville, TN; Elmore, Mark T [Oak Ridge, TN; Treadwell, Jim N [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term, identifies the documents found in the search, and displays the documents in a clustering display (80) so as to indicate their similarity to each other.
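
    A toy sketch of the routing logic described in the abstract (not the patented system): a vector is computed for each new document, each cluster "agent" reports its best similarity to the documents it manages, and the document is routed to the best cluster or to a newly created one when no cluster is similar enough. The similarity measure and threshold are illustrative assumptions.

```python
# Toy single-process illustration of agent-style cluster routing (threshold is hypothetical).
from collections import Counter
import math

def doc_vector(text):
    return Counter(text.lower().split())

def cosine(a, b):
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

class ClusterAgent:
    def __init__(self):
        self.vectors = []
    def evaluate(self, vec):                    # similarity of vec to documents it manages
        return max((cosine(vec, v) for v in self.vectors), default=0.0)
    def add(self, vec):
        self.vectors.append(vec)

def route(document, agents, threshold=0.3):
    vec = doc_vector(document)
    scores = [agent.evaluate(vec) for agent in agents]
    if not agents or max(scores) < threshold:
        agents.append(ClusterAgent())           # create a new cluster agent
        agents[-1].add(vec)
    else:
        agents[scores.index(max(scores))].add(vec)

agents = []
for doc in ["reactor core safety", "core safety margins", "document clustering agents"]:
    route(doc, agents)
print(len(agents))
```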

  8. Models for Theory-Based M.A. and Ph.D. Programs.

    ERIC Educational Resources Information Center

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  9. Informative Parameters of Dynamic Geo-electricity Methods

    NASA Astrophysics Data System (ADS)

    Tursunmetov, R.

    With the growing complexity of geological tasks and the need to reveal anomalous zones connected with ore, oil, gas and water availability, methods of dynamic geo-electricity have come into use. In these methods the geological environment is considered as an interphase, irregular one. The main dynamic element of this environment is the double electric layer, which develops on the boundary between the solid and liquid phases. In ore- or water-saturated environments, double electric layers become electrochemically or electrokinetically active elements of the geo-electric environment, which, in turn, form a natural electric field. This field influences the distribution of artificially created fields, and their interaction has a complicated superposition or non-linear character. The geological environment is therefore considered an active one, able to accumulate and transform artificially superposed fields. Its main dynamic property is the non-linear behaviour of specific electric resistance and soil polarization depending on current density and measurement frequency, which serve as informative parameters for dynamic geo-electricity methods. The study of disperse soil electric properties in the impulse-frequency regime, together with the temporal and frequency characteristics of the electric field, is of primary interest for defining geo-electric anomalies. The volt-ampere characteristics of the electromagnetic field are of considerable practical significance; they are determined by electrochemically active ore- and water-saturated zones. These parameters depend on the polarity of the initiating field, in particular on the character, composition and mineralization of the ore-saturated zone and on the presence of a natural electric field under cathode and anode mineralization. The non-linear behaviour of the environment's dynamic properties affects the structure of the initiated field, which allows anomalous zone locations to be defined. Finally, the study of the dynamic properties of soil anisotropy in space will allow the identification of filtration flows

  10. Item Characteristic Curve Estimation of Signal Detection Theory-Based Personality Data: A Two-Stage Approach to Item Response Modeling.

    ERIC Educational Resources Information Center

    Williams, Kevin M.; Zumbo, Bruno D.

    2003-01-01

    Developed an item characteristic curve estimation of signal detection theory based personality data. Results for 266 college students taking the Overclaiming Questionnaire (D. Paulhus and N. Bruce, 1990) suggest that this method is a reasonable approach to describing item functioning and that there are advantages to this method over traditional…

  11. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  12. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1989-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  13. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1986-12-02

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  14. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  15. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1989-01-24

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  16. "Method Is Education:" Making Informal Education Social and Substantive

    ERIC Educational Resources Information Center

    Stern, Miriam Heller

    2007-01-01

    This article presents the author's response to Joseph Reimer's essay titled, "Beyond More Jews Doing Jewish: Clarifying the Goals of Informal Jewish Education." Joseph Reimer states that the challenge for informal education is to move beyond socialization to clarify and achieve "deeper" educational goals. Distinguishing between socialization and…

  17. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads the electronic information, analyzes the electronic information according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry of the hash table contains a concept corresponding to a word or phrase which the system has previously encountered. The system creates an object model based on the user-defined logical associations, used for reviewing each concept contained in the electronic information in order to determine whether the electronic information is classified. The data analysis and classification system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system cannot find the electronic information token in the hash table, that token is added to a missing terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.

  18. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  19. Informal Learning of Social Workers: A Method of Narrative Inquiry

    ERIC Educational Resources Information Center

    Gola, Giancarlo

    2009-01-01

    Purpose: The purpose of this paper is to investigate social workers' processes of informal learning, through their narration of their professional experience, in order to understand how social workers learn. Informal learning is any individual practice or activity that is able to produce continuous learning; it is often non-intentional and…

  20. Impact of NDE reliability developments on risk-informed methods

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk informed inspection procedures are being developed to more effectively and economically manage degradation in plant piping systems. A key element of this process is applying nondestructive examination (NDE) procedures capable of detecting specific damage mechanisms that may be operative in particular locations. Thus, the needs of risk informed analysis are closely coupled with a firm understanding of the capability of NDE.

  1. Graph theory-based measures as predictors of gene morbidity.

    PubMed

    Massanet-Vila, Raimon; Caminal, Pere; Perera, Alexandre

    2010-01-01

    Previous studies have suggested that some graph properties of protein interaction networks might be related with gene morbidity. In particular, it has been suggested that when a polymorphism affects a gene, it is more likely to produce a disease if the node degree in the interaction network is higher than for other genes. However, these results do not take into account the possible bias introduced by the variance in the amount of information available for different genes. This work models the relationship between the morbidity associated with a gene and the degrees of the nodes in the protein interaction network controlling the amount of information available in the literature. A set of 7461 genes and 3665 disease identifiers reported in the Online Mendelian Inheritance in Man (OMIM) was mined jointly with 9630 nodes and 38756 interactions of the Human Proteome Resource Database (HPRD). The information available from a gene was measured through PubMed mining. Results suggest that the correlation between the degree of a node in the protein interaction network and its morbidity is largely contributed by the information available from the gene. Even though the results suggest a positive correlation between the degree of a node and its morbidity while controlling the information factor, we believe this correlation has to be taken with caution for it can be affected by other factors not taken into account in this study. PMID:21096114
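
    A minimal sketch, on simulated data, of the "control for the amount of information" idea: the partial correlation between node degree and morbidity given publication count, obtained by correlating least-squares residuals. The data-generating coefficients are arbitrary assumptions, not OMIM/HPRD values.

```python
# Partial correlation of degree and morbidity controlling for literature volume (simulated data).
import numpy as np

rng = np.random.default_rng(2)
n = 7461
pubs = rng.lognormal(mean=2.0, sigma=1.0, size=n)           # literature volume per gene (assumed)
degree = 2 + 0.8 * pubs + rng.normal(0, 3, n)               # degree partly driven by pubs
morbidity = (0.3 * pubs + 0.05 * degree + rng.normal(0, 2, n) > 2.5).astype(float)

def residuals(y, x):
    """Residuals of y after an ordinary least-squares fit on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

raw_r = np.corrcoef(degree, morbidity)[0, 1]
partial_r = np.corrcoef(residuals(degree, pubs), residuals(morbidity, pubs))[0, 1]
print(raw_r, partial_r)   # the partial correlation is typically much smaller than the raw one
```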

  2. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  3. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  4. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP...

  5. Study protocol: a randomised controlled trial of a theory-based online intervention to improve sun safety among Australian adults

    PubMed Central

    2014-01-01

    Background The effects of exposure to ultraviolet radiation are a significant concern in Australia which has one of the highest incidences of skin cancer in the world. Despite most skin cancers being preventable by encouraging consistent adoption of sun-protective behaviours, incidence rates are not decreasing. There is a dearth of research examining the factors involved in engaging in sun-protective behaviours. Further, online multi-behavioural theory-based interventions have yet to be explored fully as a medium for improving sun-protective behaviour in adults. This paper presents the study protocol of a randomised controlled trial of an online intervention based on the Theory of Planned Behaviour (TPB) that aims to improve sun safety among Australian adults. Methods/Design Approximately 420 adults aged 18 and over and predominantly from Queensland, Australia, will be recruited and randomised to the intervention (n = 200), information only (n = 200) or the control group (n = 20). The intervention focuses on encouraging supportive attitudes and beliefs toward sun-protective behaviour, fostering perceptions of normative support for sun protection, and increasing perceptions of control/self-efficacy over sun protection. The intervention will be delivered online over a single session. Data will be collected immediately prior to the intervention (Time 1), immediately following the intervention (Time 1b), and one week (Time 2) and one month (Time 3) post-intervention. Primary outcomes are intentions to sun protect and sun-protective behaviour. Secondary outcomes are the participants’ attitudes toward sun protection, perceptions of normative support for sun protection (i.e. subjective norms, group norms, personal norms and image norms) and perceptions of control/self-efficacy toward sun protection. Discussion The study will contribute to an understanding of the effectiveness of a TPB-based online intervention to improve Australian adults’ sun

  6. Institutionalizing Retention Activity: Toward a Theory-Based Model.

    ERIC Educational Resources Information Center

    Saunders, Martha Dunagin

    2003-01-01

    Examines Appreciative Inquiry, a relatively new approach to organizational change and growth, as a method for institutionalizing retention activity. Results of a case study in a college of arts and sciences suggest the method to be effective in creating a shared vision for the organization, energized participants, improved morale, and increased…

  7. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method for establishing trust at the bootload level in a cloud computing environment. A game-theoretic approach is proposed for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics, such as execution time, accuracy, error identification, and undecidability of the resources, were considered. PMID:26380365

  8. Informativeness Improvement of Hardness Test Methods for Metal Product Assessment

    NASA Astrophysics Data System (ADS)

    Osipov, S.; Podshivalov, I.; Osipov, O.; Zhantybaev, A.

    2016-06-01

    The paper presents a combination of theoretical suggestions, results, and observations that improve the informativeness of the hardness testing process in solving problems of metal product assessment during operation. The hardness value of a metal surface obtained by a single measurement is considered to be random. Various measures of location and scattering of this random variable were experimentally estimated for a number of test samples using correlation analysis, and their close interaction was studied. It was shown that, in metal assessment, the main informative characteristics of the hardness testing process are the average value and the mean-square deviation, as measures of location and scattering, respectively.

  9. Statistical methods of combining information: Applications to sensor data fusion

    SciTech Connect

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.

  10. Paper Trail: One Method of Information Literacy Assessment

    ERIC Educational Resources Information Center

    Nutefall, Jennifer

    2004-01-01

    Assessing students' information literacy skills can be difficult depending on the involvement of the librarian in a course. To overcome this, librarians created an assignment called the Paper Trail, where students wrote a short essay about their research process and reflected on what they would do differently. Through reviewing and grading these…

  11. Game theory-based mode cooperative selection mechanism for device-to-device visible light communication

    NASA Astrophysics Data System (ADS)

    Liu, Yuxin; Huang, Zhitong; Li, Wei; Ji, Yuefeng

    2016-03-01

    Various patterns of device-to-device (D2D) communication, from Bluetooth to Wi-Fi Direct, are emerging due to the increasing requirements of information sharing between mobile terminals. This paper presents an innovative pattern named device-to-device visible light communication (D2D-VLC) to alleviate the growing traffic problem. However, the occlusion problem is a difficulty in D2D-VLC. This paper proposes a game theory-based solution in which the best-response dynamics and best-response strategies are used to realize a mode-cooperative selection mechanism. This mechanism uses system capacity as the utility function to optimize system performance and selects the optimal communication mode for each active user from three candidate modes. Moreover, the simulation and experimental results show that the mechanism can attain a significant improvement in terms of effectiveness and energy saving compared with the cases where the users communicate via only the fixed transceivers (light-emitting diode and photo diode) or via only D2D.
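
    A toy best-response dynamics sketch of mode-cooperative selection: each user repeatedly switches to the mode that gives it the largest share of that mode's capacity, until no user wants to deviate, i.e. a pure Nash equilibrium of this simplified game. The three modes and their capacities are hypothetical stand-ins, not the paper's capacity-based utility function.

```python
# Best-response dynamics over three candidate communication modes (capacities are assumed).
CAPACITY = {"fixed_downlink": 100.0, "fixed_uplink": 80.0, "d2d": 60.0}

def utility(mode, choices, user):
    """Capacity share the user would get if it switched to `mode`."""
    others = sum(1 for u, m in choices.items() if m == mode and u != user)
    return CAPACITY[mode] / (others + 1)

def best_response_dynamics(n_users=8, max_rounds=100):
    choices = {u: "fixed_downlink" for u in range(n_users)}
    for _ in range(max_rounds):
        changed = False
        for user in choices:
            best = max(CAPACITY, key=lambda m: utility(m, choices, user))
            if utility(best, choices, user) > utility(choices[user], choices, user) + 1e-9:
                choices[user] = best
                changed = True
        if not changed:                      # no profitable deviation: equilibrium reached
            return choices
    return choices

print(best_response_dynamics())
```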

  12. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

    PubMed

    Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

    2012-09-01

    Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change. PMID:24073123

  13. Development and validation of a theory-based multimedia application for educating Persian patients on hemodialysis.

    PubMed

    Feizalahzadeh, Hossein; Tafreshi, Mansoureh Zagheri; Moghaddasi, Hamid; Farahani, Mansoureh A; Khosrovshahi, Hamid Tayebi; Zareh, Zahra; Mortazavi, Fakhrsadat

    2014-05-01

    Although patients on hemodialysis require effective education for self-care, several issues associated with the process raise barriers that make learning difficult. Computer-based education can reduce these problems and improve the quality of education. This study aims to develop and validate a theory-based multimedia application to educate Persian patients on hemodialysis. The study consisted of five phases: (1) content development, (2) prototype development 1, (3) evaluation by users, (4) evaluation by a multidisciplinary group of experts, and (5) prototype development 2. Data were collected through interviews and literature review with open-ended questions and two survey forms that consisted of a five-level scale. In the Results section, patient needs on hemodialysis self-care and related content were categorized into seven sections, including kidney function and failure, hemodialysis, vascular access, nutrition, medication, physical activity, and living with hemodialysis. The application designed includes seven modules consisting of user-controlled small multimedia units. During navigation through this application, the users were provided step-by-step information on self-care. Favorable scores were obtained from evaluations by users and experts. The researchers concluded that this application can facilitate hemodialysis education and learning process for the patients by focusing on their self-care needs using the multimedia design principles. PMID:24642877

  14. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  15. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  16. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  17. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  18. An efficient steganography method for hiding patient confidential information.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed; Nguyen, Hung

    2014-01-01

    This paper deals with the important issue of security and confidentiality of patient information when exchanging or storing medical images. Steganography has recently been viewed as an alternative or complement to cryptography, as existing cryptographic systems are not perfect due to their vulnerability to certain types of attack. We propose in this paper a new steganography algorithm for hiding patient confidential information. It utilizes Pixel Value Differencing (PVD) to identify contrast regions in the image and a Hamming code that embeds 3 secret message bits into 4 bits of the cover image. In order to preserve the content of the region of interest (ROI), the embedding is only performed using the Region of Non-Interest (RONI). PMID:25569937
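
    For illustration, the following sketch shows the classic Hamming-code matrix (syndrome) embedding, which hides 3 message bits per 7 cover bits while flipping at most one bit. It conveys the general idea only; the paper's scheme combines PVD-based region selection with a 3-bits-into-4 Hamming variant and RONI-restricted embedding, which is not reproduced here.

```python
# Standard Hamming(7,4) syndrome embedding: hide 3 bits in 7 cover LSBs with at most one flip.
# This is an illustration of matrix embedding in general, not the paper's exact scheme.
import numpy as np

H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])         # columns are the binary numbers 1..7

def embed(cover_bits, message_bits):
    """Modify at most one of the 7 cover bits so their syndrome equals the message."""
    c = np.array(cover_bits) % 2
    m = np.array(message_bits) % 2
    syndrome = (H @ c) % 2
    diff = (syndrome + m) % 2                  # which syndrome change is needed
    if diff.any():
        col = int("".join(map(str, diff)), 2) - 1   # column index whose value equals `diff`
        c[col] ^= 1
    return c

def extract(stego_bits):
    return (H @ np.array(stego_bits)) % 2

cover = [1, 0, 1, 1, 0, 0, 1]                  # e.g. LSBs of 7 RONI pixels
msg = [1, 0, 1]
stego = embed(cover, msg)
assert list(extract(stego)) == msg
print(stego)
```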

  19. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408

  20. Imaging systems and methods for obtaining and using biometric information

    DOEpatents

    McMakin, Douglas L [Richland, WA; Kennedy, Mike O [Richland, WA

    2010-11-30

    Disclosed herein are exemplary embodiments of imaging systems and methods of using such systems. In one exemplary embodiment, one or more direct images of the body of a clothed subject are received, and a motion signature is determined from the one or more images. In this embodiment, the one or more images show movement of the body of the subject over time, and the motion signature is associated with the movement of the subject's body. In certain implementations, the subject can be identified based at least in part on the motion signature. Imaging systems for performing any of the disclosed methods are also disclosed herein. Furthermore, the disclosed imaging, rendering, and analysis methods can be implemented, at least in part, as one or more computer-readable media comprising computer-executable instructions for causing a computer to perform the respective methods.

  1. Information storage medium and method of recording and retrieving information thereon

    DOEpatents

    Marchant, D. D.; Begej, Stefan

    1986-01-01

    Information storage medium comprising a semiconductor doped with first and second impurities or dopants. Preferably, one of the impurities is introduced by ion implantation. Conductive electrodes are photolithographically formed on the surface of the medium. Information is recorded on the medium by selectively applying a focused laser beam to discrete regions of the medium surface so as to anneal discrete regions of the medium containing lattice defects introduced by the ion-implanted impurity. Information is retrieved from the storage medium by applying a focused laser beam to annealed and non-annealed regions so as to produce a photovoltaic signal at each region.

  2. A High Accuracy Method for Semi-supervised Information Extraction

    SciTech Connect

    Tratz, Stephen C.; Sanfilippo, Antonio P.

    2007-04-22

    Customization to specific domains of dis-course and/or user requirements is one of the greatest challenges for today’s Information Extraction (IE) systems. While demonstrably effective, both rule-based and supervised machine learning approaches to IE customization pose too high a burden on the user. Semi-supervised learning approaches may in principle offer a more resource effective solution but are still insufficiently accurate to grant realistic application. We demonstrate that this limitation can be overcome by integrating fully-supervised learning techniques within a semi-supervised IE approach, without increasing resource requirements.

  3. Treatment of adolescent sexual offenders: theory-based practice.

    PubMed

    Sermabeikian, P; Martinez, D

    1994-11-01

    The treatment of adolescent sexual offenders (ASO) has its theoretical underpinnings in social learning theory. Although social learning theory has been frequently cited in literature, a comprehensive application of this theory, as applied to practice, has not been mapped out. The social learning and social cognitive theories of Bandura appear to be particularly relevant to the group treatment of this population. The application of these theories to practice, as demonstrated in a program model, is discussed as a means of demonstrating how theory-driven practice methods can be developed. PMID:7850605

  4. Extracting ocean surface information from altimeter returns - The deconvolution method

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ernesto; Chapman, Bruce

    1989-01-01

    An evaluation of the deconvolution method for estimating ocean surface parameters from ocean altimeter waveforms is presented. It is shown that this method presents a fast, accurate way of determining the ocean surface parameters from noisy altimeter data. Three parameters may be estimated by using this method, including the altimeter-height error, the ocean-surface standard deviation, and the ocean-surface skewness. By means of a Monte Carlo experiment, an 'optimum' deconvolution algorithm and the accuracies with which the above parameters may be estimated using this algorithm are determined. Then the influence of instrument effects, such as errors in calibration and pointing-angle estimation, on the estimated parameters is examined. Finally, the deconvolution algorithm is used to estimate height and ocean-surface parameters from Seasat data.
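
    A simplified, illustrative frequency-domain deconvolution (not the paper's algorithm): the mean waveform is modelled as the convolution of a known system response with the surface elevation PDF, the PDF is recovered with a regularized Wiener-style division, and its mean, standard deviation, and skewness are read off. All waveforms and parameters here are synthetic assumptions.

```python
# Synthetic illustration of waveform deconvolution and moment estimation.
import numpy as np

t = np.arange(-50, 50, 0.5)                    # range/time axis, arbitrary units
surface_pdf = np.exp(-0.5 * ((t - 1.0) / 3.0) ** 2)   # true surface elevation PDF
surface_pdf /= surface_pdf.sum()

system = np.exp(-0.5 * (t / 2.0) ** 2)         # known point-target (system) response
system /= system.sum()

waveform = np.convolve(surface_pdf, system, mode="same")   # modeled mean return

# frequency-domain (Wiener-style) deconvolution with a small regularizer
S = np.fft.fft(np.fft.ifftshift(system))
W = np.fft.fft(waveform)
est = np.real(np.fft.ifft(W * np.conj(S) / (np.abs(S) ** 2 + 1e-3)))
est = np.clip(est, 0, None)
est /= est.sum()

mean = np.sum(t * est)                         # relates to the height estimate
std = np.sqrt(np.sum((t - mean) ** 2 * est))   # relates to surface standard deviation
skew = np.sum(((t - mean) / std) ** 3 * est)   # surface skewness
print(mean, std, skew)
```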

  5. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

    PubMed Central

    Voth, Elizabeth C; Oelke, Nelly D

    2016-01-01

    Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps

  6. Methods to include foreign information in national evaluations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genomic evaluations (GEBV) with higher reliability often result from including genotypes and phenotypes from foreign bulls in the reference population. Multi-step methods evaluate domestic phenotypes first using only pedigree relationships (EBV), then add foreign data available from multi-trait acro...

  7. Looking inside the black box: a theory-based process evaluation alongside a randomised controlled trial of printed educational materials (the Ontario printed educational message, OPEM) to improve referral and prescribing practices in primary care in Ontario, Canada

    PubMed Central

    Grimshaw, Jeremy M; Zwarenstein, Merrick; Tetroe, Jacqueline M; Godin, Gaston; Graham, Ian D; Lemyre, Louise; Eccles, Martin P; Johnston, Marie; Francis, Jillian J; Hux, Jan; O'Rourke, Keith; Légaré, France; Presseau, Justin

    2007-01-01

    Background Randomised controlled trials of implementation strategies tell us whether (or not) an intervention results in changes in professional behaviour but little about the causal mechanisms that produce any change. Theory-based process evaluations collect data on theoretical constructs alongside randomised trials to explore possible causal mechanisms and effect modifiers. This is similar to measuring intermediate endpoints in clinical trials to further understand the biological basis of any observed effects (for example, measuring lipid profiles alongside trials of lipid lowering drugs where the primary endpoint could be reduction in vascular related deaths). This study protocol describes a theory-based process evaluation alongside the Ontario Printed Educational Message (OPEM) trial. We hypothesize that the OPEM interventions are most likely to operate through changes in physicians' behavioural intentions due to improved attitudes or subjective norms with little or no change in perceived behavioural control. We will test this hypothesis using a well-validated social cognition model, the theory of planned behaviour (TPB) that incorporates these constructs. Methods/design We will develop theory-based surveys using standard methods based upon the TPB for the second and third replications, and survey a subsample of Ontario family physicians from each arm of the trial two months before and six months after the dissemination of the index edition of informed, the evidence based newsletter used for the interventions. In the third replication, our study will converge with the "TRY-ME" protocol (a second study conducted alongside the OPEM trial), in which the content of educational messages was constructed using both standard methods and methods informed by psychological theory. We will modify Dillman's total design method to maximise response rates. Preliminary analyses will initially assess the internal reliability of the measures and use regression to explore the

  8. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity carried out during fiscal year 2014 within the Risk Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.

  9. Parallel implementation of multireference coupled-cluster theories based on the reference-level parallelism

    SciTech Connect

    Brabec, Jiri; Pittner, Jiri; van Dam, Hubertus JJ; Apra, Edoardo; Kowalski, Karol

    2012-02-01

    A novel algorithm for implementing a general type of multireference coupled-cluster (MRCC) theory based on the Jeziorski-Monkhorst exponential Ansatz [B. Jeziorski, H.J. Monkhorst, Phys. Rev. A 24, 1668 (1981)] is introduced. The proposed algorithm utilizes processor groups to calculate the equations for the MRCC amplitudes. In the basic formulation, each processor group constructs the equations related to a specific subset of references. By a flexible choice of processor groups and of the subset of reference-specific sufficiency conditions designated to a given group, one can assure optimum utilization of available computing resources. The performance of this algorithm is illustrated on the examples of the Brillouin-Wigner and Mukherjee MRCC methods with singles and doubles (BW-MRCCSD and Mk-MRCCSD). A significant improvement in scalability and in reduction of time to solution is reported with respect to a recently reported parallel implementation of the BW-MRCCSD formalism [J. Brabec, H.J.J. van Dam, K. Kowalski, J. Pittner, Chem. Phys. Lett. 514, 347 (2011)].
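
    A schematic of the reference-level (processor-group) parallelism, sketched with mpi4py rather than the actual implementation: MPI ranks are split into groups, and each group is assigned its own subset of references. The group count and reference list are hypothetical placeholders.

```python
# Schematic of processor-group parallelism over references (not the NWChem/MRCC code).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_groups = 4                                   # hypothetical number of processor groups
references = list(range(20))                   # hypothetical model-space references

color = rank % n_groups                        # which processor group this rank joins
group_comm = comm.Split(color=color, key=rank)

my_refs = references[color::n_groups]          # references assigned to this group
for ref in my_refs:
    # each group would build and solve the reference-specific amplitude equations here,
    # using group_comm for the collectives confined to the group
    pass

group_comm.Free()
```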

  10. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

    PubMed

    Cowin, Stephen C; Cardoso, Luis

    2012-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  11. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  12. ROOM: A recursive object oriented method for information systems development

    SciTech Connect

    Thelliez, T.; Donahue, S.

    1994-02-09

    Although complementary for the development of complex systems, top-down structured design and the object-oriented approach are still treated as opposed rather than integrated. As the complexity of systems continues to grow, and the so-called software crisis remains unsolved, it is urgent to provide a framework combining the two paradigms. This paper presents an attempt in this direction through our Recursive Object-Oriented Method (ROOM), in which a top-down approach divides the complexity of the system and an object-oriented method studies a given level of abstraction. Illustrating this recursive scheme with a simple example, we demonstrate that it achieves the goal of creating loosely coupled and reusable components.

  13. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

    PubMed Central

    Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

    Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior, and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated population attributable fractions for 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief "I could quit smoking if my husband or significant other recommended it" suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
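
    A simplified sketch, on simulated data, of the ingredients involved: inverse-probability weights from a model of the belief given socioeconomic covariates, an IPW-standardized intention rate under "everyone holds the belief", and the implied improvement over the observed rate. The paper's exact marginal-structural-model PAF estimator may differ; all numbers below are synthetic.

```python
# IPW-standardized comparison of intention rates (simulated stand-in for the survey data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 500
ses = rng.normal(size=(n, 2))                            # age / income style covariates (assumed)
belief = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * ses[:, 0] - 0.3 * ses[:, 1]))))
intent = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 1.2 * belief + 0.4 * ses[:, 0]))))

# propensity of holding the belief given covariates -> stabilized inverse probability weights
ps = LogisticRegression().fit(ses, belief).predict_proba(ses)[:, 1]
w = np.where(belief == 1, belief.mean() / ps, (1 - belief.mean()) / (1 - ps))

p_obs = intent.mean()
p_if_all_believed = np.average(intent[belief == 1], weights=w[belief == 1])
print(p_obs, p_if_all_believed, p_if_all_believed - p_obs)
```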

  14. a Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    NASA Astrophysics Data System (ADS)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from TM images when true information is lost because of cloud cover and missing data stripes. Since water is continuously distributed under natural conditions, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different disturbances due to clouds and missing data stripes are simulated. Water information is extracted using global histogram matching, local histogram matching, and the probability-based statistical method on the simulated images. Experiments show that this method achieves a smaller Areal Error and higher Boundary Recall than the conventional methods.
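
    A crude illustration of the spatial-continuity idea behind a probability-based fill (not the paper's estimator): a pixel inside a missing stripe is labelled water when the proportion of water among valid pixels in its neighbourhood exceeds a threshold. The synthetic lake, stripe geometry, window size and threshold are all assumptions.

```python
# Neighbourhood-probability fill of a missing data stripe in a synthetic water mask.
import numpy as np

water = np.zeros((60, 60), dtype=bool)
water[20:40, 10:50] = True                    # a synthetic lake
valid = np.ones_like(water)
valid[:, 28:32] = False                       # a missing data stripe

filled = water.copy()
half = 3                                      # neighbourhood half-width (assumed)
for i, j in zip(*np.where(~valid)):
    win_w = water[max(i - half, 0): i + half + 1, max(j - half, 0): j + half + 1]
    win_v = valid[max(i - half, 0): i + half + 1, max(j - half, 0): j + half + 1]
    if win_v.any():
        # probability of water estimated from the valid pixels in the window
        filled[i, j] = win_w[win_v].mean() > 0.5

print(filled[:, 28:32].sum(), water[20:40, 28:32].size)
```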

  15. Theory based design and optimization of materials for spintronics applications

    NASA Astrophysics Data System (ADS)

    Xu, Tianyi

    The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we will focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant-magneto-resistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We will present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of particular interest. We will propose a rational way to find half-metals based on the gap theorem. Then we will focus on the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems will be covered. Then we will calculate the transport properties of certain junctions with the help of Green's functions under the Landauer-Büttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device. We can calculate it using a method based on extended Hückel tight-binding (EHTB) theory. The symmetry filtering effect is very helpful for finding materials for TMR junctions; based upon it, we find a good candidate material, MnAl, for TMR applications.

  16. An Information Theory-Based Approach to Assessing the Sustainability and Stability of an Island System

    EPA Science Inventory

    It is well-documented that a sustainable system is based on environmental stewardship, economic viability and social equity. What is often overlooked is the need for continuity such that desirable system behavior is maintained with mechanisms in place that facilitate the ability ...

  17. Successful Aging with Sickle Cell Disease: Using Qualitative Methods to Inform Theory

    PubMed Central

    Jenerette, Coretta M.; Lauderdale, Gloria

    2009-01-01

    Little is known about the lives of adults with sickle cell disease (SCD). This article reports findings from a qualitative pilot study, which used life review as a method to explore influences on health outcomes among middle-aged and older adults with SCD. Six females with SCD, recruited from two urban sickle cell clinics in the U.S., engaged in semi-structured, in-depth life review interviews. MaxQDA2 software was used for qualitative data coding and analysis. Three major themes were identified: vulnerability factors, self-care management resources, and health outcomes. These themes are consistent with the Theory of Self-Care Management for Sickle Cell Disease. Identifying vulnerability factors, self-care management resources, and health outcomes in adults with SCD may aid in developing theory-based interventions to meet the health care needs of younger individuals with SCD. The life review process is a useful means to gain insight into successful aging with SCD and other chronic illnesses. PMID:19838320

  18. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under... production, processing, refining, transportation by pipeline, or distribution (at other than the retail...

  19. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under... production, processing, refining, transportation by pipeline, or distribution (at other than the retail...

  20. Improving Diabetes care through Examining, Advising, and prescribing (IDEA): protocol for a theory-based cluster randomised controlled trial of a multiple behaviour change intervention aimed at primary healthcare professionals

    PubMed Central

    2014-01-01

    Background New clinical research findings may require clinicians to change their behaviour to provide high-quality care to people with type 2 diabetes, likely requiring them to change multiple different clinical behaviours. The present study builds on findings from a UK-wide study of theory-based behavioural and organisational factors associated with prescribing, advising, and examining consistent with high-quality diabetes care. Aim To develop and evaluate the effectiveness and cost of an intervention to improve multiple behaviours in clinicians involved in delivering high-quality care for type 2 diabetes. Design/methods We will conduct a two-armed cluster randomised controlled trial in 44 general practices in the North East of England to evaluate a theory-based behaviour change intervention. We will target improvement in six underperformed clinical behaviours highlighted in quality standards for type 2 diabetes: prescribing for hypertension; prescribing for glycaemic control; providing physical activity advice; providing nutrition advice; providing on-going education; and ensuring that feet have been examined. The primary outcome will be the proportion of patients appropriately prescribed and examined (using anonymised computer records), and advised (using anonymous patient surveys) at 12 months. We will use behaviour change techniques targeting motivational, volitional, and impulsive factors that we have previously demonstrated to be predictive of multiple health professional behaviours involved in high-quality type 2 diabetes care. We will also investigate whether the intervention was delivered as designed (fidelity) by coding audiotaped workshops and interventionist delivery reports, and operated as hypothesised (process evaluation) by analysing responses to theory-based postal questionnaires. In addition, we will conduct post-trial qualitative interviews with practice teams to further inform the process evaluation, and a post-trial economic analysis to

  1. Explanation of Second-Order Asymptotic Theory Via Information Spectrum Method

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito

    We explain second-order asymptotic theory via the information spectrum method. From a unified viewpoint based on the generality of the information spectrum method, we consider second-order asymptotic theory for use in fixed-length data compression, uniform random number generation, and channel coding. Additionally, we discuss its application to quantum cryptography, folklore in source coding, and security analysis.
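    The second-order expansions treated by this kind of analysis typically take the following standard form (a representative statement of the general results, not a quotation from the paper):

```latex
% C: channel capacity; V: channel dispersion; H(X): source entropy; V_s: varentropy of the
% source; \Phi^{-1}: inverse standard normal CDF; \varepsilon: tolerated error probability.
\[
  \log M^{*}(n,\varepsilon) = nC + \sqrt{nV}\,\Phi^{-1}(\varepsilon) + o(\sqrt{n})
  \quad\text{(channel coding)},
\]
\[
  R^{*}(n,\varepsilon) = nH(X) + \sqrt{nV_{s}}\,\Phi^{-1}(\varepsilon) + o(\sqrt{n})
  \quad\text{(fixed-length compression)}.
\]
```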

  2. A Review of Web Information Seeking Research: Considerations of Method and Foci of Interest

    ERIC Educational Resources Information Center

    Martzoukou, Konstantina

    2005-01-01

    Introduction: This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background: Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of…

  3. “Please Don’t Send Us Spam!” A Participative, Theory-Based Methodology for Developing an mHealth Intervention

    PubMed Central

    2016-01-01

    Background Mobile health solutions have the potential of reducing burdens on health systems and empowering patients with important information. However, there is a lack of theory-based mHealth interventions. Objective The purpose of our study was to develop a participative, theory-based, mobile phone, audio messaging intervention attractive to recently circumcised men at voluntary medical male circumcision (VMMC) clinics in the Cape Town area in South Africa. We aimed to shift some of the tasks related to postoperative counselling on wound management and goal setting on safe sex. We place an emphasis on describing the full method of message generation to allow for replication. Methods We developed an mHealth intervention using a staggered qualitative methodology: (1) focus group discussions with 52 recently circumcised men and their partners to develop initial voice messages they felt were relevant and appropriate, (2) thematic analysis and expert consultation to select the final messages for pilot testing, and (3) cognitive interviews with 12 recent VMMC patients to judge message comprehension and rank the messages. Message content and phasing were guided by the theory of planned behavior and the health action process approach. Results Patients and their partners came up with 245 messages they thought would help men during the wound-healing period. Thematic analysis revealed 42 different themes. Expert review and cognitive interviews with more patients resulted in 42 messages with a clear division in terms of needs and expectations between the initial wound-healing recovery phase (weeks 1–3) and the adjustment phase (weeks 4–6). Discussions with patients also revealed potential barriers to voice messaging, such as lack of technical knowledge of mobile phones and concerns about the invasive nature of the intervention. Patients’ own suggested messages confirmed Ajzen’s theory of planned behavior that if a health promotion intervention can build trust and be

  4. Dynamic stepping information process method in mobile bio-sensing computing environments.

    PubMed

    Lee, Tae-Gyu; Lee, Seong-Hoon

    2014-01-01

    Recently, interest in human longevity free from disease has converged into a single system framework, along with the development of mobile computing environments, the diversification of remote medical systems, and the aging of society. Such a converged system enables the implementation of a bioinformatics system that provides various supplementary information services by sensing and gathering the health conditions and bio-information of mobile users to build up medical information. Existing bio-information systems perform a static, fixed process that does not change once the bio-information process defined at initial system configuration has been executed. However, such a static process is ineffective for mobile bio-information systems that perform mobile computing. In particular, configuring the process of a bio-information system or changing its method entails the inconvenient duty of defining and executing a new initialization. This study proposes a dynamic process design and execution method to overcome such ineffective processes. PMID:24704651

  5. Preventing Postpartum Smoking Relapse Among Inner City Women: Development of a Theory-Based and Evidence-Guided Text Messaging Intervention

    PubMed Central

    Wen, Kuang-Yi; Kilby, Linda; Fleisher, Linda; Belton, Tanisha D; Roy, Gem; Hernandez, Enrique

    2014-01-01

    Background Underserved women are at high risk for smoking relapse after childbirth due to their unique socioeconomic and postpartum stressors and barriers. Mobile text messaging technology allows delivery of relapse prevention programs targeted to their personal needs over time. Objective To describe the development of a social-cognitive theory-based and evidence-guided text messaging intervention for preventing postpartum smoking relapse among inner city women. Methods Guided by the cognitive-social health information processing framework, user-centered design, and health communication best practices, the intervention was developed through a systematic process that included needs assessment, followed by an iterative cycling through message drafting, health literacy evaluation and rewriting, review by target community members and a scientific advisory panel, and message revision, concluding with usability testing. Results All message content was theory-grounded, derived by needs assessment analysis and evidence-based materials, reviewed and revised by the target population, health literacy experts, and scientific advisors. The final program, “Txt2Commit,” was developed as a fully automated system, designed to deliver 3 proactive messages per day for a 1-month postpartum smoking relapse intervention, with crave and lapse user-initiated message functions available when needed. Conclusions The developmental process suggests that the application of theory and best practices in the design of text messaging smoking cessation interventions is not only feasible but necessary for ensuring that the interventions are evidence based and user-centered. PMID:24698804

  6. A method for extracting task-oriented information from biological text sources.

    PubMed

    Kuttiyapillai, Dhanasekaran; Rajeswari, R

    2015-01-01

    A method for information extraction which processes unstructured data from a document collection has been introduced. A dynamic programming technique, adopted to find the longest and most accurate relevant genes from sequences, is used for finding matching sequences and identifying the effects of various factors. The proposed method can handle complex information sequences which have different meanings in different situations, eliminating irrelevant information. The text contents were pre-processed using a general-purpose method and passed through an entity tagging component. The bottom-up scanning of key-value pairs improves content finding and generates sequences relevant to the testing task. This paper highlights a context-based extraction method for extracting food safety information, which is identified from articles, guideline documents and laboratory results. The graphical disease model verifies weak components through utilisation of a development data set. This improves the accuracy of information retrieval in biological text analysis and reporting applications. PMID:26510293

  7. A theory-based online health behavior intervention for new university students: study protocol

    PubMed Central

    2013-01-01

    Background Too few young people engage in behaviors that reduce the risk of morbidity and premature mortality, such as eating healthily, being physically active, drinking sensibly and not smoking. The present research developed an online intervention to target these health behaviors during the significant life transition from school to university when health beliefs and behaviors may be more open to change. This paper describes the intervention and the proposed approach to its evaluation. Methods/design Potential participants (all undergraduates about to enter the University of Sheffield) will be emailed an online questionnaire two weeks before starting university. On completion of the questionnaire, respondents will be randomly assigned to receive either an online health behavior intervention (U@Uni) or a control condition. The intervention employs three behavior change techniques (self-affirmation, theory-based messages, and implementation intentions) to target four health behaviors (alcohol consumption, physical activity, fruit and vegetable intake, and smoking). Subsequently, all participants will be emailed follow-up questionnaires approximately one and six months after starting university. The questionnaires will assess the four targeted behaviors and associated cognitions (e.g., intentions, self-efficacy) as well as socio-demographic variables, health status, Body Mass Index (BMI), health service use and recreational drug use. A sub-sample of participants will provide a sample of hair to assess changes in biochemical markers of health behavior. A health economic evaluation of the cost effectiveness of the intervention will also be conducted. Discussion The findings will provide evidence on the effectiveness of online interventions as well as the potential for intervening during significant life transitions, such as the move from school to university. If successful, the intervention could be employed at other universities to promote healthy behaviors among new

  8. Increasing smoke alarm operability through theory-based health education: a randomised trial

    PubMed Central

    Miller, Ted R; Bergen, Gwen; Ballesteros, Michael F; Bhattacharya, Soma; Gielen, Andrea Carlson; Sheppard, Monique S

    2015-01-01

    Background Although working smoke alarms halve deaths in residential fires, many households do not keep alarms operational. We tested whether theory-based education increases alarm operability. Methods Randomised multiarm trial, with a single arm randomly selected for use each day, in low-income neighbourhoods in Maryland, USA. Intervention arms: (1) Full Education combining a health belief module with a social-cognitive theory module that provided hands-on practice installing alarm batteries and using the alarm’s hush button; (2) Hands-on Practice social-cognitive module supplemented by typical fire department education; (3) Current Norm receiving typical fire department education only. Four hundred and thirty-six homes recruited through churches or by knocking on doors in 2005–2008. Follow-up visits checked alarm operability in 370 homes (85%) 1–3.5 years after installation. Main outcome measures: number of homes with working alarms defined as alarms with working batteries or hard-wired and number of working alarms per home. Regressions controlled for alarm status preintervention, demographics, and beliefs about fire risks and alarm effectiveness. Results Homes in the Full Education and Practice arms were more likely to have a functioning smoke alarm at follow-up (OR=2.77, 95% CI 1.09 to 7.03) and had an average of 0.32 more working alarms per home (95% CI 0.09 to 0.56). Working alarms per home rose 16%. Full Education and Practice had similar effectiveness (p=0.97 on both outcome measures). Conclusions Without exceeding typical fire department installation time, installers can achieve greater smoke alarm operability. Hands-on practice is key. Two years after installation, for every three homes that received hands-on practice, one had an additional working alarm. Trial registration number http://www.clinicaltrials.gov number NCT00139126. PMID:25165090

  9. A fuzzy clustering vessel segmentation method incorporating line-direction information

    NASA Astrophysics Data System (ADS)

    Wang, Zhimin; Xiong, Wei; Huang, Weimin; Zhou, Jiayin; Venkatesh, Sudhakar K.

    2012-02-01

    A data-clustering-based vessel segmentation method is proposed for automatic liver vasculature segmentation in CT images. It consists of a novel similarity measure which incorporates spatial context, vesselness information and line-direction information in a unique way. By combining the line-direction information and spatial information in the data clustering process, the proposed method is able to preserve the fine details of the vessel tree while suppressing image noise and artifacts. The proposed algorithm has been evaluated on real clinical contrast-enhanced CT images and achieved excellent segmentation accuracy without any experimentally set parameters.
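    A minimal sketch of how such a combined dissimilarity might look is given below; it is an assumed form for illustration, not the authors' actual measure. Intensity difference, vesselness difference and the angle between local line directions are mixed into one dissimilarity that a fuzzy c-means style clustering could use.

```python
# Illustrative combined dissimilarity (assumed form, not the paper's measure).
import numpy as np

def combined_distance(feat_a, feat_b, w_int=1.0, w_ves=1.0, w_dir=1.0):
    """feat = (intensity, vesselness, direction_vector); direction vectors are unit 3-vectors."""
    d_int = (feat_a[0] - feat_b[0]) ** 2
    d_ves = (feat_a[1] - feat_b[1]) ** 2
    cos_angle = abs(np.dot(feat_a[2], feat_b[2]))   # |cos| so opposite directions still match
    d_dir = 1.0 - cos_angle                         # 0 when the local lines are parallel
    return w_int * d_int + w_ves * d_ves + w_dir * d_dir

# Toy usage: two voxels with similar intensity/vesselness and nearly parallel directions.
a = (0.82, 0.9, np.array([1.0, 0.0, 0.0]))
b = (0.80, 0.8, np.array([0.99, 0.14, 0.0]) / np.linalg.norm([0.99, 0.14, 0.0]))
print(combined_distance(a, b))
```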

  10. A research on scenic information prediction method based on RBF neural network

    NASA Astrophysics Data System (ADS)

    Li, Jingwen; Yin, Shouqiang; Wang, Ke

    2015-12-01

    With the rapid development of smart ("wisdom") tourism, it conforms to this trend to predict scenic-area information by scientific methods. Using the strong nonlinear fitting ability of RBF neural networks [1-2], the article builds a prediction and inference method for the comprehensive information of a scenic area's complex geographic time, space and attributes, through hyper-surface organization of the scenic geographic entity information [3]. The Guilin scenic area is used as an example to illustrate the whole information forecasting process.
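    A minimal RBF-network regression sketch of the kind such prediction relies on is shown below; it is illustrative only (centers from k-means, Gaussian basis functions, output weights by least squares) and does not reproduce the paper's hyper-surface data organization. The toy series stands in for a scenic-area quantity such as visitor counts.

```python
# Minimal RBF-network regression sketch (illustrative, not the paper's model).
import numpy as np
from sklearn.cluster import KMeans

class RBFNet:
    def __init__(self, n_centers=10, width=1.0):
        self.n_centers, self.width = n_centers, width

    def _phi(self, X):
        # Gaussian activation of each sample with respect to each center
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2 * self.width ** 2))

    def fit(self, X, y):
        self.centers_ = KMeans(n_clusters=self.n_centers, n_init=10,
                               random_state=0).fit(X).cluster_centers_
        Phi = np.hstack([self._phi(X), np.ones((len(X), 1))])   # add a bias column
        self.w_, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return self

    def predict(self, X):
        Phi = np.hstack([self._phi(X), np.ones((len(X), 1))])
        return Phi @ self.w_

# Toy usage: one-step-ahead prediction from a lag window of 3 values.
series = np.sin(np.linspace(0, 20, 200)) + 0.05 * np.random.randn(200)
X = np.array([series[i:i + 3] for i in range(len(series) - 3)])
y = series[3:]
model = RBFNet(n_centers=15, width=0.5).fit(X[:150], y[:150])
predictions = model.predict(X[150:])
```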

  11. Inter-instrumental method transfer of chiral capillary electrophoretic methods using robustness test information.

    PubMed

    De Cock, Bart; Borsuk, Agnieszka; Dejaegher, Bieke; Stiens, Johan; Mangelings, Debby; Vander Heyden, Yvan

    2014-08-01

    Capillary electrophoresis (CE) is an electrodriven separation technique that is often used for the separation of chiral molecules. Advantages of CE are its flexibility, low cost and efficiency. On the other hand, the precision and transfer of CE methods are well-known problems of the technique. Reasons for the more complicated method transfer are the more diverse instrumental differences, such as total capillary lengths and capillary cooling systems, and the higher response variability of CE methods compared to other techniques, such as high-performance liquid chromatography (HPLC). Therefore, a larger systematic change in peak resolutions, migration times and peak areas, with a loss of separation and efficiency, may be seen when a CE method is transferred to another laboratory or another type of instrument. A swift and successful method transfer is required because development and routine use of analytical methods are usually not performed in the same laboratory and/or on the same type of equipment. The aim of our study was to develop transfer rules to facilitate CE method transfers between different laboratories and instruments. In our case study, three β-blockers were chirally separated and inter-instrumental transfers were performed. The first step of our study was to optimise the precision of the chiral CE method. Next, a robustness test was performed to identify the instrumental and experimental parameters that most influenced the considered responses. The precision and robustness study results were used to adapt instrumental and/or method settings to improve the transfer between different instruments. Finally, the comparison of adapted and non-adapted transfers allowed us to derive some rules to facilitate CE method transfers. PMID:24931445

  12. A formal, mathematics oriented method for identifying security risks in information systems.

    PubMed

    van Piggelen, H U

    1997-01-01

    IT security presently lacks the benefits of physics, where certain unifying grand principles can be applied. The aim of the method is to provide a technology-independent way of identifying the components of a system in general, and of information systems in particular. The need for the proposed method derives from the ad hoc character of the theories used in present formal security textbooks, none of which can give the user any guarantee of completeness. The new method is scientifically derived, presented, explained and applied to several interesting topics in the field of health care information systems. Some simple mathematical formulae can be introduced. PMID:10179535

  13. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  14. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  15. Theory-Based Interactive Mathematics Instruction: Development and Validation of Computer-Video Modules.

    ERIC Educational Resources Information Center

    Henderson, Ronald W.; And Others

    Theory-based prototype computer-video instructional modules were developed to serve as an instructional supplement for students experiencing difficulty in learning mathematics, with special consideration given to students underrepresented in mathematics (particularly women and minorities). Modules focused on concepts and operations for factors,…

  16. A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.

    ERIC Educational Resources Information Center

    Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.

    Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…

  17. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  18. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  19. Liminality in cultural transition: applying ID-EA to advance a concept into theory-based practice.

    PubMed

    Baird, Martha B; Reed, Pamela G

    2015-01-01

    As global migration increases worldwide, nursing interventions are needed to address the effects of migration on health. The concept of liminality emerged as a pivotal concept in the situation-specific theory of well-being in refugee women experiencing cultural transition. As a relatively new concept in the discipline of nursing, liminality is explored using a method, called ID-EA, which we developed to advance a theoretical concept for application to nursing practice. Liminality in the context of cultural transition is further developed using the five steps of inquiry of the ID-EA method. The five steps are as follows: (1) inductive inquiry: qualitative research, (2) deductive inquiry: literature review, (3) synthesis of inductive and deductive inquiry, (4) evaluation inquiry, and (5) application-to-practice inquiry. The overall goal of this particular work was to develop situation-specific, theory-based interventions that facilitate cultural transitions for immigrants and refugees. PMID:25799694

  20. Genetic Algorithm and Graph Theory Based Matrix Factorization Method for Online Friend Recommendation

    PubMed Central

    Li, Qu; Yang, Jianhua; Xu, Ning

    2014-01-01

    Online friend recommendation is a fast-developing topic in web mining. In this paper, we used SVD matrix factorization to model user and item feature vectors and used stochastic gradient descent to update the parameters and improve accuracy. To tackle the cold-start problem and data sparsity, we used a KNN model to influence the user feature vectors. At the same time, we used graph theory to partition communities with fairly low time and space complexity. Moreover, matrix factorization can combine online and offline recommendation. Experiments showed that the hybrid recommendation algorithm is able to recommend online friends with good accuracy. PMID:24757410
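    The core latent-factor update the abstract refers to can be sketched as follows. This is a generic SGD matrix factorization, not the authors' code; the KNN influence and community-partition steps are omitted, and the toy ratings are invented.

```python
# Generic latent-factor matrix factorization trained by SGD (illustrative sketch).
import numpy as np

def sgd_matrix_factorization(ratings, n_users, n_items, k=8,
                             lr=0.01, reg=0.05, epochs=30, seed=0):
    """ratings: list of (user, item, value) triples with 0-based indices."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
    Q = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
    for _ in range(epochs):
        for idx in rng.permutation(len(ratings)):
            u, i, r = ratings[idx]
            err = r - P[u] @ Q[i]                 # prediction error on this rating
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# Toy usage: predicted affinity of user 0 for item 2.
data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0), (2, 2, 5.0)]
P, Q = sgd_matrix_factorization(data, n_users=3, n_items=3, k=2)
print(P[0] @ Q[2])
```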

  1. 49 CFR 1135.2 - Revenue Shortfall Allocation Method: Annual State tax information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Revenue Shortfall Allocation Method: Annual State... RECOVERY PROCEDURES § 1135.2 Revenue Shortfall Allocation Method: Annual State tax information. (a) To enable the Board to calculate the revenue shortfall allocation method (RSAM), which is one of the...

  2. 78 FR 34427 - 2012 Tax Information for Use In The Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... Surface Transportation Board 2012 Tax Information for Use In The Revenue Shortfall Allocation Method... Shortfall Allocation Method (RSAM). DATES: Comments are due by July 9, 2013. If any comment opposing AAR's... Revenue Shortfall Allocation Method, EP 646 (Sub-No. 2) (STB served Nov. 21, 2008). RSAM is intended...

  3. 76 FR 40448 - 2010 Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-08

    ... Surface Transportation Board 2010 Tax Information for Use in the Revenue Shortfall Allocation Method... Allocation Method (RSAM). DATES: Comments are due by August 8, 2011. If any comment opposing AAR's... Shortfall Allocation Method, EP 646 (Sub-No. 2) (STB served Nov. 21, 2008). RSAM is intended to measure...

  4. A method for fast selecting feature wavelengths from the spectral information of crop nitrogen

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Research on a method for quickly selecting feature wavelengths from nitrogen spectral information, which can determine the nitrogen content of crops, is necessary. Based on the uniformity of uniform design, this paper proposed an improved particle swarm optimization (PSO) method. The method can ch...

  5. 78 FR 68076 - Request for Information on Alternative Skin Sensitization Test Methods and Testing Strategies and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ...The Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) is developing a U.S. plan for the evaluation of alternative skin sensitization test methods and testing strategies. The National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) requests information that ICCVAM might use to develop this plan and......

  6. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, C.E.

    1990-07-31

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field. 8 figs.

  7. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  8. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-03-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask layer by layer all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing.

  9. Mathematical, Logical, and Formal Methods in Information Retrieval: An Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost

    2003-01-01

    Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)

  10. Theories and Methods for Research on Informal Learning and Work: Towards Cross-Fertilization

    ERIC Educational Resources Information Center

    Sawchuk, Peter H.

    2008-01-01

    The topic of informal learning and work has quickly become a staple in contemporary work and adult learning research internationally. The narrow conceptualization of work is briefly challenged before the article turns to a review of the historical origins as well as contemporary theories and methods involved in researching informal learning and…

  11. Evaluation of Semantic-Based Information Retrieval Methods in the Autism Phenotype Domain

    PubMed Central

    Hassanpour, Saeed; O’Connor, Martin J.; Das, Amar K.

    2011-01-01

    Biomedical ontologies are increasingly being used to improve information retrieval methods. In this paper, we present a novel information retrieval approach that exploits knowledge specified by the Semantic Web ontology and rule languages OWL and SWRL. We evaluate our approach using an autism ontology that has 156 SWRL rules defining 145 autism phenotypes. Our approach uses a vector space model to correlate how well these phenotypes relate to the publications used to define them. We compare a vector space phenotype representation using class hierarchies with one that extends this method to incorporate additional semantics encoded in SWRL rules. From a PubMed-extracted corpus of 75 articles, we show that the average rank of a related paper using the class hierarchy method is 4.6 whereas the average rank using the extended rule-based method is 3.3. Our results indicate that incorporating rule-based definitions in information retrieval methods can improve search for relevant publications. PMID:22195112
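    A toy version of the vector-space ranking step might look like the sketch below: a phenotype description and candidate abstracts are embedded as TF-IDF vectors and ranked by cosine similarity, and the rank of a known related publication is read off. The corpus, the query and the pipeline details are hypothetical; the study's rule-based semantic extension is not shown.

```python
# Toy vector-space ranking sketch (assumed setup, not the study's pipeline).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

phenotype = "restricted repetitive behaviour and impaired social communication"
abstracts = [
    "We report repetitive behaviours and social communication deficits in children.",
    "A study of liver vasculature segmentation in CT images.",
    "Language development and social interaction in autism spectrum disorder.",
]

vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(abstracts)          # one TF-IDF vector per abstract
query_vec = vec.transform([phenotype])             # TF-IDF vector of the phenotype text

scores = cosine_similarity(query_vec, doc_matrix).ravel()
ranking = np.argsort(-scores)                      # best-matching abstract first
rank_of_doc0 = int(np.where(ranking == 0)[0][0]) + 1   # 1-based rank of abstract 0
print(ranking, rank_of_doc0)
```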

  12. Evaluation of semantic-based information retrieval methods in the autism phenotype domain.

    PubMed

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2011-01-01

    Biomedical ontologies are increasingly being used to improve information retrieval methods. In this paper, we present a novel information retrieval approach that exploits knowledge specified by the Semantic Web ontology and rule languages OWL and SWRL. We evaluate our approach using an autism ontology that has 156 SWRL rules defining 145 autism phenotypes. Our approach uses a vector space model to correlate how well these phenotypes relate to the publications used to define them. We compare a vector space phenotype representation using class hierarchies with one that extends this method to incorporate additional semantics encoded in SWRL rules. From a PubMed-extracted corpus of 75 articles, we show that the average rank of a related paper using the class hierarchy method is 4.6 whereas the average rank using the extended rule-based method is 3.3. Our results indicate that incorporating rule-based definitions in information retrieval methods can improve search for relevant publications. PMID:22195112

  13. An automatic abrupt information extraction method based on singular value decomposition and higher-order statistics

    NASA Astrophysics Data System (ADS)

    He, Tian; Ye, Wu; Pan, Qiang; Liu, Xiandong

    2016-02-01

    One key aspect of local fault diagnosis is how to effectively extract abrupt features from vibration signals. This paper proposes a method to automatically extract abrupt information based on singular value decomposition and higher-order statistics. In order to observe the distribution law of singular values, a numerical analysis is conducted to simulate noise, periodic signals, abrupt signals and their singular value distributions. Based on higher-order statistics and spectrum analysis, a method is built to automatically choose the upper and lower borders of the singular value interval that reflects the abrupt information, and the singular values selected by this method are used to reconstruct the abrupt signals. It is shown that the method obtains accurate results when processing rub-impact fault signals measured in experiments. The analytical and experimental results indicate that the proposed method is feasible for automatically extracting abrupt information caused by faults such as rotor-stator rub-impact.
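    One common way to realize this kind of selection, shown here purely as an assumed sketch rather than the paper's algorithm, is to embed the signal in a Hankel matrix, reconstruct each singular component, and keep only components whose kurtosis (a higher-order statistic) indicates impulsiveness.

```python
# Hedged sketch of the general idea (not the paper's algorithm): Hankel embedding, SVD,
# kurtosis-based selection of impulsive components, and reconstruction of the abrupt part.
import numpy as np
from scipy.stats import kurtosis

def hankel(signal, n_rows):
    n_cols = len(signal) - n_rows + 1
    return np.array([signal[i:i + n_cols] for i in range(n_rows)])

def dehankel(H):
    """Average the anti-diagonals back into a 1-D signal."""
    n_rows, n_cols = H.shape
    out = np.zeros(n_rows + n_cols - 1)
    counts = np.zeros_like(out)
    for i in range(n_rows):
        out[i:i + n_cols] += H[i]
        counts[i:i + n_cols] += 1
    return out / counts

def extract_abrupt(signal, n_rows=50, kurt_threshold=3.0):
    H = hankel(signal, n_rows)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    parts = []
    for k in range(len(s)):
        comp = dehankel(s[k] * np.outer(U[:, k], Vt[k]))
        if kurtosis(comp, fisher=True) > kurt_threshold:   # keep impulsive components only
            parts.append(comp)
    return np.sum(parts, axis=0) if parts else np.zeros_like(signal)

# Toy usage: a periodic signal plus two sharp impulses buried in noise.
t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 50 * t) + 0.2 * np.random.randn(1000)
x[300] += 5.0
x[700] += 5.0
abrupt_part = extract_abrupt(x)
```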

  14. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high-frequency noise in the sensor output. Furthermore, to improve the altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate the measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993
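    The flavour of such an adaptive filter can be conveyed with the simplified sketch below: a 1-D constant-velocity Kalman filter for altitude whose measurement noise variance R is re-estimated online from a window of residuals. The model, tuning values and adaptation rule are assumptions for illustration, not the authors' filter (which uses a wavelet pre-filter and a maximum a posteriori estimator).

```python
# Simplified adaptive Kalman filter sketch (assumed model, not the authors' filter).
import numpy as np

def adaptive_altitude_kf(z, dt=0.02, q=0.5, r0=1.0, window=20):
    """z: 1-D array of altitude measurements; returns filtered altitude estimates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])            # state: [altitude, vertical speed]
    H = np.array([[1.0, 0.0]])
    Q = q * np.array([[dt**4 / 4, dt**3 / 2], [dt**3 / 2, dt**2]])
    x = np.array([z[0], 0.0])
    P = np.eye(2)
    R = r0
    residuals, estimates = [], []
    for zk in z:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = (H @ P @ H.T).item() + R
        K = (P @ H.T) / S                            # Kalman gain, shape (2, 1)
        innovation = zk - (H @ x).item()
        x = x + (K * innovation).ravel()
        P = (np.eye(2) - K @ H) @ P
        # Adapt R from a window of post-update residuals (simple online estimate)
        residuals.append(zk - (H @ x).item())
        if len(residuals) >= window:
            R = np.var(residuals[-window:]) + (H @ P @ H.T).item()
        estimates.append(x[0])
    return np.array(estimates)

# Toy usage: noisy altitude measurements during a slow descent.
true_alt = np.linspace(10.0, 0.0, 500)
measurements = true_alt + 0.3 * np.random.randn(500)
filtered = adaptive_altitude_kf(measurements)
```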

  15. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  16. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, C.E.; McKinney, I.D.

    1988-05-31

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk. 10 figs.

  17. Novel copyright information hiding method based on random phase matrix of Fresnel diffraction transforms

    NASA Astrophysics Data System (ADS)

    Cao, Chao; Chen, Ru-jun

    2009-10-01

    In this paper, we present a new copyright information hiding method for digital images in Moiré fringe formats. The copyright information is embedded into the protected image and the detecting image based on a Fresnel phase matrix. First, using the Fresnel diffraction transform, the random phase matrix of the copyright information is generated. Then, according to the Moiré fringe principle, the protected image and the detecting image are modulated respectively based on the random phase matrix, and the copyright information is embedded into them. When the protected image and the detecting image are overlapped, the copyright information reappears. Experimental results show that our method has good concealment performance and offers a new way to protect copyright.

  18. Assessing Bayesian model averaging uncertainty of groundwater modeling based on information entropy method

    NASA Astrophysics Data System (ADS)

    Zeng, Xiankui; Wu, Jichun; Wang, Dong; Zhu, Xiaobin; Long, Yuqiao

    2016-07-01

    Because of uncertainty in groundwater conceptualization, multi-model methods are usually used, and the corresponding uncertainties are estimated by integrating Markov chain Monte Carlo (MCMC) and Bayesian model averaging (BMA) methods. Generally, the variance method is used to measure the uncertainty of the BMA prediction: the total variance of the ensemble prediction is decomposed into within-model and between-model variances, which represent the uncertainties derived from the parameters and the conceptual models, respectively. However, the uncertainty of a probability distribution cannot be comprehensively quantified by variance alone. A new measuring method based on information entropy theory is proposed in this study. Because the actual BMA process can hardly meet the ideal mutually exclusive, collectively exhaustive condition, BMA predictive uncertainty is decomposed into parameter, conceptual model, and overlapped uncertainties, where the overlapped uncertainty is induced by combining predictions from correlated model structures. In this paper, five simple analytical functions are first used to illustrate the feasibility of the variance and information entropy methods. A discrete distribution example shows that information entropy can be more appropriate than variance for describing between-model uncertainty. Two continuous distribution examples show that the two methods are consistent in measuring a normal distribution, and that information entropy is more appropriate than variance for describing a bimodal distribution. The two examples of BMA uncertainty decomposition demonstrate that the two methods are relatively consistent in assessing the uncertainty of a unimodal BMA prediction, while information entropy is more informative in describing the uncertainty decomposition of a bimodal BMA prediction. Then, based on a synthetic groundwater model, the variance and information entropy methods are used to assess the BMA uncertainty of groundwater modeling. The uncertainty assessments of
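    The contrast between the two uncertainty measures can be illustrated with a small synthetic mixture (a hedged sketch with assumed Gaussian predictive distributions and weights, not the study's models): the variance decomposition and the differential entropy of the same BMA predictive density are computed side by side.

```python
# Synthetic illustration of variance decomposition vs. entropy for a BMA mixture.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])          # assumed posterior model probabilities
means = np.array([1.0, 1.5, 4.0])            # per-model predictive means
sds = np.array([0.4, 0.5, 0.6])              # per-model predictive standard deviations

# Variance decomposition: total = within-model + between-model
mean_bma = np.sum(weights * means)
within = np.sum(weights * sds**2)
between = np.sum(weights * (means - mean_bma)**2)
total_var = within + between

# Differential entropy of the mixture density, by a simple Riemann sum
x = np.linspace(means.min() - 6 * sds.max(), means.max() + 6 * sds.max(), 20000)
pdf = sum(w * np.exp(-(x - m)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
          for w, m, s in zip(weights, means, sds))
entropy = -np.sum(pdf * np.log(pdf + 1e-300)) * (x[1] - x[0])

# A bimodal mixture can have a large variance yet only a moderate entropy.
print(total_var, entropy)
```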

  19. Comparison of high and low intensity contact between secondary and primary care to detect people at ultra-high risk for psychosis: study protocol for a theory-based, cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background The early detection and referral to specialized services of young people at ultra-high risk (UHR) for psychosis may reduce the duration of untreated psychosis and, therefore, improve prognosis. General practitioners (GPs) are usually the healthcare professionals contacted first on the help-seeking pathway of these individuals. Methods/Design This is a cluster randomized controlled trial (cRCT) of primary care practices in Cambridgeshire and Peterborough, UK. Practices are randomly allocated into two groups in order to establish which is the most effective and cost-effective way to identify people at UHR for psychosis. One group will receive postal information about the local early intervention in psychosis service, including how to identify young people who may be in the early stages of a psychotic illness. The second group will receive the same information plus an additional, ongoing theory-based educational intervention with dedicated liaison practitioners to train clinical staff at each site. The primary outcome of this trial is count data over a 2-year period: the yield - number of UHR for psychosis referrals to a specialist early intervention in psychosis service - per primary care practice. Discussion There is little guidance on the essential components of effective and cost-effective educational interventions in primary mental health care. Furthermore, no study has demonstrated an effect of a theory-based intervention to help GPs identify young people at UHR for psychosis. This study protocol is underpinned by a robust scientific rationale that intends to address these limitations. Trial registration Current Controlled Trials ISRCTN70185866 PMID:23866815

  20. Development and Content Validation of the Information Assessment Method for Patients and Consumers

    PubMed Central

    Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan LM; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-01-01

    Background Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. Objective We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Methods Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. Results The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded

  1. Mixed Methods Research of Adult Family Care Home Residents and Informal Caregivers

    ERIC Educational Resources Information Center

    Jeanty, Guy C.; Hibel, James

    2011-01-01

    This article describes a mixed methods approach used to explore the experiences of adult family care home (AFCH) residents and informal caregivers (IC). A rationale is presented for using a mixed methods approach employing the sequential exploratory design with this poorly researched population. The unique challenges attendant to the sampling…

  2. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  3. A Qualitative Study about Performance Based Assesment Methods Used in Information Technologies Lesson

    ERIC Educational Resources Information Center

    Daghan, Gökhan; Akkoyunlu, Buket

    2014-01-01

    In this study, Information Technologies teachers' views and usage cases regarding performance-based assessment methods (PBAMs) are examined. It is aimed to find out which of the PBAMs are used frequently or not used, the reasons for preferring these methods, and opinions about their applicability. The study is designed with the phenomenological design…

  4. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    ERIC Educational Resources Information Center

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  5. 77 FR 23674 - Proposed Information Collection Requests: Measures and Methods for the National Reporting System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-20

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Proposed Information Collection Requests: Measures and Methods for the National Reporting System for Adult... records. Title of Collection: Measures and Methods for the National Reporting System for Adult...

  6. A Method for the Analysis of Information Use in Source-Based Writing

    ERIC Educational Resources Information Center

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  7. A theory-based logic model for innovation policy and evaluation.

    SciTech Connect

    Jordan, Gretchen B.

    2010-04-01

    Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.

  8. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks

    PubMed Central

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  9. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

    PubMed

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  10. Mechanisms of behavioural maintenance: Long-term effects of theory-based interventions to promote safe water consumption.

    PubMed

    Inauen, Jennifer; Mosler, Hans-Joachim

    2016-01-01

    Theory-based interventions can enhance people's safe water consumption, but the sustainability of these interventions and the mechanisms of maintenance remain unclear. We investigated these questions based on an extended theory of planned behaviour. Seven hundred and ten (445 analysed) randomly selected households participated in two cluster-randomised controlled trials in Bangladesh. Study 1 promoted switching to neighbours' arsenic-safe wells, and Study 2 promoted switching to arsenic-safe deep wells. Both studies included two intervention phases. Structured interviews were conducted at baseline (T1), and at 1-month (T2), 2-month (T3) and 9-month (T4) follow-ups. In intervention phase 1 (between T1 and T2), commitment-based behaviour change techniques--reminders, implementation intentions and public commitment--were combined with information and compared to an information-only control group. In phase 2 (between T2 and T3), half of each phase 1 intervention group was randomly assigned to receive either commitment-based techniques once more or coping planning with reminders and information. Initial well-switching rates of up to 60% significantly declined by T4: 38.3% of T2 safe water users stopped consuming arsenic-safe water. The decline depended on the intervention. Perceived behavioural control, intentions, commitment strength and coping planning were associated with maintenance. In line with previous studies, the results indicate that commitment and reminders engender long-term behavioural change. PMID:26304476