Science.gov

Sample records for information theory-based methods

  1. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.

    2016-06-01

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
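
    The selection criterion described in this abstract, maximum information content with minimum redundancy, can be sketched with discrete Shannon entropy and mutual information. The following is an illustrative greedy sketch with toy data; the names, scoring rule, and data are assumptions of this sketch, not the authors' algorithm (which also handles bridge locations).

```python
# Illustrative greedy "maximum information, minimum redundancy" selection.
# NOTE: a sketch of the general criterion only, with toy data; the scoring
# rule and names are assumptions, not the authors' algorithm.
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def mutual_info(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def select_sections(series, k):
    """Greedily pick k series with high entropy and low redundancy."""
    chosen, remaining = [], dict(series)
    while remaining and len(chosen) < k:
        def score(name):
            redundancy = max((mutual_info(series[name], series[c])
                              for c in chosen), default=0.0)
            return entropy(series[name]) - redundancy
        best = max(remaining, key=score)
        chosen.append(best)
        del remaining[best]
    return chosen

# Toy discretized water-level series at three candidate cross sections;
# "s3" duplicates "s1", so it is redundant and is not selected.
data = {"s1": [0, 1, 1, 2, 0, 2], "s2": [2, 2, 0, 1, 1, 0], "s3": [0, 1, 1, 2, 0, 2]}
picked = select_sections(data, 2)   # -> ["s1", "s2"]
```

    Greedy selection is one common way to approximate a max-information/min-redundancy objective; exact subset optimization is combinatorial.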

  2. A reexamination of information theory-based methods for DNA-binding site identification

    PubMed Central

    Erill, Ivan; O'Neill, Michael C

    2009-01-01

    Background: Searching for transcription factor binding sites in genome sequences is still an open problem in bioinformatics. Despite substantial progress, search methods based on information theory remain a standard in the field, even though the full validity of their underlying assumptions has only been tested in artificial settings. Here we use newly available data on transcription factors from different bacterial genomes to make a more thorough assessment of information theory-based search methods.

    Results: Our results reveal that conventional benchmarking against artificial sequence data frequently leads to overestimation of search efficiency. In addition, we find that sequence information by itself is often inadequate and therefore must be complemented by other cues, such as curvature, in real genomes. Furthermore, results on skewed genomes show that methods integrating skew information, such as Relative Entropy, are not effective because their assumptions may not hold in real genomes. The evidence suggests that binding sites tend to evolve towards genomic skew, rather than against it, and to maintain their information content through increased conservation. Based on these results, we identify several misconceptions about information theory as applied to binding sites, such as negative entropy, and we propose a revised paradigm to explain the observed results.

    Conclusion: We conclude that, among information theory-based methods, the most unassuming search methods perform, on average, better than the alternatives, since heuristic corrections to these methods are prone to fail when working on real data. A reexamination of information content in binding sites reveals that information content is a compound measure of search and binding affinity requirements, a fact that has important repercussions for our understanding of binding site evolution. PMID:19210776
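
    The quantity at the centre of this discussion, the information content of a set of aligned binding sites, is conventionally computed as the sum over alignment columns of (2 − H) bits. The following is a minimal sketch with toy sites; Schneider-style small-sample corrections and background-skew adjustments are omitted.

```python
# Information content of a set of aligned DNA binding sites: the sum over
# columns of (2 - H) bits. A minimal sketch with toy sites; small-sample
# corrections and background-skew adjustments are omitted.
from collections import Counter
from math import log2

def information_content(sites):
    """Total information (bits) of equal-length aligned DNA sites."""
    total = 0.0
    for column in zip(*sites):
        n = len(column)
        h = -sum(c / n * log2(c / n) for c in Counter(column).values())
        total += 2.0 - h   # 2 bits is the maximum for a 4-letter alphabet
    return total

# Three fully conserved columns (2 bits each) plus one 3:1 column.
sites = ["TATA", "TATA", "TACA", "TATA"]
ic = information_content(sites)   # ≈ 7.19 bits
```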

  3. A fuzzy-theory-based method for studying the effect of information transmission on nonlinear crowd dispersion dynamics

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lo, Siuming

    2017-01-01

    Emergencies in mass events are related to a variety of factors and processes. An important factor is the transmission of information on danger, which influences nonlinear crowd dynamics during crowd dispersion. Because of the considerable uncertainty in this process, a method is needed to investigate this influence. In this paper, a novel fuzzy-theory-based method is presented to study crowd dynamics under the influence of information transmission. Fuzzy functions and rules are designed for the ambiguous description of human states. Reasonable inference is employed to decide the outputs of decision making, such as pedestrian movement speeds and directions. Through simulation of four-way pedestrian situations, good crowd dispersion phenomena are achieved. Simulation results under different conditions demonstrate that information transmission cannot always induce successful crowd dispersion. This depends on whether decision strategies in response to information on danger are unified and effective, especially in dense crowds. Results also suggest that crowd dispersion is helped by an increase in drift strength at low density, and by an increase in the percentage of pedestrians who, at high density, choose one of the furthest unoccupied Von Neumann neighbors from the dangerous source as the drift direction. Compared with previous work, our comprehensive study deepens the understanding of nonlinear crowd dynamics under the effect of information on danger.

  4. Novel information theory based method for superimposition of lateral head radiographs and cone beam computed tomography images

    PubMed Central

    Jacquet, W; Nyssen, E; Bottenberg, P; de Groen, P; Vande Vannet, B

    2010-01-01

    Objectives: The aim was to introduce a novel alignment criterion, focus mutual information (FMI), for the superimposition of lateral cephalometric radiographs and three-dimensional (3D) cone beam computed tomography images, to assess the alignment characteristics of the new method, and to compare the novel methodology with the region of interest (ROI) approach.

    Methods: An FMI criterion-based methodology was implemented that only requires the approximate indication of stable structures in one single image. The robustness of the method was first addressed in a phantom experiment comparing the new technique with a ROI approach. Two consecutive cephalometric radiographs were then obtained, one before and one after functional twin block application. These images were superimposed using alignment by FMI, focusing in turn on: (1) the cranial base and acoustic meatus, (2) the palatal plane and (3) the mandibular symphysis. The superimposed images were subtracted and coloured. The applicability to cone beam CT (CBCT) is illustrated by the alignment of CBCT images acquired before and after craniofacial surgery.

    Results: The phantom experiment clearly shows superior alignment when compared to the ROI approach (Wilcoxon n = 17, Z = −3.290, and P = 0.001), and robustness with respect to the choice of parameters (one-sample t-test n = 50, t = −12.355, and P < 0.001). The treatment effects are revealed clearly in the subtraction image of well-aligned cephalometric radiographs. The colouring scheme of the subtraction image emphasises the areas of change and visualizes the remodelling of the soft tissue.

    Conclusions: FMI allows for cephalometry without tracing; it avoids the error inherent in the use of landmarks, and the interaction of the practitioner is kept to a minimum. The robustness to focal distribution variations limits the influence of possible examiner inaccuracy. PMID:20395459
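
    The generic quantity underlying MI alignment criteria such as FMI is the histogram-based mutual information between two intensity images, which is maximized when the images are registered. The sketch below computes that generic quantity; the focusing weights that turn plain MI into FMI are not reproduced, and the image data are toy ramps.

```python
# Histogram-based mutual information between two grayscale images, the
# generic criterion underlying MI alignment; the focusing weights of FMI
# itself are not reproduced here. Image data below are toy ramps.
from collections import Counter
from math import log2

def mutual_information(img_a, img_b, bins=8):
    """I(A;B) in bits from co-occurring pixel intensities in 0..255."""
    a = [min(v * bins // 256, bins - 1) for v in img_a]
    b = [min(v * bins // 256, bins - 1) for v in img_b]
    n = len(a)

    def H(xs):
        return -sum(c / n * log2(c / n) for c in Counter(xs).values())

    return H(a) + H(b) - H(list(zip(a, b)))

# A perfectly aligned pair (identical images) attains the maximum,
# here the marginal entropy of 3 bits for a uniform 8-bin histogram.
base = [(7 * i) % 256 for i in range(256)]
mi_self = mutual_information(base, base)   # -> 3.0
```

    In registration, this score would be evaluated over candidate transformations and the transformation maximizing it selected.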

  5. Evaluating hydrological model performance using information theory-based metrics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...

  6. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. In addition, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as is the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
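
    The three classical filter scores named in this abstract (information gain, gain ratio and symmetrical uncertainty) have standard textbook definitions that can be sketched as follows. This is an illustrative sketch, not IMMAN's implementation, and it assumes features have already been discretized.

```python
# Standard textbook forms of the three filter scores named in the abstract.
# A sketch, not IMMAN's implementation; features are assumed discretized.
from collections import Counter
from math import log2

def H(xs):
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def info_gain(feature, target):
    """IG = H(target) - H(target | feature) = I(feature; target)."""
    return H(feature) + H(target) - H(list(zip(feature, target)))

def gain_ratio(feature, target):
    """IG normalized by the feature's own entropy."""
    return info_gain(feature, target) / H(feature)

def symmetrical_uncertainty(feature, target):
    """SU = 2 * IG / (H(feature) + H(target)), in [0, 1]."""
    return 2.0 * info_gain(feature, target) / (H(feature) + H(target))

# A feature identical to the target is maximally informative: SU = 1.
y = [0, 0, 1, 1, 0, 1]
su = symmetrical_uncertainty(y, y)   # -> 1.0
```

    Ranking features by any of these scores gives a rank-based filter of the kind the software provides.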

  7. Correlation theory-based signal processing method for CMF signals

    NASA Astrophysics Data System (ADS)

    Shen, Yan-lin; Tu, Ya-qing

    2016-06-01

    The signal processing precision of Coriolis mass flowmeter (CMF) signals directly affects the measurement accuracy of Coriolis mass flowmeters. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, comprising a correlation theory-based frequency estimation method and a phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral period sampling of the signals on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation. In frequency estimation it outperforms the adaptive notch filter, discrete Fourier transform and autocorrelation methods, and in phase difference estimation it outperforms the data extension-based correlation, Hilbert transform, quadrature delay estimator and discrete Fourier transform methods, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
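
    One correlation-theory identity behind such estimators: for two equal-frequency, zero-mean sinusoids sampled over an integer number of periods, the normalized zero-lag cross-correlation equals the cosine of their phase difference. The sketch below uses that identity on synthetic tones; it is schematic, not the authors' full algorithm.

```python
# Phase difference from the normalized zero-lag cross-correlation of two
# equal-frequency, zero-mean sinusoids sampled over whole periods.
# A schematic sketch, not the authors' full algorithm.
from math import sin, pi, acos, sqrt

def phase_difference(x, y):
    """Estimate |phase difference| (rad) of two zero-mean sinusoids."""
    rxy = sum(a * b for a, b in zip(x, y))   # zero-lag cross-correlation
    rxx = sum(a * a for a in x)
    ryy = sum(b * b for b in y)
    return acos(max(-1.0, min(1.0, rxy / sqrt(rxx * ryy))))

# Two 50 Hz tones 0.3 rad apart, sampled at 10 kHz over exactly 10 periods.
fs, f, n = 10000, 50.0, 2000
x = [sin(2 * pi * f * t / fs) for t in range(n)]
y = [sin(2 * pi * f * t / fs + 0.3) for t in range(n)]
dphi = phase_difference(x, y)   # -> 0.3 (to within numerical error)
```

    Non-integral period sampling biases this estimate, which is exactly the effect the paper's method is designed to eliminate.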

  8. Evaluation of the Performance of Information Theory-Based Methods and Cross-Correlation to Estimate the Functional Connectivity in Cortical Networks

    PubMed Central

    Garofalo, Matteo; Nieus, Thierry; Massobrio, Paolo; Martinoia, Sergio

    2009-01-01

    Functional connectivity of in vitro neuronal networks was estimated by applying different statistical algorithms on data collected by Micro-Electrode Arrays (MEAs). First, we tested these “connectivity methods” on neuronal network models of increasing complexity and evaluated their performance in terms of ROC (Receiver Operating Characteristic) and PPC (Positive Precision Curve), a newly defined complementary measure developed specifically for the identification of functional links. The algorithms that best estimated the actual connectivity of the network models were then used to extract functional connectivity from cultured cortical networks coupled to MEAs. Among the proposed approaches, Transfer Entropy and Joint-Entropy showed the best results, suggesting these methods as good candidates for extracting functional links in actual neuronal networks from multi-site recordings. PMID:19652720
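
    Transfer entropy, one of the best-performing measures above, can be estimated for discrete spike-train-like series by plug-in counts. The sketch below uses toy binary series; history length 1 and the absence of bias correction are simplifying assumptions, and this is not the paper's implementation.

```python
# Plug-in estimate of first-order transfer entropy TE(X -> Y).
# A minimal sketch with toy binary series; history length 1 and no bias
# correction are simplifying assumptions.
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE(X -> Y) in bits, conditioning on one step of history."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_now, x_now)
    n = len(triples)
    c3 = Counter(triples)
    c_yx = Counter(zip(y[:-1], x[:-1]))
    c_yy = Counter(zip(y[1:], y[:-1]))
    c_y = Counter(y[:-1])
    te = 0.0
    for (yn, yp, xp), c in c3.items():
        p_full = c / c_yx[(yp, xp)]         # p(y_next | y_now, x_now)
        p_self = c_yy[(yn, yp)] / c_y[yp]   # p(y_next | y_now)
        te += (c / n) * log2(p_full / p_self)
    return te

# If Y copies X with one step of delay, information flows from X to Y.
x = [0, 1, 1, 0, 1, 0, 0, 1] * 8
y = [0] + x[:-1]
flow = transfer_entropy(x, y)   # clearly positive, larger than the reverse
```

    For real spike trains, longer histories and bias-corrected estimators are normally required.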

  9. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.

  10. Trends in information theory-based chemical structure codification.

    PubMed

    Barigye, Stephen J; Marrero-Ponce, Yovani; Pérez-Giménez, Facundo; Bonchev, Danail

    2014-08-01

    This report offers a chronological review of the most relevant applications of information theory in the codification of chemical structure information, through the so-called information indices. Basically, these are derived from the analysis of the statistical patterns of molecular structure representations, which include primitive global chemical formulae, chemical graphs, or matrix representations. Finally, new approaches that attempt to go "back to the roots" of information theory, in order to integrate other information-theoretic measures in chemical structure coding are discussed.
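
    One of the simplest indices of the family this review covers is the Shannon entropy of the partition of atoms by vertex degree in a hydrogen-suppressed molecular graph. The sketch below computes it for two toy carbon skeletons; the adjacency encodings are illustrative assumptions.

```python
# A simple "information index": the Shannon entropy of the partition of
# atoms by vertex degree in a hydrogen-suppressed molecular graph.
# The adjacency encodings below are illustrative assumptions.
from collections import Counter
from math import log2

def degree_information_index(adjacency):
    """Entropy (bits/atom) of the partition of vertices by degree."""
    degrees = [len(nbrs) for nbrs in adjacency.values()]
    n = len(degrees)
    return -sum(c / n * log2(c / n) for c in Counter(degrees).values())

# Isobutane vs n-butane (carbon skeletons): branching changes the index.
isobutane = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
n_butane = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

    Isobutane gives about 0.81 bits and n-butane exactly 1 bit, so even this minimal index separates the two constitutional isomers.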

  11. Kinetic theory based new upwind methods for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, S. M.

    1986-01-01

    Two new upwind methods called the Kinetic Numerical Method (KNM) and the Kinetic Flux Vector Splitting (KFVS) method for the solution of the Euler equations have been presented. Both of these methods can be regarded as some suitable moments of an upwind scheme for the solution of the Boltzmann equation provided the distribution function is Maxwellian. This moment-method strategy leads to a unification of the Riemann approach and the pseudo-particle approach used earlier in the development of upwind methods for the Euler equations. A very important aspect of the moment-method strategy is that the new upwind methods satisfy the entropy condition because of the Boltzmann H-Theorem and suggest a possible way of extending the Total Variation Diminishing (TVD) principle within the framework of the H-Theorem. The ability of these methods to obtain accurate, wiggle-free solutions is demonstrated by applying them to two test problems.

  12. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A.; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAM-B3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAM-B3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  13. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    PubMed

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2016-11-28

    Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes.

  14. An information theory based search for homogeneity on the largest accessible scale

    NASA Astrophysics Data System (ADS)

    Sarkar, Suman; Pandey, Biswajit

    2016-11-01

    We analyse the Sloan Digital Sky Survey Data Release 12 quasar catalogue to test the large-scale smoothness of the quasar distribution. We quantify the degree of inhomogeneity in the quasar distribution using information theory based measures and find that the degree of inhomogeneity diminishes with increasing length scales, finally reaching a plateau at ~250 h⁻¹ Mpc. The residual inhomogeneity at the plateau is consistent with that expected for a Poisson point process. Our results indicate that the quasar distribution is homogeneous beyond length scales of 250 h⁻¹ Mpc.
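
    The kind of information-theoretic inhomogeneity measure used in such analyses can be sketched as the relative deficit between the measured and the maximum Shannon entropy of counts-in-cells. This is a schematic version for intuition, not the authors' exact estimator.

```python
# Schematic counts-in-cells inhomogeneity measure: the relative deficit
# between the maximum and the measured Shannon entropy.
# Not the authors' exact estimator.
from math import log2

def inhomogeneity(cell_counts):
    """1 - H/H_max for counts in m equal-volume cells; 0 means uniform."""
    total = sum(cell_counts)
    m = len(cell_counts)
    h = -sum(c / total * log2(c / total) for c in cell_counts if c)
    return 1.0 - h / log2(m)

# Uniform counts give 0; putting everything in one cell gives 1.
uniform = inhomogeneity([10, 10, 10, 10])   # -> 0.0
clustered = inhomogeneity([40, 0, 0, 0])    # -> 1.0
```

    Evaluated with cells of growing size, a measure of this kind decreasing to a plateau is the signature of approach to homogeneity described above.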

  15. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  16. A new theory-based social classification in Japan and its validation using historically collected information.

    PubMed

    Hiyoshi, Ayako; Fukuda, Yoshiharu; Shipley, Martin J; Bartley, Mel; Brunner, Eric J

    2013-06-01

    Studies of health inequalities in Japan have increased since the millennium. However, there remains a lack of an accepted theory-based classification to measure occupation-related social position for Japan. This study attempts to derive such a classification based on the National Statistics Socio-economic Classification in the UK. Using routinely collected data from the nationally representative Comprehensive Survey of the Living Conditions of People on Health and Welfare, the Japanese Socioeconomic Classification was derived using two variables - occupational group and employment status. Validation analyses were conducted using household income, home ownership, self-rated good or poor health, and Kessler 6 psychological distress (n ≈ 36,000). After adjustment for age, marital status, and area (prefecture), each one-step decrease in social class was associated with a 16% lower mean income (p < 0.001) and a risk ratio of 0.93 (p < 0.001) for home ownership. The probability of good health showed a trend in men and women (risk ratio 0.94 and 0.93, respectively, for one step lower social class, p < 0.001). The trend for poor health was significant in women (odds ratio 1.12, p < 0.001) but not in men. Kessler 6 psychological distress showed significant trends in men (risk ratio 1.03, p = 0.044) and in women (1.05, p = 0.004). We propose the Japanese Socioeconomic Classification, derived from basic occupational and employment status information, as a meaningful, theory-based and standard classification system suitable for monitoring occupation-related health inequalities in Japan.

  17. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  18. Electrostatic Induction Theory Based Spatial Filtering Method for Solid Particle Velocity Measurement

    NASA Astrophysics Data System (ADS)

    Xu, Chuanlong; Tang, Guanghua; Zhou, Bin; Yang, Daoye; Zhang, Jianyong; Wang, Shimin

    2007-06-01

    The electrostatic induction theory based spatial filtering method for particle velocity measurement has the advantages of a simple measurement system and convenient data processing. In this paper, the relationship between solid particle velocity and the power spectrum of the output signal of the electrostatic sensor was derived theoretically. The effects of the length of the electrode and of the thickness and length of the dielectric pipe on the spatial filtering characteristics of the electrostatic sensor were investigated numerically using the finite element method. Additionally, because the roughness of the power spectrum curve of the output signal makes its peak frequency fmax difficult to determine, a wavelet analysis based filtering method was adopted to smooth the curve so that fmax can be determined accurately. Finally, the velocity measurement method was applied to a dense phase pneumatic conveying system under high pressure, and the experimental results show that the system repeatability is within ±4% over the gas superficial velocity range of 8.63-18.62 m/s and the particle concentration range of 0.067-0.130 m3/m3.

  19. Information-theory-based solution of the inverse problem in classical statistical mechanics.

    PubMed

    D'Alessandro, Marco; Cilloco, Francesco

    2010-08-01

    We present a procedure for the determination of the interaction potential from the knowledge of the radial pair distribution function. The method, realized inside an inverse Monte Carlo simulation scheme, is based on the application of the maximum entropy principle of information theory and the interaction potential emerges as the asymptotic expression of the transition probability. Results obtained for high density monoatomic fluids are very satisfactory and provide an accurate extraction of the potential, despite a modest computational effort.

  20. Discovering Pair-Wise Genetic Interactions: An Information Theory-Based Approach

    PubMed Central

    Ignac, Tomasz M.; Skupin, Alexander; Sakhanenko, Nikita A.; Galas, David J.

    2014-01-01

    Phenotypic variation, including that which underlies health and disease in humans, results in part from multiple interactions among both genetic variation and environmental factors. While diseases or phenotypes caused by single gene variants can be identified by established association methods and family-based approaches, complex phenotypic traits resulting from multi-gene interactions remain very difficult to characterize. Here we describe a new method based on information theory, and demonstrate how it improves on previous approaches to identifying genetic interactions, including both synthetic and modifier kinds of interactions. We apply our measure, called interaction distance, to previously analyzed data sets of yeast sporulation efficiency, lipid related mouse data and several human disease models to characterize the method. We show how the interaction distance can reveal novel gene interaction candidates in experimental and simulated data sets, and outperforms other measures in several circumstances. The method also allows us to optimize case/control sample composition for clinical studies. PMID:24670935
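
    A closely related quantity, the interaction information I(X;Y;Z) of two loci about a phenotype, captures the synergy that single-locus tests miss. The sketch below is illustrative: it is related to, but not identical with, the paper's interaction distance, and the XOR data are a toy construction.

```python
# Interaction information I(X;Y;Z) of two loci X, Y about a phenotype Z:
# positive values signal synergy invisible to single-locus tests.
# Related to, but not identical with, the paper's "interaction distance".
from collections import Counter
from math import log2

def H(xs):
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def interaction_information(x, y, z):
    """I(X;Y;Z) = I(X,Y;Z) - I(X;Z) - I(Y;Z)."""
    def I(a, b):
        return H(a) + H(b) - H(list(zip(a, b)))
    return I(list(zip(x, y)), z) - I(x, z) - I(y, z)

# XOR phenotype: neither locus is informative alone, the pair fully is.
x = [0, 0, 1, 1]
y = [0, 1, 0, 1]
z = [a ^ b for a, b in zip(x, y)]
synergy = interaction_information(x, y, z)   # -> 1.0 bit
```

    This is the canonical example of a purely synthetic interaction: each single-locus mutual information is zero, yet the pair determines the phenotype completely.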

  1. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Understanding streamflow patterns in space and time is important to improve the flood and drought forecasting, water resources management, and predictions of ecological changes. The objectives of this work were (a) to characterize the spatial and temporal patterns of streamflow using information the...

  2. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    DTIC Science & Technology

    2002-08-01

    the measurement noise, as well as the physical model of the forward scattered electric field. The Bayesian algorithms for the Uncertain Permittivity...received at multiple sensors. In this research project a tissue-model-based signal-detection theory approach for the detection of mammary tumors in the...oriented information processors.

  3. Multivariate drought index: An information theory based approach for integrated drought assessment

    NASA Astrophysics Data System (ADS)

    Rajsekhar, Deepthi; Singh, Vijay. P.; Mishra, Ashok. K.

    2015-07-01

    Most of the existing drought indices are based on a single variable (e.g. precipitation) or a combination of two variables (e.g. precipitation and streamflow). This may not be sufficient for reliable quantification of the existing drought condition. At times a region may experience only a single type of drought, but it is also common for multiple drought types to affect a region simultaneously. To have a comprehensive representation, it is better to consider all the variables that lead to different physical forms of drought, such as meteorological, hydrological, and agricultural droughts. Therefore, we propose a multivariate drought index (MDI) that utilizes information from hydroclimatic variables, including precipitation, runoff, evapotranspiration and soil moisture as indicator variables, thus accounting for all the physical forms of drought. Entropy theory was utilized to develop the proposed index, yielding the smallest set of features that maximally preserves the information of the input data set. MDI was then compared with the Palmer drought severity index (PDSI) for all climate regions within Texas for the period 1950-2012, with particular attention to the two major drought occurrences in Texas, viz. the droughts of 1950-1957 and 2010-2011. The proposed MDI was found to represent drought conditions well, owing to its multivariate, multiscalar, and nonlinear properties. To help the user choose the right time scale for further analysis, entropy maps of MDI at different time scales were used as a guideline. The MDI time scale with the highest entropy value may be chosen, since higher entropy indicates higher information content.

  4. An information theory based framework for the measurement of population health.

    PubMed

    Nesson, Erik T; Robinson, Joshua J

    2015-04-01

    This paper proposes a new framework for the measurement of population health and the ranking of the health of different geographies. Since population health is a latent variable, studies which measure and rank the health of different geographies must aggregate observable health attributes into one summary measure. We show that the methods used in nearly all the literature to date implicitly assume that all attributes are infinitely substitutable. Our method, based on the measurement of multidimensional welfare and inequality, minimizes the entropic distance between the summary measure of population health and the distribution of the underlying attributes. This summary function coincides with the constant elasticity of substitution and Cobb-Douglas production functions and naturally allows different assumptions regarding attribute substitutability or complementarity. To compare methodologies, we examine a well-known ranking of the population health of U.S. states, America's Health Rankings. We find that states' rankings are somewhat sensitive to changes in the weight given to each attribute, but very sensitive to changes in aggregation methodology. Our results have broad implications for well-known health rankings such as the 2000 World Health Report, as well as other measurements of population and individual health levels and the measurement and decomposition of health inequality.
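
    The CES aggregator referred to above has the standard form (sum_i w_i * x_i**beta)**(1/beta), with the Cobb-Douglas product form as the beta -> 0 limit. The sketch below is generic: the attribute values, weights and calibration are illustrative, not the paper's.

```python
# The standard CES aggregator: (sum_i w_i * x_i**beta) ** (1/beta), with
# the Cobb-Douglas product form as the beta -> 0 limit. A generic sketch;
# the paper's attribute weights and calibration are not reproduced.
from math import prod

def ces_index(attrs, weights, beta):
    """Summary index of health attributes; beta sets substitutability."""
    if beta == 0:                        # Cobb-Douglas limit
        return prod(x ** w for x, w in zip(attrs, weights))
    return sum(w * x ** beta for x, w in zip(attrs, weights)) ** (1 / beta)

# With beta = 1 attributes are perfect substitutes; the Cobb-Douglas limit
# penalizes imbalance between attributes more heavily.
linear = ces_index([4, 1], [0.5, 0.5], 1)        # -> 2.5
cobb_douglas = ces_index([4, 1], [0.5, 0.5], 0)  # -> 2.0
```

    The gap between the two outputs for the same data is exactly the sensitivity to aggregation methodology that the paper reports for state rankings.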

  5. A second-order accurate kinetic-theory-based method for inviscid compressible flows

    NASA Technical Reports Server (NTRS)

    Deshpande, Suresh M.

    1986-01-01

    An upwind method for the numerical solution of the Euler equations is presented. This method, called the kinetic numerical method (KNM), is based on the fact that the Euler equations are moments of the Boltzmann equation of the kinetic theory of gases when the distribution function is Maxwellian. The KNM consists of two phases, the convection phase and the collision phase. The method is unconditionally stable and explicit. It is highly vectorizable and can be easily made total variation diminishing for the distribution function by a suitable choice of the interpolation strategy. The method is applied to a one-dimensional shock-propagation problem and to a two-dimensional shock-reflection problem.

  6. Perturbative method for the derivation of quantum kinetic theory based on closed-time-path formalism.

    PubMed

    Koide, Jun

    2002-02-01

    Within the closed-time-path formalism, a perturbative method is presented, which reduces the microscopic field theory to the quantum kinetic theory. In order to make this reduction, the expectation value of a physical quantity must be calculated under the condition that the Wigner distribution function is fixed, because it is the independent dynamical variable in the quantum kinetic theory. It is shown that when a nonequilibrium Green function in the form of the generalized Kadanoff-Baym ansatz is utilized, this condition appears as a cancellation of a certain part of contributions in the diagrammatic expression of the expectation value. Together with the quantum kinetic equation, which can be derived in the closed-time-path formalism, this method provides a basis for the kinetic-theoretical description.

  7. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and the accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on the gas-kinetic theory is presented. The flux construction strategy may shed some light on the possible modification of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for a non-strictly hyperbolic system.
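
    The idea of splitting a macroscopic flux by particle velocity sign can be illustrated on the simplest moment, the 1D mass flux of a Maxwellian. This sketch uses the standard kinetic flux-vector-splitting expressions for a 1D gas, not the paper's MHD construction:

```python
import math

def split_mass_flux(rho, u, p):
    """Kinetic (Maxwellian) splitting of the 1D mass flux rho*u into
    contributions from particles with v > 0 and v < 0:
      F+ = rho*(u*A+ + B),  F- = rho*(u*A- - B),
    where A+/- = (1 +/- erf(sqrt(lam)*u))/2 and
    B = exp(-lam*u^2) / (2*sqrt(pi*lam)),  lam = rho/(2p)."""
    lam = rho / (2.0 * p)
    s = math.sqrt(lam) * u
    a_plus = 0.5 * (1.0 + math.erf(s))
    a_minus = 0.5 * (1.0 - math.erf(s))
    b = math.exp(-s * s) / (2.0 * math.sqrt(math.pi * lam))
    f_plus = rho * (u * a_plus + b)
    f_minus = rho * (u * a_minus - b)
    return f_plus, f_minus

fp, fm = split_mass_flux(1.0, 0.3, 0.8)
# Consistency: the two half-space fluxes recombine to the full flux rho*u,
# with fp carried by right-moving and fm by left-moving particles.
```

    The split fluxes always satisfy F+ + F- = rho*u, and for subsonic flow both half-space contributions are nonzero, which is the source of the scheme's upwind dissipation.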

  8. Practical application of game theory based production flow planning method in virtual manufacturing networks

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2016-08-01

    Modern enterprises have to react quickly to dynamic changes in the market, due to changing customer requirements and expectations. One of the key areas of production management that must continuously evolve, by searching for new methods and tools for increasing the efficiency of manufacturing systems, is production flow planning and control. These aspects are closely connected with the ability to implement the concepts of Virtual Enterprises (VE) and Virtual Manufacturing Networks (VMN), in which an integrated infrastructure of flexible resources is created. In the proposed approach, the role of the players is performed by objects associated with the objective functions, which allows multiobjective production flow planning problems to be solved using game theory, based on the theory of strategic situations. For the defined production system and production order models, ways of solving the problem of production route planning in a VMN are presented through computational examples for different variants of production flow. Possible decision strategies, together with an analysis of the calculation results, are shown.
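
    As a toy illustration of game-theoretic route assignment (not the authors' model; the routes and cost numbers below are invented), two production orders choose between two routes whose costs rise under congestion, and iterated best responses settle into a pure-strategy equilibrium:

```python
# Cost for each of two production orders choosing route A or B.
# Costs rise when both orders load the same route (congestion).
# Hypothetical numbers for illustration only.
COST = {
    # (route_of_order_0, route_of_order_1): (cost_0, cost_1)
    ("A", "A"): (4.0, 4.0),
    ("A", "B"): (2.0, 3.0),
    ("B", "A"): (3.0, 2.0),
    ("B", "B"): (5.0, 5.0),
}

def best_response(player, other_choice):
    """Pick the route minimizing this player's cost given the other's choice."""
    def cost(route):
        key = (route, other_choice) if player == 0 else (other_choice, route)
        return COST[key][player]
    return min("AB", key=cost)

# Alternate best responses until neither order wants to deviate
# (a pure-strategy Nash equilibrium).
choices = ["A", "A"]
for _ in range(20):
    updated = choices[:]
    updated[0] = best_response(0, updated[1])
    updated[1] = best_response(1, updated[0])
    if updated == choices:
        break
    choices = updated
```

    The equilibrium splits the two orders across the two routes, since congestion makes sharing a route costly for both; multiobjective route planning in a VMN generalizes this to many orders, routes, and objective functions.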

  9. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes

    SciTech Connect

    Escudero, Daniel; Thiel, Walter (E-mail: thiel@kofo.mpg.de)

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF6 complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO4(−), Cr(CO)6, [Fe(CN)6](4−), four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  10. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes.

    PubMed

    Escudero, Daniel; Thiel, Walter

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF6 complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO4(-), Cr(CO)6, [Fe(CN)6](4-), four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  11. Battling the challenges of training nurses to use information systems through theory-based training material design.

    PubMed

    Galani, Malatsi; Yu, Ping; Paas, Fred; Chandler, Paul

    2014-01-01

    The attempts to train nurses to effectively use information systems have had mixed results. One problem is that training materials are not adequately designed to guide trainees to gradually learn to use a system without experiencing a heavy cognitive load. This is because training design often does not take into consideration a learner's cognitive ability to absorb new information in a short training period. Given the high cost and difficulty of organising training in healthcare organisations, there is an urgent need for information system trainers to be aware of how cognitive overload or information overload affect a trainee's capability to acquire new knowledge and skills, and what instructional techniques can be used to facilitate effective learning. This paper introduces the concept of cognitive load and how it affects nurses when learning to use a new health information system. This is followed by the relevant strategies for instructional design, underpinned by the principles of cognitive load theory, which may be helpful for the development of effective instructional materials and activities for training nurses to use information systems.

  12. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling.

    PubMed

    Koller, Ingrid; Levenson, Michael R; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis.

  13. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    PubMed Central

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  14. Nonlinear gyrokinetic theory based on a new method and computation of the guiding-center orbit in tokamaks

    SciTech Connect

    Xu, Yingfeng; Dai, Zongliang; Wang, Shaojie

    2014-04-15

    The nonlinear gyrokinetic theory in the tokamak configuration based on the two-step transform is developed; in the first step, we transform the magnetic potential perturbation to the Hamiltonian part, and in the second step, we transform away the gyroangle-dependent part of the perturbed Hamiltonian. Then the I-transform method is used to decouple the perturbed part of the motion from the unperturbed motion. The application of the I-transform method to the computation of the guiding-center orbit and the guiding-center distribution function in tokamaks is presented. It is demonstrated that the I-transform method of orbit computation, which involves integrating only along the unperturbed orbit, agrees with the conventional method, which integrates along the full orbit. A numerical code based on the I-transform method is developed and two numerical examples are given to verify the new method.

  15. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety.

    PubMed

    Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith

    2013-01-01

    Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the promoting action on research implementation of health services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The usefulness of the evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts.

  16. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    NASA Astrophysics Data System (ADS)

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Kitaura, Kazuo; Gordon, Mark S.; Nakamura, Shinichiro

    2015-03-01

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.
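
    Analytic Hessians avoid the cost and finite-precision noise of numerical differentiation. For intuition, here is what the numerical alternative looks like on a toy one-dimensional potential (a Morse curve with invented parameters, unrelated to the FMO-DFT machinery): the central-difference second derivative at the minimum recovers the analytic force constant V''(re) = 2*De*a^2.

```python
import math

def morse(r, De=0.17, a=1.0, re=1.4):
    """Toy 1D Morse potential; parameters are arbitrary illustration values."""
    return De * (1.0 - math.exp(-a * (r - re))) ** 2

def second_derivative(f, x, h=1e-4):
    """Central finite-difference second derivative d2f/dx2."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

k_num = second_derivative(morse, 1.4)   # numerical force constant at r = re
k_exact = 2.0 * 0.17 * 1.0 ** 2         # analytic V''(re) = 2*De*a^2
```

    For a full molecule every Hessian element requires displaced-geometry energy or gradient evaluations, which is why analytic second derivatives matter for fragment methods applied to large systems.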

  17. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Zahariev, Federico; Schmidt, Michael W; Kitaura, Kazuo; Gordon, Mark S; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  18. Analytic second derivative of the energy for density functional theory based on the three-body fragment molecular orbital method

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Zahariev, Federico; Schmidt, Michael W.; Gordon, Mark S.; Kitaura, Kazuo; Nakamura, Shinichiro

    2015-03-28

    Analytic second derivatives of the energy with respect to nuclear coordinates have been developed for spin restricted density functional theory (DFT) based on the fragment molecular orbital method (FMO). The derivations were carried out for the three-body expansion (FMO3), and the two-body expressions can be obtained by neglecting the three-body corrections. Also, the restricted Hartree-Fock (RHF) Hessian for FMO3 can be obtained by neglecting the density-functional related terms. In both the FMO-RHF and FMO-DFT Hessians, certain terms with small magnitudes are neglected for computational efficiency. The accuracy of the FMO-DFT Hessian in terms of the Gibbs free energy is evaluated for a set of polypeptides and water clusters and found to be within 1 kcal/mol of the corresponding full (non-fragmented) ab initio calculation. The FMO-DFT method is also applied to transition states in SN2 reactions and for the computation of the IR and Raman spectra of a small Trp-cage protein (PDB: 1L2Y). Some computational timing analysis is also presented.

  19. NbIT - A New Information Theory-Based Analysis of Allosteric Mechanisms Reveals Residues that Underlie Function in the Leucine Transporter LeuT

    PubMed Central

    LeVine, Michael V.; Weinstein, Harel

    2014-01-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005
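
    The information-theoretic core of such an analysis is estimating mutual information between configurational degrees of freedom. Here is a self-contained sketch on synthetic data (a histogram estimator on correlated Gaussians; the bin count and sample size are arbitrary choices, unrelated to NbIT's actual estimators):

```python
import math, random

def mutual_information(xs, ys, bins=12):
    """Histogram estimate of I(X;Y) = H(X) + H(Y) - H(X,Y), in nats."""
    n = len(xs)
    lox, hix = min(xs), max(xs)
    loy, hiy = min(ys), max(ys)
    def bx(x): return min(int((x - lox) / (hix - lox) * bins), bins - 1)
    def by(y): return min(int((y - loy) / (hiy - loy) * bins), bins - 1)
    pxy, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        i, j = bx(x), by(y)
        pxy[(i, j)] = pxy.get((i, j), 0) + 1
        px[i] = px.get(i, 0) + 1
        py[j] = py.get(j, 0) + 1
    mi = 0.0
    for (i, j), c in pxy.items():
        # p(i,j) * log( p(i,j) / (p(i) p(j)) ), with counts c, px, py
        mi += (c / n) * math.log(c * n / (px[i] * py[j]))
    return mi

random.seed(0)
rho = 0.8
xs, ys = [], []
for _ in range(50000):
    a, b = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(a)
    ys.append(rho * a + math.sqrt(1 - rho * rho) * b)

# Correlated pair (analytic value -0.5*ln(1-rho^2) ~ 0.51 nats) vs. an
# independent pair, whose mutual information should be near zero.
mi_corr = mutual_information(xs, ys)
mi_indep = mutual_information(xs, [random.gauss(0, 1) for _ in range(50000)])
```

    In an NbIT-style analysis the variables would be internal coordinates of residues sampled along an MD trajectory, and higher-order (N-body) generalizations of this quantity identify channels and coordinators.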

  20. NbIT--a new information theory-based analysis of allosteric mechanisms reveals residues that underlie function in the leucine transporter LeuT.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2014-05-01

    Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems.

  1. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  2. Information storage media and method

    DOEpatents

    Miller, Steven D.; Endres, George W.

    1999-01-01

    Disclosed is a method for storing and retrieving information. More specifically, the present invention is a method for forming predetermined patterns, or data structures, using materials which exhibit enhanced absorption of light at certain wavelengths or, when interrogated with a light having a first wavelength, provide a luminescent response at a second wavelength. These materials may exhibit this response to light inherently, or may be made to exhibit this response by treating the materials with ionizing radiation.

  3. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.
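
    The first-order method of averaging can be demonstrated on a toy equation with a fast oscillation, dx/dt = eps * x * sin^2(t): averaging sin^2 over its period gives the slow equation dx/dt = eps * x / 2. The sketch below (unrelated to the satellite theory itself; parameters are invented) integrates the full equation and compares it with the averaged solution after many fast periods:

```python
import math

def integrate_full(eps, x0, t_end, n=100000):
    """Classical RK4 on the full equation dx/dt = eps * x * sin(t)**2."""
    f = lambda t, x: eps * x * math.sin(t) ** 2
    x, t = x0, 0.0
    h = t_end / n
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h * k1 / 2)
        k3 = f(t + h / 2, x + h * k2 / 2)
        k4 = f(t + h, x + h * k3)
        x += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return x

eps, x0, t_end = 0.05, 1.0, 20 * math.pi   # twenty fast periods (sin^2 has period pi)
x_full = integrate_full(eps, x0, t_end)
x_avg = x0 * math.exp(eps * t_end / 2)     # solution of the averaged equation
```

    At integer multiples of the fast period the short-periodic terms cancel exactly, so the averaged solution matches the full integration to the integrator's accuracy; this separation of slow and fast dynamics is what makes semianalytic satellite theories efficient.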

  4. Derivation of a measure of systolic blood pressure mutability: a novel information theory-based metric from ambulatory blood pressure tests.

    PubMed

    Contreras, Danitza J; Vogel, Eugenio E; Saravia, Gonzalo; Stockins, Benjamin

    2016-03-01

    We provide ambulatory blood pressure (BP) exams with tools based on information theory to quantify fluctuations, thus increasing the capture of dynamic test components. Data from 515 ambulatory 24-hour BP exams were considered. Average age was 54 years, 54% were women, and 53% were under BP treatment. The average systolic pressure (SP) was 127 ± 8 mm Hg. A data compressor (wlzip) designed to recognize meaningful information is invoked to measure mutability, which is a form of dynamical variability. For patients with the same average SP, different mutability values are obtained, reflecting differences in dynamical variability. In unadjusted linear regression models, mutability had a low association with the mean systolic BP (R(2) = 0.056; P < .000001) but a larger association with the SP deviation (R(2) = 0.761; P < .001). Wlzip allows detecting levels of variability in SP that could be hazardous. This new indicator can be easily added to 24-hour BP monitors, improving the information available for diagnosis.
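
    The idea of using a compressor to expose dynamical variability can be sketched with zlib as a stand-in for wlzip (the data are hypothetical): two series with identical value histograms, hence identical mean and standard deviation, compress very differently when their temporal order differs.

```python
import random, zlib

def mutability_proxy(series):
    """Compressed size of the digitized series: a crude stand-in for the
    wlzip-based mutability measure described in the abstract."""
    text = ",".join(str(v) for v in series).encode()
    return len(zlib.compress(text, 9))

random.seed(1)
values = [120 + (i % 20) for i in range(1440)]   # smooth, periodic 24-h record
shuffled = values[:]                             # identical value histogram...
random.shuffle(shuffled)                         # ...but scrambled dynamics

smooth_size = mutability_proxy(values)
shuffled_size = mutability_proxy(shuffled)
```

    Both series have the same mean and SD, yet the scrambled one is far less compressible; that gap is the kind of dynamical information a summary statistic misses and a compression-based indicator captures.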

  5. Information technology equipment cooling method

    DOEpatents

    Schultz, Mark D.

    2015-10-20

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

  6. EFFECTIVENESS OF INFORMATION RETRIEVAL METHODS.

    ERIC Educational Resources Information Center

    SWETS, JOHN A.

    Results of fifty different retrieval methods as applied in three experimental retrieval systems were subjected to an analysis suggested by statistical decision theory. The analysis uses a previously proposed measure of effectiveness and demonstrates its several properties. Some of these properties are--(1) it enables the retrieval system to…
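
    Decision-theoretic evaluation of retrieval rests on the equal-variance Gaussian signal detection model, in which a (hit rate, false-alarm rate) operating point yields a sensitivity d' and an implied ROC area. The sketch below uses the standard textbook formulas, not necessarily the specific measure of this report:

```python
import math
from statistics import NormalDist

def dprime_and_auc(hit_rate, fa_rate):
    """Equal-variance Gaussian signal detection: sensitivity
    d' = z(hit) - z(fa), and the implied ROC area Phi(d'/sqrt(2))."""
    nd = NormalDist()
    d = nd.inv_cdf(hit_rate) - nd.inv_cdf(fa_rate)
    auc = nd.cdf(d / math.sqrt(2))
    return d, auc

d_easy, auc_easy = dprime_and_auc(0.8, 0.2)     # a discriminating system
d_chance, auc_chance = dprime_and_auc(0.5, 0.5)  # chance performance
```

    A system retrieving 80% of relevant documents while falsely retrieving 20% of irrelevant ones has d' of about 1.68 and an ROC area near 0.88; chance performance gives d' = 0 and area 0.5, independent of the retrieval cutoff.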

  7. Levels of Reconstruction as Complementarity in Mixed Methods Research: A Social Theory-Based Conceptual Framework for Integrating Qualitative and Quantitative Research

    PubMed Central

    Carroll, Linda J.; Rothe, J. Peter

    2010-01-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. The integration of qualitative and quantitative research (mixed-methods) is likewise beginning to assume a more prominent role in public health studies, and using mixed-methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson’s metaphysical work on the ‘ways of knowing’. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions. PMID:20948937

  8. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. The integration of qualitative and quantitative research (mixed-methods) is likewise beginning to assume a more prominent role in public health studies, and using mixed-methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  9. Probability theory-based SNP association study method for identifying susceptibility loci and genetic disease models in human case-control data.

    PubMed

    Yuan, Xiguo; Zhang, Junying; Wang, Yue

    2010-12-01

    One of the most challenging points in studying human common complex diseases is to search for both strong and weak susceptibility single-nucleotide polymorphisms (SNPs) and to identify forms of genetic disease models. Currently, a number of methods have been proposed for this purpose. Many of them have not been validated through application to various genome datasets, so their abilities in real practice are not clear. In this paper, we present a novel SNP association study method based on probability theory, called ProbSNP. The method first detects SNPs by evaluating their joint probabilities in combination with disease status and selects those with the lowest joint probabilities as susceptibility ones, and then identifies some forms of genetic disease models by testing multiple-locus interactions among the selected SNPs. The joint probabilities of combined SNPs are estimated by establishing Gaussian distribution probability density functions, in which the related parameters (i.e., mean value and standard deviation) are evaluated based on allele and haplotype frequencies. Finally, we test and validate the method using various genome datasets. We find that ProbSNP has shown remarkable success in the applications to both simulated genome data and real genome-wide data.
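
    A heavily simplified sketch of the idea of flagging SNPs with low Gaussian probability follows (the frequencies below are hypothetical, and ProbSNP's actual estimator also uses haplotype frequencies and multi-locus interaction tests, which are not reproduced here): each SNP's case allele frequency is scored by its Gaussian density under a control-derived null, and the lowest-density SNPs are ranked first.

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def rank_snps(case_freqs, control_freqs, n_case):
    """Score each SNP by the Gaussian density of its case allele frequency
    under the control-derived null; lower density = stronger candidate."""
    scores = []
    for idx, (fc, f0) in enumerate(zip(case_freqs, control_freqs)):
        # Binomial standard deviation of an allele frequency over 2*n_case alleles
        sigma = math.sqrt(f0 * (1 - f0) / (2 * n_case))
        scores.append((gaussian_pdf(fc, f0, sigma), idx))
    return sorted(scores)

controls = [0.30, 0.45, 0.10, 0.25]
cases = [0.31, 0.44, 0.22, 0.26]   # SNP 2 is shifted well away from its null
ranked = rank_snps(cases, controls, n_case=500)
```

    SNP 2, whose case frequency deviates by many null standard deviations, lands at the top of the ranking; the other SNPs sit near their nulls and score as unremarkable.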

  10. High-resolution wave-theory-based ultrasound reflection imaging using the split-step Fourier and globally optimized Fourier finite-difference methods

    DOEpatents

    Huang, Lianjie

    2013-10-29

    Methods for enhancing ultrasonic reflection imaging are taught utilizing a split-step Fourier propagator in which the reconstruction is based on recursive inward continuation of ultrasonic wavefields in the frequency-space and frequency-wave number domains. The inward continuation within each extrapolation interval consists of two steps. In the first step, a phase-shift term is applied to the data in the frequency-wave number domain for propagation in a reference medium. The second step consists of applying another phase-shift term to data in the frequency-space domain to approximately compensate for ultrasonic scattering effects of heterogeneities within the tissue being imaged (e.g., breast tissue). Results from various data input to the method indicate significant improvements are provided in both image quality and resolution.
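
    The two-step extrapolation interval described above has the same structure as the classic split-step Fourier propagator, sketched here in 1D with NumPy (a paraxial toy version; the grid and medium parameters are invented, and the patent's imaging-specific reconstruction is not reproduced):

```python
import numpy as np

def split_step(field, dx, dz, k0, delta_n):
    """One split-step Fourier extrapolation interval:
    (1) an exact phase shift in the wavenumber domain for a homogeneous
        reference medium (paraxial propagator), then
    (2) a thin phase screen in the space domain approximately compensating
        for the heterogeneity delta_n(x)."""
    n = field.size
    kx = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    # Step 1: reference-medium propagator exp(-i * kx^2 * dz / (2 k0))
    field = np.fft.ifft(np.fft.fft(field) * np.exp(-1j * kx ** 2 * dz / (2 * k0)))
    # Step 2: space-domain phase screen for the heterogeneity
    return field * np.exp(1j * k0 * delta_n * dz)

n, dx, dz, k0 = 256, 0.1, 0.05, 40.0
x = (np.arange(n) - n / 2) * dx
beam = np.exp(-x ** 2).astype(complex)   # Gaussian initial wavefield
out = beam.copy()
for _ in range(100):                     # propagate through 100 intervals
    out = split_step(out, dx, dz, k0, delta_n=np.zeros(n))
```

    Both steps are unit-modulus multiplications in their respective domains, so the propagator conserves energy exactly while the beam diffracts and its peak amplitude drops, which is a convenient sanity check on any implementation.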

  11. Unrestricted density functional theory based on the fragment molecular orbital method for the ground and excited state calculations of large systems

    SciTech Connect

    Nakata, Hiroya; Fedorov, Dmitri G.; Yokojima, Satoshi; Kitaura, Kazuo; Sakurai, Minoru; Nakamura, Shinichiro

    2014-04-14

    We extended the fragment molecular orbital (FMO) method interfaced with density functional theory (DFT) into spin unrestricted formalism (UDFT) and developed energy gradients for the ground state and single point excited state energies based on time-dependent DFT. The accuracy of FMO is evaluated in comparison to the full calculations without fragmentation. Electronic excitations in solvated organic radicals and in the blue copper protein, plastocyanin (PDB code: 1BXV), are reported. The contributions of solvent molecules to the electronic excitations are analyzed in terms of the fragment polarization and quantum effects such as interfragment charge transfer.

  12. Benchmarking Density Functional Theory Based Methods To Model NiOOH Material Properties: Hubbard and van der Waals Corrections vs Hybrid Functionals.

    PubMed

    Zaffran, Jeremie; Caspary Toroker, Maytal

    2016-08-09

    NiOOH has recently been used to catalyze water oxidation by way of electrochemical water splitting. Few experimental data are available to rationalize the successful catalytic capability of NiOOH, so theory has a distinctive role in studying its properties. However, the unique layered structure of NiOOH is associated with the presence of essential dispersion forces within the lattice. Hence, the choice of an appropriate exchange-correlation functional within Density Functional Theory (DFT) is not straightforward. In this work, we show that standard DFT is sufficient to evaluate the geometry, but DFT+U and hybrid functionals are required to calculate the oxidation states. Notably, the benefit of DFT with van der Waals correction is marginal. Furthermore, only hybrid functionals succeed in opening a band gap, and such methods are necessary to study the NiOOH electronic structure. With this work, we expect to give guidelines to theoreticians dealing with this material and to present a rational approach to choosing a DFT method of calculation.

  13. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: higher order theory based on the Bethe-Peierls and path probability method approximations.

    PubMed

    Edison, John R; Monson, Peter A

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  14. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  15. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2010-04-06

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  16. Method for gathering and summarizing internet information

    DOEpatents

    Potok, Thomas E.; Elmore, Mark Thomas; Reed, Joel Wesley; Treadwell, Jim N.; Samatova, Nagiza Faridovna

    2008-01-01

    A computer method of gathering and summarizing large amounts of information comprises collecting information from a plurality of information sources (14, 51) according to respective maps (52) of the information sources (14), converting the collected information from a storage format to XML-language documents (26, 53) and storing the XML-language documents in a storage medium, searching for documents (55) according to a search query (13) having at least one term and identifying the documents (26) found in the search, and displaying the documents as nodes (33) of a tree structure (32) having links (34) and nodes (33) so as to indicate similarity of the documents to each other.

  17. Research on polarization imaging information parsing method

    NASA Astrophysics Data System (ADS)

    Yuan, Hongwu; Zhou, Pucheng; Wang, Xiaolong

    2016-11-01

    Polarization information parsing plays an important role in polarization imaging detection. This paper focuses on polarization information parsing methods. First, the general process of polarization information parsing is given, covering polarization image preprocessing, calculation of multiple polarization parameters, polarization image fusion, and polarization image tracking. Research achievements for each stage are then presented. For polarization image preprocessing, a polarization image registration method based on maximum mutual information is designed; experiments show that it improves registration precision and satisfies the needs of polarization information parsing. For the calculation of multiple polarization parameters, an omnidirectional polarization inversion model is built, from which a variety of polarization parameter images are obtained with clearly improved inversion precision. For polarization image fusion, an adaptive optimal fusion method for multiple polarization parameters is given using fuzzy integrals and sparse representation, and target detection in complex scenes is accomplished with a clustering-based image segmentation algorithm built on fractal characteristics. For polarization image tracking, a fusion tracking algorithm combining mean-shift polarization image features with auxiliary particle filtering is put forward to achieve smooth tracking of moving targets. Finally, the polarization information parsing method is applied to the polarization imaging detection of typical targets such as camouflaged targets, fog, and latent fingerprints.
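A common building block of the polarization-parameter calculation stage is computing the linear Stokes parameters, degree of linear polarization (DoLP), and angle of polarization (AoP) from intensity images captured at four polarizer orientations. The sketch below is the standard formulation, not the paper's omnidirectional inversion model.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Stokes parameters, DoLP, and AoP from intensity images captured
    through linear polarizers at 0, 45, 90, and 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal vs vertical
    s2 = i45 - i135                      # +45 vs -45 degrees
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aop = 0.5 * np.arctan2(s2, s1)       # radians
    return s0, s1, s2, dolp, aop
```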

  18. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  19. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  20. Research Investigation of Information Access Methods

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Sharkey, Thomas W.; Lim, Jeen-Su

    2006-01-01

    This study investigates the satisfaction of library users at Wayne State University who utilize alternative information access methods. The LibQUAL+[TM] desired and perceived that satisfaction ratings are used to determine the user's "superiority gap." By focusing limited library resources to address "superiority gap" issues identified by each…

  1. Switching theory-based steganographic system for JPEG images

    NASA Astrophysics Data System (ADS)

    Cherukuri, Ravindranath C.; Agaian, Sos S.

    2007-04-01

    Cellular communications constitute a significant portion of the global telecommunications market, so the need for secure communication over mobile platforms has increased sharply. Steganography, the art of hiding critical data inside an innocuous signal, answers this need. JPEG is one of the most commonly used formats for storing and transmitting images on the web, and pictures captured with mobile cameras are mostly in JPEG format. In this article, we introduce a switching theory based steganographic system for JPEG images that is applicable to both mobile and computer platforms. The proposed algorithm uses the fact that the energy distribution among the quantized AC coefficients varies from block to block and coefficient to coefficient. Existing approaches are effective on a subset of these coefficients but show their ineffectiveness when employed over all of them. We therefore propose an approach that treats each set of AC coefficients with a different framework, enhancing the performance of the approach. The proposed system offers high capacity and embedding efficiency simultaneously while withstanding simple statistical attacks. In addition, the embedded information can be retrieved without prior knowledge of the cover image. Simulation results demonstrate an improved embedding capacity over existing algorithms while maintaining a high embedding efficiency and preserving the statistics of the JPEG image after hiding information.
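The switching-theory embedding itself is not reproduced here, but the general idea of hiding bits in quantized AC coefficients can be illustrated with a simple JSteg-style least-significant-bit scheme. This is a hypothetical simplification: it skips the 0 and +/-1 coefficients, whereas the proposed system selects a different framework per coefficient set.

```python
def embed_bits(ac_coeffs, bits):
    """Write message bits into the LSBs of quantized AC coefficients of
    magnitude > 1 (zeros carry run-length information and +/-1 values
    would risk turning into zeros, so both are skipped)."""
    out = [int(c) for c in ac_coeffs]
    it = iter(bits)
    for i, c in enumerate(out):
        if abs(c) <= 1:
            continue
        b = next(it, None)
        if b is None:
            break
        sign = 1 if c > 0 else -1
        out[i] = sign * ((abs(c) & ~1) | b)
    return out

def extract_bits(ac_coeffs, n):
    """Read back the first n embedded bits; coefficients of magnitude
    <= 1 were never used for embedding, so they are skipped again."""
    return [abs(int(c)) & 1 for c in ac_coeffs if abs(c) > 1][:n]
```

Because embedding never changes a coefficient's magnitude below 2, the extractor's skip rule stays consistent with the embedder's.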

  2. Method and apparatus for displaying information

    NASA Technical Reports Server (NTRS)

    Ingber, Donald E. (Inventor); Huang, Sui (Inventor); Eichler, Gabriel (Inventor)

    2010-01-01

    A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.

  3. A Method for Analyzing Volunteered Geographic Information ...

    EPA Pesticide Factsheets

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions: the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially explicit data to be used in multiple settings, including the City of Duluth’s Comprehensive Planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA’s Office of Research and Development.

  4. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  5. On Methods for Higher Order Information Fusion

    DTIC Science & Technology

    2005-02-01

    ...ORDINAL SPACES ... 8. CREDIBILITY WEIGHTED SOURCES ... specified, we shall assume the information provided by a source is a specific value in the space X. Another consideration is the inclusion of source

  6. Advanced Feedback Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1985-01-01

    In this study, automatic feedback techniques are applied to Boolean query statements in online information retrieval to generate improved query statements based on information contained in previously retrieved documents. Feedback operations are carried out using conventional Boolean logic and extended logic. Experimental output is included to…

  7. Methods of Eliciting Information from Experts

    DTIC Science & Technology

    1987-10-01

    Itzhak Perlman) or a concertmeister in an orchestra, or simply one of its violinists . Differences in amount of expertise may supply different...would lead to becoming a world class violinist . Underlying Assumptions In attempting to elicit information from experts, one makes a number of

  8. Governance Methods Used in Externalizing Information Technology

    ERIC Educational Resources Information Center

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  9. 48 CFR 2905.101 - Methods of disseminating information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Methods of disseminating information. 2905.101 Section 2905.101 Federal Acquisition Regulations System DEPARTMENT OF LABOR ACQUISITION PLANNING PUBLICIZING CONTRACT ACTIONS Dissemination of Information 2905.101 Methods of disseminating information. Contracting officers...

  10. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    ERIC Educational Resources Information Center

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  11. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multidimensional scale to measure twelve-step program affiliation. The process included interviewing fourteen addicted persons while in twelve-step focused treatment about specific pros (things they like or would miss out on by not being…

  12. Cable Television: A Method for Delivering Information.

    ERIC Educational Resources Information Center

    Nebraska Univ., Lincoln. Cooperative Extension Service.

    This report presents the recommendations of a committee that was formed to explore the possibility of using cable television networks as a method of delivering extension education programs to urban audiences. After developing and testing a pilot project that used cable television as a mode to disseminate horticulture and 4-H leader training…

  13. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  14. Informing Patients About Placebo Effects: Using Evidence, Theory, and Qualitative Methods to Develop a New Website

    PubMed Central

    Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O’Riordan, Tim; White, Peter; Yardley, Lucy

    2016-01-01

    Background According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. Objective We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Methods Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative ‘think aloud’ study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. Results The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients’ stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants’ experiences of using the website. Conclusions We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials

  15. Versatile Formal Methods Applied to Quantum Information.

    SciTech Connect

    Witzel, Wayne; Rudinger, Kenneth Michael; Sarovar, Mohan

    2015-11-01

    Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users the ability to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach that allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.

  16. Application of geo-information science methods in ecotourism exploitation

    NASA Astrophysics Data System (ADS)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

    The application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resource reconnaissance, data management, environmental monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data, and their application is helpful to sustainable development in tourism. Various tasks are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, handling of mass data, tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  17. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  18. Database design using NIAM (Nijssen Information Analysis Method) modeling

    SciTech Connect

    Stevens, N.H.

    1989-01-01

    The Nijssen Information Analysis Method (NIAM) is an information modeling technique based on semantics and founded in set theory. A NIAM information model is a graphical representation of the information requirements for some universe of discourse. Information models facilitate data integration and communication within an organization about data semantics. An information model is sometimes referred to as the semantic model or the conceptual schema. It helps in the logical and physical design and implementation of databases. NIAM information modeling is used at Sandia National Laboratories to design and implement relational databases containing engineering information which meet the users' information requirements. The paper focuses on the design of one database which satisfied the data needs of four disjoint but closely related applications. The applications as they existed before did not talk to each other even though they stored much of the same data redundantly. NIAM was used to determine the information requirements and design the integrated database. 6 refs., 7 figs.

  19. Axiomatic Evaluation Method and Content Structure for Information Appliances

    ERIC Educational Resources Information Center

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information is needed to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  20. 19 CFR 201.9 - Methods employed in obtaining information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Methods employed in obtaining information. 201.9... APPLICATION Initiation and Conduct of Investigations § 201.9 Methods employed in obtaining information. In... agencies of the Government, through questionnaires and correspondence, through field work by members of...

  1. Method and system of integrating information from multiple sources

    SciTech Connect

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  2. Compressed sensing theory-based channel estimation for optical orthogonal frequency division multiplexing communication system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Li, Minghui; Wang, Ruyan; Liu, Yuanni; Song, Daiping

    2014-09-01

    Due to the sparse multipath property of the channel, a channel estimation method based on a partial superimposed training sequence and compressed sensing theory is proposed for line-of-sight optical orthogonal frequency division multiplexing communication systems. First, a continuous training sequence is added at a variable power ratio to the cyclic prefix of the orthogonal frequency division multiplexing symbols at the transmitter prior to transmission. Then the observation matrix of compressed sensing theory is constructed from the training symbols at the receiver. Finally, channel state information is estimated using a sparse signal reconstruction algorithm. Compared to traditional training sequences, the proposed partial superimposed training sequence not only improves spectral efficiency but also reduces the influence on the information symbols. In addition, compared with classical least squares and linear minimum mean square error methods, the proposed compressed sensing theory based channel estimation method improves both the estimation accuracy and the system performance. Simulation results are given to demonstrate the performance of the proposed method.
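The sparse-reconstruction step can be sketched with textbook orthogonal matching pursuit (OMP): given measurements y = A h, where A is the observation matrix built from the training symbols, it recovers a k-sparse channel tap vector h. This is a generic OMP, not necessarily the exact algorithm used in the paper.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select the k columns of A
    most correlated with the residual, then least-squares fit on the
    selected support."""
    residual = y.astype(complex)
    support = []
    coef = np.zeros(0, dtype=complex)
    for _ in range(k):
        # Column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.conj().T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit on the whole support and update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    h = np.zeros(A.shape[1], dtype=complex)
    h[support] = coef
    return h
```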

  3. Collecting Information for Rating Global Assessment of Functioning (GAF): Sources of Information and Methods for Information Collection

    PubMed Central

    Aas, I. H. Monrad

    2014-01-01

    Introduction: Global Assessment of Functioning (GAF) is an assessment instrument that is known worldwide. It is widely used for rating the severity of illness. Results from evaluations in psychiatry should characterize the patients. Rating of GAF is based on collected information. The aim of the study is to identify the factors involved in collecting information that is relevant for rating GAF, and gaps in knowledge where it is likely that further development would play a role for improved scoring. Methods: A literature search was conducted with a combination of thorough hand search and search in the bibliographic databases PubMed, PsycINFO, Google Scholar, and Campbell Collaboration Library of Systematic Reviews. Results: Collection of information for rating GAF depends on two fundamental factors: the sources of information and the methods for information collection. Sources of information are patients, informants, health personnel, medical records, letters of referral and police records about violence and substance abuse. Methods for information collection include the many different types of interview – unstructured, semi-structured, structured, interviews for Axis I and II disorders, semistructured interviews for rating GAF, and interviews of informants – as well as instruments for rating symptoms and functioning, and observation. The different sources of information, and methods for collection, frequently result in inconsistencies in the information collected. The variation in collected information, and lack of a generally accepted algorithm for combining collected information, is likely to be important for rated GAF values, but there is a fundamental lack of knowledge about the degree of importance. Conclusions: Research to improve GAF has not reached a high level. Rated GAF values are likely to be influenced by both the sources of information used and the methods employed for information collection, but the lack of research-based information about these

  4. Remarks on the information entropy maximization method and extended thermodynamics

    NASA Astrophysics Data System (ADS)

    Eu, Byung Chan

    1998-04-01

    The information entropy maximization method was applied by Jou et al. [J. Phys. A 17, 2799 (1984)] to heat conduction in the past. Advancing this method one more step, Nettleton [J. Chem. Phys. 106, 10311 (1997)] combined the method with a projection operator technique to derive a set of evolution equations for macroscopic variables from the Liouville equation for a simple liquid, and a claim was made that the method provides a statistical mechanical theory basis of irreversible processes and, in particular, of extended thermodynamics which is consistent with the laws of thermodynamics. This line of information entropy maximization method is analyzed from the viewpoint of the laws of thermodynamics in this paper.

  5. Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report

    DTIC Science & Technology

    1995-06-01

    [Figure 1. Anatomy of a Method: Theory, Informal/Formal Language, Formal Semantics, Discipline, Definition, Use] Ultimately, methods are designed to...architecture, observed: [T]here is not an architecture, but a set of architectural representations. One is not right and another wrong. The...representations," and, correspondingly, many methods. Methods, and their associated architectural representations, focus on a limited set of system

  6. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based product information processing method is proposed. The information processing model of product design comprises functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping between product function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method is helpful for knowledge-based design systems and product innovation.
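The function-structure mapping described in this record can be sketched with the standard library's ElementTree; the element and attribute names below are illustrative assumptions, not the paper's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML model linking a function element to the structure
# element that realizes it (names are illustrative, not the paper's schema).
product = ET.Element("product", name="parallel-friction-roller")
func = ET.SubElement(product, "function", id="F1")
func.text = "transmit torque"
struct = ET.SubElement(product, "structure", id="S1")
struct.text = "roller pair"
ET.SubElement(product, "mapping", function="F1", structure="S1")

xml_text = ET.tostring(product, encoding="unicode")
print('<mapping function="F1" structure="S1"' in xml_text)  # True
```

Serializing with `encoding="unicode"` returns a plain string, convenient for inspection or storage in a knowledge-based design system.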

  7. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge of modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and its information management features, a unified XML-based product information processing method is proposed. The information processing model of product design comprises functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping between product function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method is helpful for knowledge-based design systems and product innovation.

  8. Information theory in living systems, methods, applications, and challenges.

    PubMed

    Gatenby, Robert A; Frieden, B Roy

    2007-02-01

    Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium. This is despite constant buffeting by thermodynamic forces that, if unopposed, will inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information that permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow and utilization is critical for understanding first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatic and dynamical systems to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher-, Shannon-, and Kullback-Leibler informations are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analysis of the fitness contributions from specific segments of genetic information, as well as the optimization process in which fitness is constrained by the substrate cost of its storage and utilization. Recent IT applications have recognized the possible role of nontraditional information storage structures including lipids and ion gradients as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease.
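Shannon's measure for monomer strings, mentioned in this record, reduces to a few lines; this sketch computes entropy in bits per symbol for an illustrative DNA string:

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy (bits per symbol) of a monomer string."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A 4-letter alphabet used uniformly carries the maximum 2 bits/symbol;
# a constant string carries none.
print(shannon_entropy("ACGT" * 10))  # 2.0
print(shannon_entropy("AAAAAAAA"))   # 0.0
```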

  9. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  10. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems

    NASA Astrophysics Data System (ADS)

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-01

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  11. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems.

    PubMed

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-14

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  12. Assessment of density functional theory based ΔSCF (self-consistent field) and linear response methods for longest wavelength excited states of extended π-conjugated molecular systems

    SciTech Connect

    Filatov, Michael; Huix-Rotllant, Miquel

    2014-07-14

    Computational investigation of the longest wavelength excitations in a series of cyanines and linear n-acenes is undertaken with the use of standard spin-conserving linear response time-dependent density functional theory (TD-DFT) as well as its spin-flip variant and a ΔSCF method based on the ensemble DFT. The spin-conserving linear response TD-DFT fails to accurately reproduce the lowest excitation energy in these π-conjugated systems by strongly overestimating the excitation energies of cyanines and underestimating the excitation energies of n-acenes. The spin-flip TD-DFT is capable of correcting the underestimation of excitation energies of n-acenes by bringing in the non-dynamic electron correlation into the ground state; however, it does not fully correct for the overestimation of the excitation energies of cyanines, for which the non-dynamic correlation does not seem to play a role. The ensemble DFT method employed in this work is capable of correcting for the effect of missing non-dynamic correlation in the ground state of n-acenes and for the deficient description of differential correlation effects between the ground and excited states of cyanines and yields the excitation energies of both types of extended π-conjugated systems with the accuracy matching high-level ab initio multireference calculations.

  13. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-03-20

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  14. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2005-12-13

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  15. System and method for acquisition management of subject position information

    DOEpatents

    Carrender, Curt

    2007-01-23

    A system and method for acquisition management of subject position information that utilizes radio frequency identification (RF ID) to store position information in position tags. Tag programmers receive position information from external positioning systems, such as the Global Positioning System (GPS), from manual inputs, such as keypads, or other tag programmers. The tag programmers program each position tag with the received position information. Both the tag programmers and the position tags can be portable or fixed. Implementations include portable tag programmers and fixed position tags for subject position guidance, and portable tag programmers for collection sample labeling. Other implementations include fixed tag programmers and portable position tags for subject route recordation. Position tags can contain other associated information such as destination address of an affixed subject for subject routing.

  16. Density functional Theory Based Generalized Effective Fragment Potential Method (Postprint)

    DTIC Science & Technology

    2014-07-01

    perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41... [truncated table of per-dimer interaction energies: (Pyrazine)2, (Uracil)2 stack, Indole·benzene stack, Adenine...]

  17. Index Theory-Based Algorithm for the Gradiometer Inverse Problem

    DTIC Science & Technology

    2015-03-28

    Index Theory-Based Algorithm for the Gradiometer Inverse Problem. Robert C. Anderson and Jonathan W. Fitton. Abstract: We present an Index Theory-based gravity gradiometer inverse problem algorithm. This algorithm relates changes in the index value computed on a closed curve containing a line...field generated by the positive eigenvector of the gradiometer tensor to the closeness of fit of the proposed inverse solution to the mass and

  18. Adaptive windowed range-constrained Otsu method using local information

    NASA Astrophysics Data System (ADS)

    Zheng, Jia; Zhang, Dinghua; Huang, Kuidong; Sun, Yuanxi; Tang, Shaojie

    2016-01-01

    An adaptive windowed range-constrained Otsu method using local information is proposed for improving the performance of image segmentation. First, the reason why traditional thresholding methods do not perform well in the segmentation of complicated images is analyzed, and the influences of global and local thresholding on image segmentation are compared. Second, two methods are proposed that adaptively change the size of the local window according to local information, and their characteristics are analyzed. Specifically, the number of edge pixels in the local window of the binarized variance image is used to adaptively change the local window size. Finally, the superiority of the proposed method over methods such as the range-constrained Otsu, the active contour model, the double Otsu, Bradley's method, and distance-regularized level set evolution is demonstrated. Experiments validate that the proposed method preserves more detail and achieves a much more satisfactory area overlap measure than the other conventional methods.
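For context, the global Otsu criterion that the windowed, range-constrained variants build on can be sketched as follows; this is a plain global implementation, not the paper's adaptive method:

```python
import numpy as np

def otsu_threshold(image):
    """Classic global Otsu: pick the gray level maximizing the
    between-class variance of the image histogram."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Bimodal toy image: dark blob on a bright background.
img = np.full((32, 32), 200, dtype=np.uint8)
img[8:24, 8:24] = 50
t = otsu_threshold(img)
print(50 < t <= 200)  # True: the threshold separates the two modes
```

The adaptive variants in the record replace the single global histogram with per-window histograms whose extent is driven by local edge information.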

  19. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and report results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the stock markets differs across time periods, and that the similarity of the two markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish markets from different areas in the resulting phylogenetic trees. The results show that satisfactory information can be extracted from financial markets by this method, which can be used not only for physiologic time series but also for financial time series.
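The abstract does not give the exact distance definition, but a minimal illustrative sketch in this spirit, symbolizing each series and comparing k-gram frequency vectors (all choices here are hypothetical, not the paper's), might look like:

```python
from collections import Counter
from math import sqrt

def symbolize(series):
    """Map a price series to a binary up/down symbol string."""
    return "".join("u" if b > a else "d" for a, b in zip(series, series[1:]))

def ngram_freq(symbols, k=3):
    """Relative frequencies of the k-grams in a symbol string."""
    grams = [symbols[i:i + k] for i in range(len(symbols) - k + 1)]
    n = len(grams)
    return {g: c / n for g, c in Counter(grams).items()}

def distance(s1, s2, k=3):
    """Euclidean distance between k-gram frequency vectors; similar
    dynamics give small distances, which a clustering step could turn
    into the phylogenetic trees mentioned in the abstract."""
    f1, f2 = ngram_freq(symbolize(s1), k), ngram_freq(symbolize(s2), k)
    keys = set(f1) | set(f2)
    return sqrt(sum((f1.get(g, 0) - f2.get(g, 0)) ** 2 for g in keys))

a = [1, 2, 3, 2, 3, 4, 3, 4, 5, 4]            # rising zig-zag
b = [10, 11, 12, 11, 12, 13, 12, 13, 14, 13]  # same pattern, shifted
c = [5, 4, 3, 4, 3, 2, 3, 2, 1, 2]            # mirrored pattern
print(distance(a, b) < distance(a, c))  # True: a and b share dynamics
```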

  20. The value of value of information: best informing research design and prioritization using current methods.

    PubMed

    Eckermann, Simon; Karnon, Jon; Willan, Andrew R

    2010-01-01

    Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address. (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions 1-4 relative to their simplicity of use. Expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question 1, given that what EVPI needs to exceed varies with the cost of research design, which can vary from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate, as it can be large and further research not worthwhile or small and further research worthwhile. In contrast, each of questions 1-4 are shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize expected value of sample information (EVSI) minus expected costs across designs. In comparing complexity in use of VOI methods, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation and optimal design across jurisdictions. More complex VOI methods such as bootstrapping of the expected value of
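The EVPI quantity discussed in this record is the expected value of deciding with perfect information minus the value of deciding now; a minimal Monte Carlo sketch, with made-up net-benefit distributions for two hypothetical treatments, is:

```python
import random

def evpi(draws):
    """Expected value of perfect information for a two-option decision:
    E[max over options] minus max over options of E[net benefit]."""
    e_max = sum(max(pair) for pair in draws) / len(draws)
    max_e = max(sum(p[i] for p in draws) / len(draws) for i in (0, 1))
    return e_max - max_e

random.seed(1)
# Simulated net-benefit pairs under parameter uncertainty (illustrative).
draws = [(random.gauss(100, 30), random.gauss(105, 30)) for _ in range(20000)]
print(evpi(draws) >= 0)  # True: EVPI is never negative
```

As the abstract argues, a positive EVPI alone does not settle whether research is worthwhile; it must be weighed against the cost of the proposed design (EVSI minus expected cost).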

  1. Integrating Informative Priors from Experimental Research with Bayesian Methods

    PubMed Central

    Hamra, Ghassan; Richardson, David; MacLehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology. PMID:23222512

  2. Reverse Engineering Cellular Networks with Information Theoretic Methods

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Banga, Julio R.

    2013-01-01

    Building mathematical models of cellular networks lies at the core of systems biology. It involves, among other tasks, the reconstruction of the structure of interactions between molecular components, which is known as network inference or reverse engineering. Information theory can help in the goal of extracting as much information as possible from the available data. A large number of methods founded on these concepts have been proposed in the literature, not only in biology journals, but in a wide range of areas. Their critical comparison is difficult due to the different focuses and the adoption of different terminologies. Here we attempt to review some of the existing information theoretic methodologies for network inference, and clarify their differences. While some of these methods have achieved notable success, many challenges remain, among which we can mention dealing with incomplete measurements, noisy data, counterintuitive behaviour emerging from nonlinear relations or feedback loops, and computational burden of dealing with large data sets. PMID:24709703

  3. Verbal Information Processing Paradigms: A Review of Theory and Methods.

    ERIC Educational Resources Information Center

    Mitchell, Karen J.

    The purpose of this research was to develop a model of verbal information processing for use in subsequent analyses of the construct and predictive validity of the current Department of Defense military selection and classification battery, the Armed Services Vocational Aptitude Battery (ASVAB) 8/9/10. The theory and research methods of selected…

  4. Transfer mutual information: A new method for measuring information transfer to the interactions of time series

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian; Lin, Aijing

    2017-02-01

    In this paper, we propose a new method to measure the influence of a third variable on the interaction of two variables. The method, called transfer mutual information (TMI), is defined as the difference between the mutual information and the partial mutual information. It rests on the assumption that if the presence or absence of one variable changes the interaction of another two variables, then quantifying this change measures the influence of that variable on those two variables. A normalized TMI and other derivatives of the TMI are introduced as well. Empirical analysis, including simulations as well as real-world applications, is carried out to examine this measure and to reveal more information among variables.
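A minimal discrete-variable sketch of the idea, substituting conditional mutual information for the paper's partial mutual information (a simplifying assumption), could look like:

```python
from collections import Counter
from math import log2

def mi(xs, ys):
    """Mutual information I(X;Y) for discrete samples, in bits."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def cmi(xs, ys, zs):
    """Conditional mutual information I(X;Y|Z): average MI within
    each stratum of Z, weighted by the stratum probability."""
    n = len(xs)
    total = 0.0
    for z, cz in Counter(zs).items():
        idx = [i for i in range(n) if zs[i] == z]
        total += cz / n * mi([xs[i] for i in idx], [ys[i] for i in idx])
    return total

def tmi(xs, ys, zs):
    """Sketch of transfer mutual information: how much conditioning on
    Z changes the X-Y dependence, here I(X;Y) - I(X;Y|Z)."""
    return mi(xs, ys) - cmi(xs, ys, zs)

# Z drives both X and Y, so conditioning on Z removes their dependence.
zs = [0, 0, 1, 1] * 25
xs, ys = zs[:], zs[:]
print(round(tmi(xs, ys, zs), 6))  # 1.0: the shared bit comes entirely from Z
```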

  5. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without formal methods: with informal methods, the MIS is developed over its lifecycle without any models, which causes problems such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes of business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. An experiment has shown that the effort saved is more than 30% of the total development effort.

  6. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
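The FOSM step mentioned in this record propagates input covariance through a model linearized by its sensitivity (Jacobian) matrix, Cov(y) = J Cov(x) Jᵀ; a minimal sketch with hypothetical numbers:

```python
import numpy as np

def fosm_output_covariance(jacobian, input_cov):
    """First-Order Second Moment propagation: Cov(y) = J Cov(x) J^T,
    combining model sensitivity with input covariance."""
    J = np.asarray(jacobian, dtype=float)
    return J @ np.asarray(input_cov, dtype=float) @ J.T

# Hypothetical head model: two outputs sensitive to two uncertain inputs.
J = np.array([[2.0, 0.5],
              [0.0, 1.0]])
Sx = np.array([[1.0, 0.0],
               [0.0, 4.0]])
Sy = fosm_output_covariance(J, Sx)
print(Sy[0, 0])  # 5.0 = 2^2 * 1 + 0.5^2 * 4
```

Ranking candidate sample locations by their contribution to the output variance (here, the diagonal of Sy) is the core of the variance-reduction QDE approach described above.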

  7. Information System Selection: Methods for Comparing Service Benefits

    PubMed Central

    Bradley, Evelyn; Campbell, James G.

    1981-01-01

    Automated hospital information systems are purchased both for their potential impact on costs (economic benefits) and for their potential impact on the efficiency and effectiveness of hospital performance (Service Benefits). This paper defines and describes Service Benefits and describes their importance in information system selection. Comparing various systems' Service Benefit contributions implies developing a composite measure of potential Service Benefits; this necessitates expressing Service Benefits in a single unit of measure. This paper concludes with discussion of alternative methods for translating Service Benefits into a common unit of measure, so they may be summed and compared for each system under consideration.

  8. Mixed-methods exploration of parents' health information understanding.

    PubMed

    Lehna, Carlee; McNeil, Jack

    2008-05-01

    Health literacy--the ability to read, understand, and use health information to make health care decisions--affects health care outcomes, hospitalization costs, and readmission. The purpose of this exploratory mixed-methods study is to determine how two different parent groups (English speaking and Spanish speaking) understand medical care for their children and the procedural and research consent forms required by that care. Quantitative and qualitative data are gathered and compared concurrently. Differences between groups are found in age, grade completed, Short Test of Functional Health Literacy in Adults scores, and ways of understanding health information. Identifying how parents understand health information is the first step in providing effective family-centered health care education.

  9. Application of information theory methods to food web reconstruction

    USGS Publications Warehouse

    Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.

    2007-01-01

    In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, and if this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other, and the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be of some use in other than model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points. This approaches ecological time series lengths from real systems.

  10. Theory-informed design of values clarification methods: a cognitive psychological perspective on patient health-related decision making.

    PubMed

    Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb

    2013-01-01

    Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids and values clarification methods (VCM) in particular have been developed. VCM intend to help patients to determine the aspects of the choices that are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCM. Process theories of decision making specify components of decision processes, thus, identify particular processes that VCM could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) provided a coherent description of the whole process of decision making; and (c) empirical evidence supports at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCM should: help optimize mental representations; encourage considering all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner.

  11. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.
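
The queuing-theory ingredient of such models can be illustrated with the standard M/M/1 steady-state formulas (a textbook model, not the authors' QB-IRTP formulation; the waste-facility interpretation of the rates is only an illustration):

```python
# Toy illustration: steady-state metrics of an M/M/1 queue.
# lam = arrival rate (e.g. waste trucks per hour), mu = service rate.

def mm1_metrics(lam, mu):
    """Return (utilization, mean number in system, mean time in system)."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    rho = lam / mu          # server utilization
    L = rho / (1 - rho)     # mean number in system
    W = 1 / (mu - lam)      # mean time in system (Little's law: L = lam * W)
    return rho, L, W

rho, L, W = mm1_metrics(lam=4.0, mu=5.0)
```

Sensitivity of `L` and `W` to the arrival rate is what couples waiting costs to the waste-generation scenarios the abstract mentions.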

  12. Determination of nuclear level densities from experimental information

    SciTech Connect

    Cole, B.J. ); Davidson, N.J. , P.O. Box 88, Manchester M60 1QD ); Miller, H.G. )

    1994-10-01

    A novel information theory based method for determining the density of states from prior information is presented. The energy dependence of the density of states is determined from the observed number of states per energy interval, and model calculations suggest that the method is sufficiently reliable to calculate the thermal properties of nuclei over a reasonable temperature range.

  13. Emotion identification method using RGB information of human face

    NASA Astrophysics Data System (ADS)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single households has drastically increased due to the growth of the aging society and the diversity of lifestyles. Therefore, the evolution of building spaces is demanded. The Biofied Building we propose can help address this situation. It facilitates interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior. One important piece of information is thermal comfort. We assume it can be estimated from the human face. There is much research on face color analysis, but little of it has been conducted in real situations. In other words, the existing methods were not tested under disturbances such as room lamps. In this study, Kinect was used with face tracking. Room lamps and task lamps were used to verify that our method is applicable to real situations. In this research, two rooms at 22 and 28 degrees C were prepared. We showed that the transition of thermal comfort caused by changing temperature can be observed from the human face. Thus, distinguishing the 22 and 28 degrees C conditions from face color was proved to be possible.
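
A minimal sketch of the face-color cue, assuming a face region has already been extracted (e.g. by Kinect face tracking). Everything here, including the redness index and the sample pixel values, is hypothetical and only illustrates the idea of summarizing RGB values over a face region:

```python
# Hypothetical sketch: average the red-channel share over a face region
# as a crude proxy for skin-color change between warm and cool rooms.
# `face_pixels` is a list of (R, G, B) tuples from the tracked face ROI.

def redness_index(face_pixels):
    """Mean R / (R + G + B) over the face region, in [0, 1]."""
    total = 0.0
    for r, g, b in face_pixels:
        s = r + g + b
        if s > 0:
            total += r / s
    return total / len(face_pixels)

warm = [(180, 120, 110), (190, 125, 115)]   # made-up pixels, 28 degrees C
cool = [(150, 130, 125), (155, 135, 130)]   # made-up pixels, 22 degrees C
```

Under this toy data, `redness_index(warm)` exceeds `redness_index(cool)`, the kind of separation the study looks for between the two temperature conditions.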

  14. A Rapid Usability Evaluation (RUE) Method for Health Information Technology.

    PubMed

    Russ, Alissa L; Baker, Darrell A; Fahner, W Jeffrey; Milligan, Bryce S; Cox, Leeann; Hagg, Heather K; Saleem, Jason J

    2010-11-13

    Usability testing can help generate design ideas to enhance the quality and safety of health information technology. Despite these potential benefits, few healthcare organizations conduct systematic usability testing prior to software implementation. We used a Rapid Usability Evaluation (RUE) method to apply usability testing to software development at a major VA Medical Center. We describe the development of the RUE method, provide two examples of how it was successfully applied, and discuss key insights gained from this work. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE methods are relatively simple, do not require advanced training or usability software, and should be easy to adopt. Other healthcare organizations may be able to implement RUE to improve software effectiveness, efficiency, and safety.

  15. A mixed model reduction method for preserving selected physical information

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Zheng, Gangtie

    2017-03-01

    A new model reduction method in the frequency domain is presented. By mixedly using the model reduction techniques from both the time domain and the frequency domain, the dynamic model is condensed to selected physical coordinates, and the contribution of slave degrees of freedom is taken as a modification to the model in the form of effective modal mass of virtually constrained modes. The reduced model can preserve the physical information related to the selected physical coordinates such as physical parameters and physical space positions of corresponding structure components. For the cases of non-classical damping, the method is extended to the model reduction in the state space but still only contains the selected physical coordinates. Numerical results are presented to validate the method and show the effectiveness of the model reduction.

  16. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  17. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  18. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  19. Continuing Bonds in Bereavement: An Attachment Theory Based Perspective

    ERIC Educational Resources Information Center

    Field, Nigel P.; Gao, Beryl; Paderna, Lisa

    2005-01-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual…

  20. Theory-Based Diagnosis and Remediation of Writing Disabilities.

    ERIC Educational Resources Information Center

    Berninger, Virginia W.; And Others

    1991-01-01

    Briefly reviews recent trends in research on writing; introduces theory-based model being developed for differential diagnosis of writing disabilities at neuropsychological, linguistic, and cognitive levels; presents cases and patterns in cases that illustrate differential diagnosis of writing disabilities at linguistic level; and suggests…

  1. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
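
A minimal sketch (not the patented system; all names are invented) of the dispatch idea the claim describes: abstractions in a semantic graph are routed to reasoning modules according to the ontology classification type of the individual they contain:

```python
# Toy reasoning system: modules registered per classification type
# process only the abstractions whose individuals match that type.

class ReasoningSystem:
    def __init__(self):
        self.modules = {}   # classification type -> handler function

    def register(self, classification, handler):
        self.modules[classification] = handler

    def process(self, abstractions):
        # Each abstraction is (individual, classification); route it to
        # the module registered for its classification type, if any.
        results = []
        for individual, classification in abstractions:
            handler = self.modules.get(classification)
            if handler:
                results.append(handler(individual))
        return results

rs = ReasoningSystem()
rs.register("Person", lambda ind: "person:" + ind)
rs.register("Event", lambda ind: "event:" + ind)
out = rs.process([("alice", "Person"), ("login", "Event")])
```

Here the two registered handlers play the role of the first and second reasoning modules, each bound to a different classification type.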

  2. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  3. Information bias in health research: definition, pitfalls, and adjustment methods

    PubMed Central

    Althubaiti, Alaa

    2016-01-01

    As with other fields, medical sciences are subject to different sources of bias. While understanding sources of bias is a key element for drawing valid conclusions, bias in health research continues to be a very sensitive issue that can affect the focus and outcome of investigations. Information bias, otherwise known as misclassification, is one of the most common sources of bias that affects the validity of health research. It originates from the approach that is utilized to obtain or confirm study measurements. This paper seeks to raise awareness of information bias in observational and experimental research study designs as well as to enrich discussions concerning bias problems. Specifying the types of bias can be essential to limit their effects, and the use of adjustment methods might serve to improve clinical evaluation and health care practice. PMID:27217764

  4. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  5. A Task-Oriented Disaster Information Correlation Method

    NASA Astrophysics Data System (ADS)

    Linyao, Q.; Zhiqiang, D.; Qing, Z.

    2015-07-01

    With the rapid development of sensor networks and Earth observation technology, a large quantity of disaster-related data is available, such as remotely sensed data, historic data, case data, simulated data, and disaster products. However, current data management and service systems have become increasingly inefficient due to the task variety and heterogeneous data. For emergency task-oriented applications, data searches primarily rely on manual experience with simple metadata indices, whose high time consumption and low accuracy cannot satisfy the speed and veracity requirements for disaster products. In this paper, a task-oriented correlation method is proposed for efficient disaster data management and intelligent service, with the objectives of 1) putting forward a disaster task ontology and a data ontology to unify the different semantics of multi-source information, 2) identifying the semantic mapping from emergency tasks to multiple data sources on the basis of the uniform description in 1), and 3) linking task-related data automatically and calculating the correlation between each data set and a certain task. The method goes beyond traditional static management of disaster data and establishes a basis for intelligent retrieval and active dissemination of disaster information. The case study presented in this paper illustrates the use of the method on an example flood emergency relief task.
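
As a toy stand-in for the semantic mapping step (all dataset names and ontology terms below are invented), task-to-data correlation can be pictured as term overlap between a task's ontology terms and each dataset's annotations:

```python
# Toy illustration: rank candidate datasets for an emergency task by
# Jaccard overlap of ontology terms. This is only a sketch of the idea
# of correlating tasks and data, not the paper's actual method.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

task_terms = {"flood", "extent", "remote-sensing"}
datasets = {
    "sar_image":    {"flood", "extent", "remote-sensing", "radar"},
    "census_2010":  {"population", "district"},
    "gauge_series": {"flood", "water-level"},
}
ranked = sorted(datasets,
                key=lambda d: jaccard(task_terms, datasets[d]),
                reverse=True)
```

With these made-up annotations the SAR imagery ranks first for the flood-extent task, mimicking the automatic linking of task-related data.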

  6. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2011-12-01

    Based on the User Datagram Protocol (UDP), we make some improvements and design a welding robot network communication protocol (WRNCP), working at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, we design a broadcast push model (BPM) transmission method, improving the efficiency and stability of video transmission, and we design the network information transmission system used for real-time network control of the welding robot.

  7. The analysis of network transmission method for welding robot information

    NASA Astrophysics Data System (ADS)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2012-01-01

    Based on the User Datagram Protocol (UDP), we make some improvements and design a welding robot network communication protocol (WRNCP), working at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, we design a broadcast push model (BPM) transmission method, improving the efficiency and stability of video transmission, and we design the network information transmission system used for real-time network control of the welding robot.

  8. Linear reduction method for predictive and informative tag SNP selection.

    PubMed

    He, Jingwu; Westbrooks, Kelly; Zelikovsky, Alexander

    2005-01-01

    Constructing a complete human haplotype map is helpful when associating complex diseases with their related SNPs. Unfortunately, the number of SNPs is very large and it is costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that should be sequenced to a small number of informative representatives called tag SNPs. In this paper, we propose a new linear algebra-based method for selecting and using tag SNPs. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs predicted from selected linearly independent tag SNPs. Our experiments show that for sufficiently long haplotypes, knowing only 0.4% of all SNPs, the proposed linear reduction method predicts an unknown haplotype with an error rate below 2%, based on 10% of the population.
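
The linear-algebra idea can be sketched as follows (a hedged simplification, not the authors' exact algorithm): greedily keep each SNP column that is linearly independent of the columns kept so far; the discarded columns are then, by construction, linear combinations of the tag SNPs and thus predictable from them.

```python
# Greedy tag-SNP selection by Gaussian elimination over the reals.
# haplotypes: rows = individuals, columns = SNPs coded 0/1.

EPS = 1e-9

def select_tag_snps(haplotypes):
    """Return indices of a linearly independent set of SNP columns."""
    n_rows, n_cols = len(haplotypes), len(haplotypes[0])
    basis, tags = [], []
    for j in range(n_cols):
        v = [float(haplotypes[i][j]) for i in range(n_rows)]
        for b in basis:
            # reduce v against b using b's pivot coordinate
            p = next(k for k, x in enumerate(b) if abs(x) > EPS)
            f = v[p] / b[p]
            v = [vi - f * bi for vi, bi in zip(v, b)]
        if any(abs(x) > EPS for x in v):   # independent of chosen columns
            basis.append(v)
            tags.append(j)
    return tags

# Column 3 duplicates column 0, so only the first three columns are tags.
haps = [[1, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 0, 1]]
```

On this toy data `select_tag_snps(haps)` keeps columns 0, 1, and 2 and drops the redundant column 3.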

  9. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  10. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  11. Transport-theory based multispectral imaging with PDE-constrained optimization

    NASA Astrophysics Data System (ADS)

    Kim, Hyun K.; Flexman, Molly; Yamashiro, Darrell J.; Kandel, Jessica J.; Hielscher, Andreas H.

    2011-02-01

    We introduce here a transport-theory-based PDE-constrained multispectral imaging algorithm for direct reconstruction of the spatial distribution of chromophores in tissue. The method solves the forward and inverse problems simultaneously in the framework of a reduced Hessian sequential quadratic programming method. The performance of the new algorithm is evaluated using numerical and experimental studies involving tumor bearing mice. The results show that the PDE-constrained multispectral method leads to 15-fold acceleration in the image reconstruction of tissue chromophores when compared to the unconstrained multispectral approach and also gives more accurate results when compared to the traditional two-step method.

  12. A diffusive information preservation method for small Knudsen number flows

    SciTech Connect

    Fei, Fei; Fan, Jing

    2013-06-15

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker–Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint, and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3–10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they make it possible to use a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  13. A diffusive information preservation method for small Knudsen number flows

    NASA Astrophysics Data System (ADS)

    Fei, Fei; Fan, Jing

    2013-06-01

    The direct simulation Monte Carlo (DSMC) method is a powerful particle-based method for modeling gas flows. It works well for relatively large Knudsen (Kn) numbers, typically larger than 0.01, but quickly becomes computationally intensive as Kn decreases due to its time step and cell size limitations. An alternative approach was proposed to relax or remove these limitations, based on replacing pairwise collisions with a stochastic model corresponding to the Fokker-Planck equation [J. Comput. Phys., 229, 1077 (2010); J. Fluid Mech., 680, 574 (2011)]. Like the DSMC method, however, that approach suffers from statistical noise. To solve the problem, a diffusion-based information preservation (D-IP) method has been developed. The main idea is to track the motion of a simulated molecule from the diffusive standpoint, and to obtain the flow velocity and temperature by sampling and averaging the IP quantities. To validate the idea and the corresponding model, several benchmark problems with Kn ~ 10^-3–10^-4 have been investigated. It is shown that the IP calculations are not only accurate but also efficient, because they make it possible to use a time step and cell size over an order of magnitude larger than the mean collision time and mean free path, respectively.

  14. Liposome/water lipophilicity: methods, information content, and pharmaceutical applications.

    PubMed

    van Balen, Georgette Plemper; Martinet, Catherine a Marca; Caron, Giulia; Bouchard, Géraldine; Reist, Marianne; Carrupt, Pierre-Alain; Fruttero, Roberta; Gasco, Alberto; Testa, Bernard

    2004-05-01

    This review discusses liposome/water lipophilicity in terms of the structure of liposomes, experimental methods, and information content. In a first part, the structural properties of the hydrophobic core and polar surface of liposomes are examined in the light of potential interactions with solute molecules. Particular emphasis is placed on the physicochemical properties of polar headgroups of lipids in liposomes. A second part is dedicated to three useful methods to study liposome/water partitioning, namely potentiometry, equilibrium dialysis, and (1)H-NMR relaxation rates. In each case, the principle and limitations of the method are discussed. The next part presents the structural information encoded in liposome/water lipophilicity, in other words the solutes' structural and physicochemical properties that determine their behavior and hence their partitioning in such systems. This presentation is based on a comparison between isotropic (i.e., solvent/water) and anisotropic (e.g., liposome/water) systems. An important factor to be considered is whether the anisotropic lipid phase is ionized or not. Three examples taken from the authors' laboratories are discussed to illustrate the factors or combinations thereof that govern liposome/water lipophilicity, namely (a) hydrophobic interactions alone, (b) hydrophobic and polar interactions, and (c) conformational effects plus hydrophobic and ionic interactions. The next part presents two studies taken from the field of QSAR to exemplify the use of liposome/water lipophilicity in structure-disposition and structure-activity relationships. In the conclusion, we summarize the interests and limitations of this technology and point to promising developments.

  15. A Specification Method for Interactive Medical Information Systems

    PubMed Central

    Wasserman, Anthony I.; Stinson, Susan K.

    1980-01-01

    This paper presents the User Software Engineering (USE) approach for developing specifications for an interactive information system (IIS) and shows how the method is applied to the specification of a Perinatal Data Registry system. Two linked views of the system are developed: a user view suitable for computer-naive users, and a design/verification view, suitable for computer-knowledgeable users. The user view is intended to facilitate user participation in the analysis task and in the definition of the user/system dialogue. The verification view is intended to facilitate design and testing of the resulting system. The two views share their notations for data base definition and for specification of the user/system dialogue; however, the user view may utilize narrative text for describing the operations, while the design/verification view relies on a more formal specification method. The specification method encourages effective communication between users and developers and permits refinement of the specification in order to ensure that the resulting specification is as complete, consistent, and accurate as possible before proceeding with design and implementation.

  16. Measurement Theory Based on the Truth Values Violates Local Realism

    NASA Astrophysics Data System (ADS)

    Nagata, Koji

    2017-02-01

    We investigate the violation factor of the Bell-Mermin inequality. Until now, we use an assumption that the results of measurement are ±1. In this case, the maximum violation factor is 2^((n-1)/2). The quantum predictions by the n-partite Greenberger-Horne-Zeilinger (GHZ) state violate the Bell-Mermin inequality by an amount that grows exponentially with n. Recently, a new measurement theory based on the truth values was proposed (Nagata and Nakamura, Int. J. Theor. Phys. 55:3616, 2016). The values of measurement outcome are either +1 or 0. Here we use the new measurement theory. We consider the multipartite GHZ state. It turns out that the Bell-Mermin inequality is violated by the amount of 2^((n-1)/2). The measurement theory based on the truth values provides the maximum violation of the Bell-Mermin inequality.

  17. Theory-Based Bayesian Models of Inductive Inference

    DTIC Science & Technology

    2010-06-30

    Oxford University Press. 28. Griffiths, T. L. and Tenenbaum, J. B. (2007). Two proposals for causal grammar. In A. Gopnik and L. Schulz (eds.), Causal Learning. Oxford University Press. 29. Tenenbaum, J. B., Kemp, C., Shafto, P. (2007). Theory-based Bayesian models for inductive reasoning. In A. Feeney and E. Heit (eds.), Induction. Cambridge University Press. 30. Goodman, N. D., Tenenbaum, J. B., Griffiths, T. L., & Feldman, J. (2008). Compositionality in rational analysis: Grammar-based induction for concept

  18. System and Method for RFID-Enabled Information Collection

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Kennedy, Timothy F. (Inventor); Ngo, Phong H. (Inventor); Byerly, Diane (Inventor)

    2016-01-01

    Methods, apparatuses and systems for radio frequency identification (RFID)-enabled information collection are disclosed, including an enclosure, a collector coupled to the enclosure, an interrogator, a processor, and one or more RFID field sensors, each having an individual identification, disposed within the enclosure. In operation, the interrogator transmits an incident signal to the collector, causing the collector to generate an electromagnetic field within the enclosure. The electromagnetic field is affected by one or more influences. RFID sensors respond to the electromagnetic field by transmitting reflected signals containing the individual identifications of the responding RFID sensors to the interrogator. The interrogator receives the reflected signals, measures one or more returned signal strength indications ("RSSI") of the reflected signals and sends the RSSI measurements and identification of the responding RFID sensors to the processor to determine one or more facts about the influences. Other embodiments are also described.

  19. Methods of Information Geometry to model complex shapes

    NASA Astrophysics Data System (ADS)

    De Sanctis, A.; Gattone, S. A.

    2016-09-01

    In this paper, a new statistical method to model patterns emerging in complex systems is proposed. A framework for shape analysis of 2-dimensional landmark data is introduced, in which each landmark is represented by a bivariate Gaussian distribution. From Information Geometry we know that the Fisher-Rao metric endows the statistical manifold of parameters of a family of probability distributions with a Riemannian metric. This approach thus allows us to reconstruct the intermediate steps in the evolution between observed shapes by computing the geodesic, with respect to the Fisher-Rao metric, between the corresponding distributions. Furthermore, the geodesic path can be used for shape predictions. As an application, we study the evolution of the rat skull shape. A future application in ophthalmology is introduced.
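
To make the Fisher-Rao construction concrete, the standard univariate special case can be written out (the paper itself works with bivariate Gaussians per landmark; this is only the simplest instance of the same metric):

```latex
% Fisher information metric for the univariate Gaussian family
% N(\mu, \sigma^2), parametrized by (\mu, \sigma):
%   I(\mu,\sigma) = \mathrm{diag}\!\left(1/\sigma^2,\; 2/\sigma^2\right),
% giving the line element
ds^2 \;=\; \frac{d\mu^2 + 2\,d\sigma^2}{\sigma^2}.
```

Geodesic distance under this metric, rather than Euclidean distance between parameters, is what the shape-interpolation step computes.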

  20. A new template matching method based on contour information

    NASA Astrophysics Data System (ADS)

    Cai, Huiying; Zhu, Feng; Wu, Qingxiao; Li, Sicong

    2014-11-01

    Template matching is a significant approach in machine vision due to its effectiveness and robustness. However, most template matching methods are so time consuming that they cannot be applied in many real-time applications. The closed contour matching method is a popular kind of template matching. This paper presents a new closed contour template matching method which is suitable for two-dimensional objects. A coarse-to-fine searching strategy is used to improve the matching efficiency, and a partial computation elimination scheme is proposed to further speed up the searching process. The method consists of offline model construction and online matching. In the process of model construction, triples and a distance image are obtained from the template image. A certain number of triples, each composed of three points, are created from the contour information extracted from the template image. The rule for selecting the three points is that the template contour is divided equally into three parts by these points. The distance image is obtained by distance transform: each point on the distance image represents the nearest distance between the current point and the points on the template contour. During matching, triples of the searching image are created with the same rule as the triples of the model. Through the similarity between triangles, which is invariant to rotation, translation and scaling, the triples corresponding to the triples of the model are found. Then we can obtain the initial RST (rotation, translation and scaling) parameters mapping the searching contour to the template contour. In order to speed up the searching process, the points on the searching contour are sampled to reduce the number of triples. To verify the RST parameters, the searching contour is projected into the distance image, and the mean distance can be computed rapidly by simple operations of addition and multiplication. In the fine searching process
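
The verification step can be sketched as follows (a hedged toy version: a BFS-based chessboard distance transform on a small grid and a mean-distance score, with made-up contour points; real implementations would use an exact Euclidean distance transform):

```python
# Toy distance image + mean-distance verification for contour matching.
from collections import deque

def distance_transform(contour_pts, h, w):
    """BFS (chessboard metric): distance of each cell to the contour."""
    INF = float("inf")
    dist = [[INF] * w for _ in range(h)]
    q = deque()
    for r, c in contour_pts:
        dist[r][c] = 0
        q.append((r, c))
    while q:
        r, c = q.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and dist[nr][nc] == INF:
                    dist[nr][nc] = dist[r][c] + 1
                    q.append((nr, nc))
    return dist

def match_score(dist, candidate_pts):
    """Mean distance of a candidate contour: 0 means a perfect match."""
    return sum(dist[r][c] for r, c in candidate_pts) / len(candidate_pts)

# Made-up template contour (a small square) and a shifted candidate.
template = [(1, 1), (1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2), (3, 3)]
dist = distance_transform(template, 6, 6)
shifted = [(r + 1, c) for r, c in template]
```

A candidate identical to the template scores 0, while the shifted candidate scores higher, so poor RST hypotheses can be rejected with only additions and a division.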

  1. Formative research to develop theory-based messages for a Western Australian child drowning prevention television campaign: study protocol

    PubMed Central

    Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine

    2016-01-01

    Introduction Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621

  2. Agent-based method for distributed clustering of textual information

    DOEpatents

    Potok, Thomas E [Oak Ridge, TN; Reed, Joel W [Knoxville, TN; Elmore, Mark T [Oak Ridge, TN; Treadwell, Jim N [Louisville, TN

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term and identifying the documents found in the search, and displays the documents in a clustering display (80) of similarity so as to indicate similarity of the documents to each other.
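    The record does not specify how the cluster agents compute similarity; a minimal sketch using term-count document vectors and cosine similarity (the threshold and helper names are assumptions) might look like:

```python
import math
from collections import Counter

def doc_vector(text):
    """Term-count vector for a document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def route(new_doc, clusters, threshold=0.3):
    """Return the index of the most similar cluster, or -1 to signal that
    a new cluster agent should be created (best similarity below threshold)."""
    v = doc_vector(new_doc)
    scores = [cosine(v, c) for c in clusters]
    best = max(range(len(scores)), key=scores.__getitem__) if scores else -1
    return best if best >= 0 and scores[best] >= threshold else -1

clusters = [doc_vector("nuclear reactor physics"),
            doc_vector("text mining agents")]
print(route("distributed agents for text clustering", clusters))  # 1
```

    In the patented system this comparison is distributed: each cluster agent scores the new document vector against its own holdings and returns the value upstream to the multiplexing agent.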

  3. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, Yanbing

    2009-09-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This leads to great differences in data format, semantics and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, the paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on the postal code is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code can enhance the extensibility and flexibility of address geocoding.
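    A minimal sketch of the three-part code composition (absolute + relative + extended) described above; the field widths, eight direction sectors and 500 m distance bands are illustrative assumptions, not the paper's actual standard:

```python
def make_address_code(postal_code, direction, distance_m, extension=""):
    """Compose a rural address code from an absolute part (postal code),
    a relative part (direction sector + distance band), and an optional
    extended part for extensibility."""
    directions = {"N": 0, "NE": 1, "E": 2, "SE": 3,
                  "S": 4, "SW": 5, "W": 6, "NW": 7}
    band = min(distance_m // 500, 99)   # 500 m distance bands, capped at 99
    rel = f"{directions[direction]}{band:02d}"
    return f"{postal_code}-{rel}" + (f"-{extension}" if extension else "")

print(make_address_code("100096", "NE", 1200))      # 100096-102
print(make_address_code("100096", "S", 300, "A1"))  # 100096-400-A1
```

    The point of the layered structure is that the absolute part stays stable (postal zones rarely change), the relative part is human-locatable, and the extended part absorbs future needs.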

  4. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This leads to great differences in data format, semantics and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, the paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on the postal code is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code can enhance the extensibility and flexibility of address geocoding.

  5. Informative Parameters of Dynamic Geo-electricity Methods

    NASA Astrophysics Data System (ADS)

    Tursunmetov, R.

    With the growing complexity of geological tasks and the need to reveal anomalous zones connected with ore, oil, gas and water availability, methods of dynamic geo-electricity have come into use. In these methods the geological environment is considered as an interphase, irregular medium. The main dynamic element of this environment is the double electric layer, which develops on the boundary between the solid and liquid phases. In ore- or water-saturated environments, double electric layers become electrochemically or electrokinetically active elements of the geo-electric environment, which in turn form a natural electric field. This field influences the distribution of an artificially created field, and their interaction has a complicated superpositional or non-linear character. The geological environment is therefore considered an active one, able to accumulate and transform artificially superposed fields. Its main dynamic property is the non-linear behaviour of specific electric resistance and soil polarization with respect to current density and measurement frequency, which serve as informative parameters for dynamic geo-electricity methods. The study of the electric properties of disperse soils in the impulse-frequency regime, together with the temporal and frequency characteristics of the electric field, is of main interest for defining geo-electric anomalies. The study of the volt-ampere characteristics of the electromagnetic field has great practical significance; these characteristics are determined by electrochemically active ore- and water-saturated fields. The parameters depend on the polarity of the initiated field, in particular on the character, composition and mineralization of the ore-saturated zone and on the availability of a natural electric field under cathode and anode mineralization. The non-linear behaviour of the environment's dynamic properties affects the structure of the initiated field, which allows the location of anomalous zones to be defined. Finally, the study of the dynamic properties of soil anisotropy in space will allow filtration flows to be identified.

  6. Theory-based Bayesian models of inductive learning and reasoning.

    PubMed

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
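    The core idea, Bayesian inference over a structured hypothesis space combined with statistical evidence (the size principle: smaller consistent hypotheses receive sharper likelihoods), can be illustrated with a toy number-concept example; the hypotheses and data here are illustrative, not the authors' stimuli:

```python
from fractions import Fraction

# Hypothesis space structured by domain knowledge: candidate number concepts.
hypotheses = {
    "even":           set(range(2, 101, 2)),
    "multiple_of_10": set(range(10, 101, 10)),
    "powers_of_2":    {2, 4, 8, 16, 32, 64},
}

def posterior(data, hyps):
    """Bayesian induction with a uniform prior and the size principle:
    likelihood of each observation under hypothesis h is 1/|h| if h
    contains it, else 0."""
    scores = {}
    for name, h in hyps.items():
        if all(x in h for x in data):
            scores[name] = Fraction(1, len(h)) ** len(data)
        else:
            scores[name] = Fraction(0)
    z = sum(scores.values())
    return {n: s / z for n, s in scores.items()}

post = posterior([2, 8, 64], hypotheses)
print(max(post, key=post.get))  # powers_of_2
```

    Both "even" and "powers_of_2" are logically consistent with the data, but the smaller, theory-derived hypothesis wins sharply, which is the interplay of structured knowledge and statistics the authors argue for.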

  7. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  8. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1986-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  9. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, Tomas B.

    1989-01-01

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby.

  10. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1989-01-24

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  11. Remote multi-position information gathering system and method

    DOEpatents

    Hirschfeld, T.B.

    1986-12-02

    A technique for gathering specific information from various remote locations, especially fluorimetric information characteristic of particular materials at the various locations is disclosed herein. This technique uses a single source of light disposed at still a different, central location and an overall optical network including an arrangement of optical fibers cooperating with the light source for directing individual light beams into the different information bearing locations. The incoming light beams result in corresponding displays of light, e.g., fluorescent light, containing the information to be obtained. The optical network cooperates with these light displays at the various locations for directing outgoing light beams containing the same information as their cooperating displays from these locations to the central location. Each of these outgoing beams is applied to a detection arrangement, e.g., a fluorescence spectroscope, for retrieving the information contained thereby. 9 figs.

  12. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method

    PubMed Central

    2017-01-01

    Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant.
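    The abstract does not define the relevance ratio R; one plausible reading, the fraction of ratings in which clinicians selected a given item, can be sketched as follows (the definition and data are illustrative assumptions):

```python
def relevance_ratio(ratings, item):
    """Hypothetical relevance ratio R for one IAM item: the fraction of
    all ratings in which clinicians selected that item."""
    selected = sum(1 for r in ratings if item in r)
    return selected / len(ratings)

# Each rating is the set of IAM items a clinician ticked for one email.
ratings = [{"relevant", "will_use"}, {"relevant"},
           {"no_benefit"}, {"relevant"}]
print(relevance_ratio(ratings, "relevant"))  # 0.75
```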

  13. Informal Learning of Social Workers: A Method of Narrative Inquiry

    ERIC Educational Resources Information Center

    Gola, Giancarlo

    2009-01-01

    Purpose: The purpose of this paper is to investigate social workers' processes of informal learning, through their narration of their professional experience, in order to understand how social workers learn. Informal learning is any individual practice or activity that is able to produce continuous learning; it is often non-intentional and…

  14. Method and system for analyzing and classifying electronic information

    DOEpatents

    McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.

    2003-04-29

    A data analysis and classification system that reads the electronic information, analyzes the electronic information according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry of the hash table contains a concept corresponding to a word or phrase which the system has previously encountered. The system creates an object model based on the user-defined logical associations, used for reviewing each concept contained in the electronic information in order to determine whether the electronic information is classified. The data analysis and classification system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system can not find the electronic information token in the hash table, that token is added to a missing terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.
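    The extract/lookup/propagate flow can be sketched minimally; the rule representation below (a set of required concepts per label) is an assumption, since the patent's object model is richer than a flat rule list:

```python
def classify(text, concept_table, rules):
    """Extract tokens, look each up in the concept hash table, collect
    unknown tokens in a missing-terms list, and fire the first rule whose
    required concepts were all seen."""
    seen, missing = set(), []
    for token in text.lower().split():
        concept = concept_table.get(token)
        if concept is None:
            missing.append(token)   # token not in hash table
        else:
            seen.add(concept)
    label = next((lbl for lbl, required in rules if required <= seen),
                 "unclassified")
    return label, missing

concept_table = {"uranium": "NUCLEAR", "reactor": "NUCLEAR",
                 "budget": "FINANCE"}
rules = [("restricted", {"NUCLEAR"})]  # any NUCLEAR concept => restricted

print(classify("reactor budget report", concept_table, rules))
# ('restricted', ['report'])
```

    The missing-terms list mirrors the patent's handling of tokens absent from the hash table, which lets the concept table grow as new vocabulary is encountered.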

  15. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  16. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
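    The key economy of the control-theory (adjoint) approach is that gradient information costs roughly one extra solve rather than one solve per design variable. This can be illustrated on a toy inverse-design problem where a fixed linear map stands in for the Euler flow solver (everything here is a simplified stand-in, not the authors' formulation):

```python
import numpy as np

# Toy inverse design: choose shape parameters b so the computed "pressure"
# A @ b matches a target, minimizing I(b) = 0.5 * ||A @ b - p_target||^2.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
b_true = np.array([0.5, -1.0, 2.0])
p_target = A @ b_true

def gradient(b):
    residual = A @ b - p_target   # "flow solution" mismatch
    return A.T @ residual         # adjoint-style gradient: one extra product

b = np.zeros(3)
for _ in range(300):              # steepest descent with a fixed step
    b -= 0.1 * gradient(b)

print(np.allclose(b, b_true))  # True
```

    In the paper the gradient feeds a standard numerical optimizer; here plain steepest descent suffices to recover the target parameters.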

  17. Contextualized theory-based predictors of intention to practice monogamy among adolescents in Botswana junior secondary schools: Results of focus group sessions and a cross-sectional study.

    PubMed

    Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G Anita

    2016-01-01

    Culture and tradition influence behaviour. Multiple and concurrent sexual partnerships are held responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized "Theory of Planned Behaviour" was used to identify predictors of the intention to practice monogamy. A mixed methods design was used: qualitative data came from focus groups and stories, and quantitative data from a survey. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic beliefs and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative approaches with quantitative approaches to inform the development of theory-based, culturally appropriate and context-specific intervention strategies to reduce the risk of HIV.

  18. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms suffer from a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. Danger theory emphasizes that danger signals generated by changes in the environment guide different levels of immune response, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibody concentrations through their own danger signals and then triggers immune responses of self-regulation, so population diversity can be maintained. Experimental results show that the algorithm has advantages in solution quality and population diversity. Compared with the influential optimization algorithms CLONALG, opt-aiNet, and dopt-aiNet, the algorithm achieves smaller error values and higher success rates and can find solutions that meet the required accuracies within the specified number of function evaluations. PMID:23483853

  19. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  20. Information Theory: A Method for Human Communication Research.

    ERIC Educational Resources Information Center

    Black, John W.

    This paper describes seven experiments related to human communication research. The first two experiments discuss studies treating the aural responses of listeners. The third experiment was undertaken to estimate the information of sounds and diagrams which might lead to an estimate of the redundancy ascribed to the phonetic structure of words. A…

  1. Paper Trail: One Method of Information Literacy Assessment

    ERIC Educational Resources Information Center

    Nutefall, Jennifer

    2004-01-01

    Assessing students' information literacy skills can be difficult depending on the involvement of the librarian in a course. To overcome this, librarians created an assignment called the Paper Trail, where students wrote a short essay about their research process and reflected on what they would do differently. Through reviewing and grading these…

  2. A multi-method approach to evaluate health information systems.

    PubMed

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is a challenge to weigh the contribution of the various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and the author's own practice of evaluating health information systems, the author proposes a multi-method approach that incorporates both quantitative and qualitative measurement and is centered on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of a HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  3. Statistical methods of combining information: Applications to sensor data fusion

    SciTech Connect

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.
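    A classical statistical approach to combining estimates from multiple sources, likely among those reviewed, is inverse-variance weighting; a minimal sketch:

```python
def combine(estimates):
    """Inverse-variance weighting: fuse independent estimates given as
    (value, variance) pairs into a single minimum-variance estimate."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * x for w, (x, _) in zip(weights, estimates)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Two sensors measuring the same quantity: the fused estimate leans toward
# the more precise sensor and has lower variance than either alone.
print(combine([(10.0, 1.0), (12.0, 4.0)]))  # (10.4, 0.8)
```

    The fused variance (0.8) is below both input variances, which is the core payoff of combining information from multiple sensors.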

  4. Methodical Bases for the Regional Information Potential Estimation

    ERIC Educational Resources Information Center

    Ashmarina, Svetlana I.; Khasaev, Gabibulla R.; Mantulenko, Valentina V.; Kasarin, Stanislav V.; Dorozhkin, Evgenij M.

    2016-01-01

    The relevance of the investigated problem is caused by the need to assess the implementation of informatization level of the region and the insufficient development of the theoretical, content-technological, scientific and methodological aspects of the assessment of the region's information potential. The aim of the research work is to develop a…

  5. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web site, or by writing to the TSP record keeper....

  6. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web site, or by writing to the TSP record keeper....

  7. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web site, or by writing to the TSP record keeper....

  8. Factors influencing variation in physician adenoma detection rates: a theory-based approach

    PubMed Central

    Atkins, Louise; Hunkeler, Enid M.; Jensen, Christopher D.; Michie, Susan; Lee, Jeffrey K.; Doubeni, Chyke A.; Zauber, Ann G.; Levin, Theodore R.; Quinn, Virginia P.; Corley, Douglas A.

    2015-01-01

    Background & Aims Interventions to improve physician adenoma detection rates for colonoscopy have generally not been successful and there are little data on the factors contributing to variation that may be appropriate targets for intervention. We sought to identify factors that may influence variation in detection rates using theory-based tools for understanding behavior. Methods We separately studied gastroenterologists and endoscopy nurses at three Kaiser Permanente Northern California medical centers to identify potentially modifiable factors relevant to physician adenoma detection rate variability using structured group interviews (focus groups) and theory-based tools for understanding behavior and eliciting behavior change: the Capability, Opportunity, and Motivation behavior model; the Theoretical Domains Framework; and the Behavior Change Wheel. Results Nine factors potentially associated with detection rate variability were identified, including six related to capability (uncertainty about which types of polyps to remove; style of endoscopy team leadership; compromised ability to focus during an examination due to distractions; examination technique during withdrawal; difficulty detecting certain types of adenomas; and examiner fatigue and pain), two related to opportunity (perceived pressure due to the number of examinations expected per shift and social pressure to finish examinations before scheduled breaks or the end of a shift), and one related to motivation (valuing a meticulous examination as the top priority). Examples of potential intervention strategies are provided. Conclusions Using theory-based tools, this study identified several novel and potentially modifiable factors relating to capability, opportunity, and motivation that may contribute to adenoma detection rate variability and be appropriate targets for future intervention trials. PMID:26366787

  9. Dissolved oxygen prediction using a possibility theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, Usman T.; Valeo, Caterina

    2016-06-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living, physical and chemical attributes) as inputs to the model, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers using observations is proposed. Then an existing fuzzy neural network is modified to account for fuzzy number inputs and also uses possibility theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. Model performance is compared with a fuzzy neural network with crisp inputs, as well as with a traditional neural network. Model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent the occurrence of low DO.
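    The paper's two-step fuzzy-number construction is not detailed in the abstract; a simplified stand-in that builds a triangular fuzzy number from observations and evaluates membership (names and the median-peak choice are assumptions) might look like:

```python
def triangular_fuzzy(observations):
    """Build a triangular fuzzy number (a, m, b) from data: the peak m is
    the sample median, the support spans the observed min..max."""
    xs = sorted(observations)
    n = len(xs)
    peak = xs[n // 2] if n % 2 else 0.5 * (xs[n // 2 - 1] + xs[n // 2])
    return (xs[0], peak, xs[-1])

def membership(x, tfn):
    """Degree to which x belongs to the triangular fuzzy number (a, m, b)."""
    a, m, b = tfn
    if x <= a or x >= b:
        return 1.0 if x == m else 0.0
    return (x - a) / (m - a) if x < m else (b - x) / (b - m)

# Dissolved-oxygen observations (mg/L, illustrative values).
tfn = triangular_fuzzy([6.1, 7.0, 7.4, 8.2, 9.3])
print(tfn)                   # (6.1, 7.4, 9.3)
print(membership(7.4, tfn))  # 1.0
```

    Fuzzy inputs of this kind let the network carry observation uncertainty through to the output, where a defuzzification step yields the low-DO risk estimate.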

  10. Theory-based metrological traceability in education: A reading measurement network.

    PubMed

    Fisher, William P; Stenner, A Jackson

    2016-10-01

    Huge resources are invested in metrology and standards in the natural sciences, engineering, and across a wide range of commercial technologies. Significant positive returns of human, social, environmental, and economic value on these investments have been sustained for decades. Proven methods for calibrating test and survey instruments in linear units are readily available, as are data- and theory-based methods for equating those instruments to a shared unit. Using these methods, metrological traceability is obtained in a variety of commercially available elementary and secondary English and Spanish language reading education programs in the U.S., Canada, Mexico, and Australia. Given established historical patterns, widespread routine reproduction of predicted text-based and instructional effects expressed in a common language and shared frame of reference may lead to significant developments in theory and practice. Opportunities for systematic implementations of teacher-driven lean thinking and continuous quality improvement methods may be of particular interest and value.

  11. Information storage medium and method of recording and retrieving information thereon

    DOEpatents

    Marchant, D. D.; Begej, Stefan

    1986-01-01

    Information storage medium comprising a semiconductor doped with first and second impurities or dopants. Preferably, one of the impurities is introduced by ion implantation. Conductive electrodes are photolithographically formed on the surface of the medium. Information is recorded on the medium by selectively applying a focused laser beam to discrete regions of the medium surface so as to anneal discrete regions of the medium containing lattice defects introduced by the ion-implanted impurity. Information is retrieved from the storage medium by applying a focused laser beam to annealed and non-annealed regions so as to produce a photovoltaic signal at each region.

  12. An efficient steganography method for hiding patient confidential information.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed; Nguyen, Hung

    2014-01-01

    This paper deals with the important issue of security and confidentiality of patient information when exchanging or storing medical images. Steganography has recently been viewed as an alternative or complement to cryptography, as existing cryptographic systems are not perfect due to their vulnerability to certain types of attack. We propose in this paper a new steganography algorithm for hiding patient confidential information. It utilizes Pixel Value Differencing (PVD) to identify contrast regions in the image and a Hamming code that embeds 3 secret message bits into 4 bits of the cover image. In order to preserve the content of the region of interest (ROI), the embedding is only performed using the Region of Non-Interest (RONI).
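The record does not spell out how the Hamming code is applied, and the paper's exact 3-bits-into-4 scheme is not reproduced here. As an illustrative sketch of the same idea, standard matrix embedding with the [7,4] Hamming code hides 3 message bits in 7 cover-pixel LSBs while flipping at most one of them:

```python
# Parity-check matrix of the [7,4] Hamming code: column j equals j in binary.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(bits):
    """3-bit syndrome H * bits (mod 2)."""
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def embed(cover_bits, msg_bits):
    """Matrix embedding: flip at most one of 7 cover bits so that the
    syndrome of the stego bits equals the 3-bit message."""
    stego = list(cover_bits)
    diff = [s ^ m for s, m in zip(syndrome(stego), msg_bits)]
    idx = diff[0] * 4 + diff[1] * 2 + diff[2]  # 1-based column index; 0 = no change
    if idx:
        stego[idx - 1] ^= 1
    return stego

def extract(stego_bits):
    """The receiver recovers the message as the syndrome alone."""
    return syndrome(stego_bits)
```

The receiver needs only the stego bits and H, which is the property that makes syndrome coding attractive for embedding in RONI pixels.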

  13. Methods of information theory and algorithmic complexity for network biology.

    PubMed

    Zenil, Hector; Kiani, Narsis A; Tegnér, Jesper

    2016-03-01

    We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdös-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
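As a minimal illustration of the two kinds of measures the survey contrasts, the sketch below computes the Shannon entropy of a graph's degree distribution and uses lossless compression as a crude, computable stand-in for algorithmic (Kolmogorov) complexity. The dense adjacency-matrix representation and the choice of zlib are assumptions made for illustration:

```python
import math
import zlib
from collections import Counter

def degree_entropy(adj):
    """Shannon entropy (bits) of the degree distribution of a graph
    given as a 0/1 adjacency matrix (list of lists)."""
    degrees = [sum(row) for row in adj]
    n = len(degrees)
    counts = Counter(degrees)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compress_ratio(adj):
    """Compressed size / raw size of the flattened adjacency matrix:
    a crude, computable proxy for algorithmic complexity (true
    Kolmogorov complexity is uncomputable)."""
    raw = bytes(b for row in adj for b in row)
    return len(zlib.compress(raw)) / len(raw)
```

A regular graph (all degrees equal) has zero degree entropy and compresses well, while a random graph scores high on both, which is the kind of local/global contrast the survey quantifies.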

  14. Imaging systems and methods for obtaining and using biometric information

    DOEpatents

    McMakin, Douglas L [Richland, WA; Kennedy, Mike O [Richland, WA

    2010-11-30

    Disclosed herein are exemplary embodiments of imaging systems and methods of using such systems. In one exemplary embodiment, one or more direct images of the body of a clothed subject are received, and a motion signature is determined from the one or more images. In this embodiment, the one or more images show movement of the body of the subject over time, and the motion signature is associated with the movement of the subject's body. In certain implementations, the subject can be identified based at least in part on the motion signature. Imaging systems for performing any of the disclosed methods are also disclosed herein. Furthermore, the disclosed imaging, rendering, and analysis methods can be implemented, at least in part, as one or more computer-readable media comprising computer-executable instructions for causing a computer to perform the respective methods.

  15. Implementing shared decision-making in nutrition clinical practice: A theory-based approach and feasibility study

    PubMed Central

    Desroches, Sophie; Gagnon, Marie-Pierre; Tapp, Sylvie; Légaré, France

    2008-01-01

    Background There are a growing number of dietary treatment options to choose from for the management of many chronic diseases. Shared decision making represents a promising approach to improve the quality of the decision-making process, so that dietary choices are informed by the best evidence and based on patient values. However, there are no studies reporting on theory-based approaches that foster the implementation of shared decision making in health professions allied to medicine. The objectives of this study are to explore the integration of shared decision making within real nutritional consultations, and to design questionnaires to assess dieticians' intention to adopt two specific behaviors related to shared decision making using the Theory of Planned Behavior. Methods Forty dieticians will audiotape one clinical encounter to explore the presence of shared decision making within the consultation. They will also participate in one of five to six focus groups that aim to identify the salient beliefs underlying the determinants of their intention to present evidence-based dietary treatment options to their patients, and clarify the values related to dietary choices that are important to their patients. These salient beliefs will be used to develop the items of two questionnaires. The internal consistency of theoretical constructs and the temporal stability of their measurement will be checked using the test-retest method by asking 35 dieticians to complete the questionnaire twice within a two-week interval. Discussion The proposed research project will be the first study to: provide preliminary data about the adoption of shared decision making by dieticians and their patients; elicit dieticians' salient beliefs regarding the intention to adopt shared decision making behaviors; report on the development of a specific questionnaire; explore dieticians' views on the implementation of shared decision making; and compare their views regarding the implementation of

  16. A High Accuracy Method for Semi-supervised Information Extraction

    SciTech Connect

    Tratz, Stephen C.; Sanfilippo, Antonio P.

    2007-04-22

    Customization to specific domains of discourse and/or user requirements is one of the greatest challenges for today’s Information Extraction (IE) systems. While demonstrably effective, both rule-based and supervised machine learning approaches to IE customization pose too high a burden on the user. Semi-supervised learning approaches may in principle offer a more resource-effective solution but are still insufficiently accurate to permit realistic application. We demonstrate that this limitation can be overcome by integrating fully-supervised learning techniques within a semi-supervised IE approach, without increasing resource requirements.
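The paper's own integration strategy is not given in the record; a generic self-training loop, with a simple nearest-centroid learner standing in for the fully-supervised component and a distance margin standing in for confidence, can sketch the semi-supervised idea. All names and thresholds here are illustrative assumptions:

```python
import math

def centroid_fit(X, y):
    """Mean vector per class; a stand-in for a supervised learner."""
    cent = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        cent[label] = tuple(sum(c) / len(pts) for c in zip(*pts))
    return cent

def predict_with_margin(cent, x):
    """Predicted label plus a confidence proxy: the gap between the
    distances to the two nearest class centroids."""
    dists = sorted((math.dist(x, c), lab) for lab, c in cent.items())
    return dists[0][1], dists[1][0] - dists[0][0]

def self_train(X_lab, y_lab, X_unlab, threshold=1.0, rounds=5):
    """Semi-supervised loop: repeatedly train the supervised learner
    and absorb unlabelled points it classifies with a high margin."""
    X, y = list(X_lab), list(y_lab)
    pool = list(X_unlab)
    for _ in range(rounds):
        cent = centroid_fit(X, y)
        keep = []
        for x in pool:
            label, margin = predict_with_margin(cent, x)
            if margin >= threshold:
                X.append(x)
                y.append(label)
            else:
                keep.append(x)
        pool = keep
    return centroid_fit(X, y)
```

The confidence threshold is the lever: set too low, label noise leaks in (the accuracy problem the abstract describes); the paper's contribution is, in effect, a better-grounded supervised component inside this loop.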

  17. A Method to Measure the Amount of Battlefield Situation Information

    DTIC Science & Technology

    2014-06-01

    3.2 Measurement of trends information: Kierkegaard once said, "Life can only be understood backwards, but it must be lived forwards" [8].

  18. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  19. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  20. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  1. A Method of Surrogate Model Construction which Leverages Lower-Fidelity Information using Space Mapping Techniques

    DTIC Science & Technology

    2014-03-27

    A Method of Surrogate Model Construction which Leverages Lower-Fidelity Information using Space Mapping Techniques. Thesis presented to the Faculty, Department of Aeronautics and Astronautics, Graduate School of... Approved for public release; distribution is unlimited.

  2. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  3. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  4. 10 CFR 207.3 - Method of collecting energy information under ESECA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Method of collecting energy information under ESECA. 207.3 Section 207.3 Energy DEPARTMENT OF ENERGY OIL COLLECTION OF INFORMATION Collection of Information Under the Energy Supply and Environmental Coordination Act of 1974 § 207.3 Method of collecting...

  5. Testing a theory-based mobility monitoring protocol using in-home sensors: a feasibility study.

    PubMed

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J

    2013-10-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person ages. The purpose of this mixed-methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assess the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial, and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3-month, and 6-month visits. Semi-structured interviews to characterize acceptability of the technology were conducted at the 3-month and 6-month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation.

  6. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Riley, Tom; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during Fiscal Year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in probabilistic risk assessment (PRA) analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received on the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.

  7. Similarity theory based on the Dougherty-Ozmidov length scale

    NASA Astrophysics Data System (ADS)

    Grachev, Andrey A.; Andreas, Edgar L.; Fairall, Christopher W.; Guest, Peter S.; Persson, P. Ola G.

    2015-07-01

    A local similarity theory is suggested based on the Brunt-Väisälä frequency and the dissipation rate of turbulent kinetic energy instead of the turbulent fluxes used in traditional Monin-Obukhov similarity theory. Based on dimensional analysis (the Pi theorem), it is shown that any properly scaled statistic of the small-scale turbulence is a universal function of a stability parameter defined as the ratio of a reference height z to the Dougherty-Ozmidov length scale, which in the limit of z-less stratification is linearly proportional to the Obukhov length scale. Measurements of atmospheric turbulence made at five levels on a 20-m tower over the Arctic pack ice during the Surface Heat Budget of the Arctic Ocean experiment (SHEBA) are used to examine the behaviour of different similarity functions in the stable boundary layer. It is found that in the framework of this approach the non-dimensional turbulent viscosity is equal to the gradient Richardson number, whereas the non-dimensional turbulent thermal diffusivity is equal to the flux Richardson number. These results are a consequence of the approximate local balance between production of turbulence by the mean flow shear and viscous dissipation. The turbulence framework based on the Brunt-Väisälä frequency and the dissipation rate of turbulent kinetic energy may have practical advantages for estimating turbulence when the fluxes are not directly available.
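The Dougherty-Ozmidov length scale itself follows directly from the two quantities the theory is built on, L_O = (epsilon / N^3)^(1/2); a minimal sketch (variable names assumed, SI units):

```python
import math

def ozmidov_length(epsilon, N):
    """Dougherty-Ozmidov length scale L_O = (epsilon / N^3)^(1/2),
    where epsilon is the TKE dissipation rate (m^2 s^-3) and N is
    the Brunt-Vaisala frequency (s^-1)."""
    return math.sqrt(epsilon / N ** 3)

def stability_parameter(z, epsilon, N):
    """Ratio of measurement height z (m) to the Ozmidov scale,
    the stability parameter proposed by the similarity theory."""
    return z / ozmidov_length(epsilon, N)
```

For representative stable-boundary-layer values (epsilon = 1e-4 m^2 s^-3, N = 0.02 s^-1), L_O is a few metres, so a 10-m observation level already sits well above the scale of overturning eddies.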

  8. Evaluating participatory decision processes: which methods inform reflective practice?

    PubMed

    Kaufman, Sanda; Ozawa, Connie P; Shmueli, Deborah F

    2014-02-01

    Evaluating participatory decision processes serves two key purposes: validating the usefulness of specific interventions for stakeholders, interveners and funders of conflict management processes, and improving practice. However, evaluation design remains challenging, partly because when attempting to serve both purposes we may end up serving neither well. In fact, the better we respond to one, the less we may satisfy the other. Evaluations tend to focus on endogenous factors (e.g., stakeholder selection, BATNAs, mutually beneficial tradeoffs, quality of the intervention, etc.), because we believe that the success of participatory decision processes hinges on them, and they also seem to lend themselves to caeteris paribus statistical comparisons across cases. We argue that context matters too and that contextual differences among specific cases are meaningful enough to undermine conclusions derived solely from comparisons of process-endogenous factors implicitly rooted in the caeteris paribus assumption. We illustrate this argument with an environmental mediation case. We compare data collected about it through surveys geared toward comparability across cases to information elicited through in-depth interviews geared toward case specifics. The surveys, designed by the U.S. Institute of Environmental Conflict Resolution, feed a database of environmental conflicts that can help make the (statistical) case for intervention in environmental conflict management. Our interviews elicit case details - including context - that enable interveners to link context specifics and intervention actions to outcomes. We argue that neither approach can "serve both masters."

  9. Examination of an Electronic Patient Record Display Method to Protect Patient Information Privacy.

    PubMed

    Niimi, Yukari; Ota, Katsumasa

    2017-02-01

    Electronic patient records facilitate the provision of safe, high-quality medical care. However, because personnel can view almost all stored information, this study designed a display method that uses a mosaic blur (pixelation) to temporarily conceal information patients do not want shared, balancing the patient's desire for personal information protection against the need for information sharing among medical personnel. First, medical personnel were interviewed about the degree of information required for both individual duties and team-based care. Subsequently, they tested a mock display method that partially concealed information using a mosaic blur, and they were interviewed about the effectiveness of the display method in ensuring patient privacy. Participants better understood patients' demand for confidentiality, suggesting increased awareness of patients' privacy protection. However, participants also indicated that temporary concealment of certain information was problematic. Other issues included the inconvenience of removing the mosaic blur to obtain required information and the risk of insufficient information for medical care. Despite several issues with using a display method that temporarily conceals information according to patient privacy needs, medical personnel could accept this display method if information essential to medical safety remains accessible.
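A mosaic blur of the kind described is typically just block-averaging over the concealed region; a minimal sketch on a 2-D grayscale array (the block size and integer averaging are illustrative assumptions, not the study's implementation):

```python
def pixelate(region, block=2):
    """Replace each block x block tile of a 2-D grayscale array with
    its mean value, producing the mosaic-blur effect used to
    temporarily conceal fields on screen."""
    rows, cols = len(region), len(region[0])
    out = [row[:] for row in region]
    for r0 in range(0, rows, block):
        for c0 in range(0, cols, block):
            tile = [out[r][c]
                    for r in range(r0, min(r0 + block, rows))
                    for c in range(c0, min(c0 + block, cols))]
            avg = sum(tile) // len(tile)
            for r in range(r0, min(r0 + block, rows)):
                for c in range(c0, min(c0 + block, cols)):
                    out[r][c] = avg
    return out
```

Because only the averaged tiles are rendered, the concealed text is unreadable on screen yet the original record remains intact and can be restored on demand, which matches the "temporary concealment" requirement.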

  10. A method for extracting drainage networks with heuristic information from digital elevation models.

    PubMed

    Hou, Kun; Yang, Wei; Sun, Jigui; Sun, Tieli

    2011-01-01

    Depression filling and direction assignment over flat areas are critical issues in hydrologic analysis. This paper proposes a method to handle depressions and flat areas in one procedure. Unlike traditional raster-neighbourhood processing, which uses little heuristic information, the method is designed to compensate for the inadequate search information of other methods. The proposed method routes flow through depressions and flat areas by searching for the outlet using heuristic information. Heuristic information can reveal the general trend slope of the DEM (digital elevation model) and help the proposed method find the outlet accurately. The method is implemented in Pascal and experiments are carried out on actual DEM data. Comparison with four existing methods shows that the proposed method produces a closer match to the ground-truth network. Moreover, the proposed method can avoid the generation of unrealistic parallel drainage lines, spurious drainage lines and spurious terrain features.
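The paper's heuristic is not reproduced in the record, but the general idea of routing flow out of depressions and flats by searching for the spill outlet can be sketched with a priority-queue ("priority-flood"-style) pass that processes cells from the lowest boundary inward. This is a generic illustration, not the authors' algorithm:

```python
import heapq

def priority_flood(dem):
    """Fill depressions in a DEM (list of lists of elevations) by
    processing cells from the lowest boundary cell inward, raising any
    cell below its spill elevation so water can always drain; handles
    depressions and flat areas in a single pass."""
    rows, cols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    seen = [[False] * cols for _ in range(rows)]
    heap = []
    # Seed the queue with all boundary cells (potential outlets).
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                heapq.heappush(heap, (filled[r][c], r, c))
                seen[r][c] = True
    while heap:
        elev, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not seen[nr][nc]:
                seen[nr][nc] = True
                # Raise the cell to its spill elevation if it is lower.
                filled[nr][nc] = max(filled[nr][nc], elev)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
    return filled
```

The priority queue guarantees each interior cell is first reached through its lowest possible spill path, which is exactly the outlet-search problem the heuristic in the paper is designed to speed up and disambiguate.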

  11. Navigating Longitudinal Clinical Notes with an Automated Method for Detecting New Information

    PubMed Central

    Zhang, Rui; Pakhomov, Serguei; Lee, Janet T.; Melton, Genevieve B.

    2015-01-01

    Automated methods to detect new information in clinical notes may be valuable for navigating and using information in these documents for patient care. Statistical language models were evaluated as a means to quantify new information over longitudinal clinical notes for a given patient. The new information proportion (NIP) in target notes decreased logarithmically as the number of previous notes used to create the language model increased. For a given patient, the amount of new information showed cyclic patterns. Higher NIP scores correlated with notes containing more new information, often associated with clinically significant events, and lower NIP scores indicated notes with less new information. Our analysis also revealed "copying and pasting" to be widely used in generating clinical notes, carrying information forward from the most recent historical notes. These methods can potentially aid clinicians in finding notes with more clinically relevant new information and in reviewing notes more purposefully, which may increase the efficiency of clinicians in delivering patient care. PMID:23920658
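A toy version of NIP can be sketched with a unigram model: the proportion of target-note tokens never seen in the patient's earlier notes. The real study used statistical language models over longitudinal notes; the whitespace tokenization and unseen-token criterion here are simplifying assumptions:

```python
from collections import Counter

def new_information_proportion(history_notes, target_note):
    """Crude NIP: the fraction of target-note tokens that a unigram
    model built from the patient's previous notes has never seen."""
    model = Counter()
    for note in history_notes:
        model.update(note.lower().split())
    tokens = target_note.lower().split()
    if not tokens:
        return 0.0
    unseen = sum(1 for t in tokens if t not in model)
    return unseen / len(tokens)
```

A copy-pasted note scores near 0 and a note documenting a new event scores near 1, the contrast the study exploits to flag clinically significant notes.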

  12. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research, and how to make full use of subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method with subjective and objective information is proposed. We use the combination weighting method to determine the index weight: the analytic hierarchy process (AHP) method is applied to process the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. The linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
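The objective half of the weighting can be sketched with the CRITIC idea the abstract names: an index earns weight from its contrast (standard deviation) times its conflict (low correlation) with the other indices. The sketch assumes the decision-matrix columns are already normalized:

```python
import math
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

def critic_weights(matrix):
    """CRITIC weighting: each criterion (column of the alternatives x
    criteria matrix) earns weight from its contrast (std dev) times
    its conflict with the other criteria (sum of 1 - correlation);
    assumes the columns are already normalized."""
    cols = list(zip(*matrix))
    info = [pstdev(col) * sum(1 - pearson(col, other) for other in cols)
            for col in cols]
    total = sum(info)
    return [c / total for c in info]
```

In the combination scheme the abstract describes, these objective weights would then be merged with AHP's subjective weights before the time-weighted linear aggregation.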

  13. Effective Methods for Studying Information Seeking and Use. Introduction and Overview.

    ERIC Educational Resources Information Center

    Wildemuth, Barbara M.

    2002-01-01

    In conjunction with the American Society for Information Science and Technology's (ASIST) annual meeting in fall 2001, the Special Interest Group on Information Needs, Seeking, and Use (SIG USE) sponsored a research symposium on "Effective Methods for Studying Information Seeking and Use." This article briefly reviews six articles presented at the…

  14. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

    PubMed

    Stejskal, Taryn M

    2012-01-01

    Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family member caregivers' well-being and survivors' outcomes. The idea of a continuum of care or following survivors from inpatient care to community reintegration has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, present research literature indicates that family intervention may be a key component to successful rehabilitation after brain injury. Yet, clinicians interacting with family members and survivors often feel confounded about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies will be uncovered. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT) will be examined. Descriptions of common challenges families and couples face will be presented along with case examples to illustrate how these theoretical frameworks might be applied to these special concerns postinjury. Finally, the article concludes with an overview of the ideas presented in this manuscript to assist practitioners and systems of care in community-based settings to more effectively intervene with the family system as a whole

  15. A cloud theory-based particle swarm optimization for multiple decision maker vehicle routing problems with fuzzy random time windows

    NASA Astrophysics Data System (ADS)

    Ma, Yanfang; Xu, Jiuping

    2015-06-01

    This article puts forward a cloud theory-based particle swarm optimization (CTPSO) algorithm for solving a variant of the vehicle routing problem, namely a multiple decision maker vehicle routing problem with fuzzy random time windows (MDVRPFRTW). A new mathematical model is developed for the proposed problem in which fuzzy random theory is used to describe the time windows and bi-level programming is applied to describe the relationship between the multiple decision makers. To solve the problem, the CTPSO improves the initialization, inertia weight and particle update steps to overcome the shortcomings of the basic particle swarm optimization (PSO). Parameter tests and results analysis are presented to highlight the performance of the optimization method, and comparison of the algorithm with the basic PSO and the genetic algorithm demonstrates its efficiency.
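The cloud-model ingredient can be sketched with a normal cloud generator, where the entropy En is itself perturbed by a hyper-entropy He before each drop is drawn. Using it to randomize a decaying inertia weight is one plausible reading of the abstract; all parameter values below are assumptions:

```python
import random

def cloud_drop(Ex, En, He, rng=random):
    """One drop of a normal cloud (Ex, En, He): the entropy En is
    first perturbed by the hyper-entropy He, then the drop is drawn
    around the expectation Ex."""
    En_prime = rng.gauss(En, He)
    return rng.gauss(Ex, abs(En_prime))

def cloud_inertia(iteration, max_iter, w_max=0.9, w_min=0.4,
                  En=0.05, He=0.01):
    """Inertia weight decaying from w_max to w_min, randomized by the
    cloud model so particles keep mild stochastic diversity
    (parameter values are illustrative)."""
    Ex = w_max - (w_max - w_min) * iteration / max_iter
    return min(w_max, max(w_min, cloud_drop(Ex, En, He)))
```

The appeal of the cloud generator over a plain linear decay is that both the spread (En) and the stability of that spread (He) are tunable, giving the swarm controlled randomness rather than a deterministic schedule.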

  16. Mixed Methods Approach to Assessing an Informal Buddy Support System for Canadian Forces Reservists

    DTIC Science & Technology

    2011-04-01

    Donna I. Pickering and Tara Holton. Defence R&D Canada, Technical Memorandum DRDC Toronto TM 2011-028, April 2011.

  17. Novel lattice Boltzmann method based on integrated edge and region information for medical image segmentation.

    PubMed

    Wen, Junling; Yan, Zhuangzhi; Jiang, Jiehui

    2014-01-01

    The lattice Boltzmann (LB) method is a mesoscopic method based on kinetic theory and statistical mechanics. The main advantage of the LB method is parallel computation, which increases the speed of calculation. In the past decade, LB methods have gradually been introduced for image processing, e.g., image segmentation. However, a major shortcoming of existing LB methods is that they can only be applied to the processing of medical images with intensity homogeneity. In practice, however, many medical images possess intensity inhomogeneity. In this study, we developed a novel LB method to integrate edge and region information for medical image segmentation. In contrast to other segmentation methods, we added edge information as a relaxing factor and used region information as a source term. The proposed method facilitates the segmentation of medical images with intensity inhomogeneity and it still allows parallel computation. Preliminary tests of the proposed method are presented in this paper.

  18. Models for Theory-Based M.A. and Ph.D. Programs.

    ERIC Educational Resources Information Center

    Botan, Carl; Vasquez, Gabriel

    1999-01-01

    Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…

  19. Dynamic stepping information process method in mobile bio-sensing computing environments.

    PubMed

    Lee, Tae-Gyu; Lee, Seong-Hoon

    2014-01-01

    Recently, interest in extending healthy human longevity has been converging into a single system framework, driven by the development of mobile computing environments, the diversification of remote medical systems, and an aging society. Such a converged system enables a bioinformatics platform that provides various supplementary information services by sensing and gathering the health conditions and bio-information of mobile users to build medical information. An existing bio-information system executes a static, unchanging process once the process defined at initial system configuration has started. Such a static process is ineffective for a mobile bio-information system performing mobile computing; in particular, any change to the process configuration or method requires the inconvenient step of redefining and re-executing the process. This study proposes a dynamic process design and execution method to overcome this inefficiency.

  20. Novel Methods for Measuring Depth of Anesthesia by Quantifying Dominant Information Flow in Multichannel EEGs

    PubMed Central

    Choi, Byung-Moon; Noh, Gyu-Jeong

    2017-01-01

    In this paper, we propose novel methods for measuring depth of anesthesia (DOA) by quantifying dominant information flow in multichannel EEGs. Conventional methods mainly use a few EEG channels independently, and most multichannel EEG-based studies are limited to specific regions of the brain; the function of the cerebral cortex over wide brain regions is therefore hardly reflected in DOA measurement. Here, DOA is measured by quantifying the dominant information flow obtained from a principal bipartition. Three bipartitioning methods are used to detect the dominant information flow across all EEG channels, and the dominant information flow is quantified by calculating information entropy. High correlation between the proposed measures and the plasma concentration of propofol is confirmed by experimental results on clinical data from 39 subjects. To illustrate the performance of the proposed methods more clearly, we present the results for multichannel EEG on a two-dimensional (2D) brain map.

  1. Personality and Psychopathology: a Theory-Based Revision of Eysenck’s PEN Model

    PubMed Central

    van Kampen, Dirk

    2009-01-01

    The principal aim of this paper is to investigate whether a personality taxonomy of clinical relevance can be created out of Eysenck’s original PEN model by repairing the various shortcomings in Eysenck’s personality theory, particularly in relation to P, or Psychoticism. Reviewing three approaches that have been followed to answer the question ‘which personality factors are basic?’, we argue that the theory-informed approach originally defended by Eysenck is the one most likely to lead to scientific progress. However, given the many deficiencies in the nomological network surrounding P, we find ourselves in the peculiar position of adhering to Eysenck’s theory-informed methodology while criticizing his theory. These arguments and criticisms led to the replacement of P by three orthogonal, theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), which, together with the dimensions E (Extraversion) and N (Neuroticism) retained from Eysenck’s PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions. PMID:20498694

  2. Inter-instrumental method transfer of chiral capillary electrophoretic methods using robustness test information.

    PubMed

    De Cock, Bart; Borsuk, Agnieszka; Dejaegher, Bieke; Stiens, Johan; Mangelings, Debby; Vander Heyden, Yvan

    2014-08-01

    Capillary electrophoresis (CE) is an electrodriven separation technique that is often used for the separation of chiral molecules. Advantages of CE are its flexibility, low cost, and efficiency. On the other hand, the precision and transfer of CE methods are well-known problems of the technique. Method transfer is complicated by the more diverse instrumental differences, such as total capillary lengths and capillary cooling systems, and by the higher response variability of CE compared with other techniques, such as high-performance liquid chromatography (HPLC). Therefore, a large systematic change in peak resolutions, migration times, and peak areas, with a loss of separation and efficiency, may be seen when a CE method is transferred to another laboratory or another type of instrument. A swift and successful method transfer is required because development and routine use of analytical methods are usually not performed in the same laboratory and/or on the same type of equipment. The aim of our study was to develop transfer rules to facilitate CE method transfers between different laboratories and instruments. In our case study, three β-blockers were chirally separated and inter-instrumental transfers were performed. The first step of our study was to optimise the precision of the chiral CE method. Next, a robustness test was performed to identify the instrumental and experimental parameters that most influenced the considered responses. The precision and robustness study results were used to adapt instrumental and/or method settings to improve the transfer between different instruments. Finally, comparing adapted and non-adapted transfers allowed us to derive rules that facilitate CE method transfers.

  3. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, Cecil E.

    1990-01-01

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field.

  4. Method of bistable optical information storage using antiferroelectric phase PLZT ceramics

    DOEpatents

    Land, C.E.

    1990-07-31

    A method for bistable storage of binary optical information includes an antiferroelectric (AFE) lead lanthanum zirconate titanate (PLZT) layer having a stable antiferroelectric first phase and a ferroelectric (FE) second phase obtained by applying a switching electric field across the surface of the device. Optical information is stored by illuminating selected portions of the layer to photoactivate an FE to AFE transition in those portions. Erasure of the stored information is obtained by reapplying the switching field. 8 figs.

  5. Mathematical, Logical, and Formal Methods in Information Retrieval: An Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost

    2003-01-01

    Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)

  6. Theories and Methods for Research on Informal Learning and Work: Towards Cross-Fertilization

    ERIC Educational Resources Information Center

    Sawchuk, Peter H.

    2008-01-01

    The topic of informal learning and work has quickly become a staple in contemporary work and adult learning research internationally. The narrow conceptualization of work is briefly challenged before the article turns to a review of the historical origins as well as contemporary theories and methods involved in researching informal learning and…

  7. Basic Information for EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    Contains basic information on the role and origins of the Selected Analytical Methods, including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events.

  8. Redox potentials and pKa for benzoquinone from density functional theory based molecular dynamics.

    PubMed

    Cheng, Jun; Sulpizi, Marialore; Sprik, Michiel

    2009-10-21

    The density functional theory based molecular dynamics (DFTMD) method for the computation of redox free energies presented in previous publications and the more recent modification for computation of acidity constants are reviewed. The method uses a half reaction scheme based on reversible insertion/removal of electrons and protons. The proton insertion is assisted by restraining potentials acting as chaperones. The procedure for relating the calculated deprotonation free energies to Brønsted acidities (pK(a)) and the oxidation free energies to electrode potentials with respect to the normal hydrogen electrode is discussed in some detail. The method is validated in an application to the reduction of aqueous 1,4-benzoquinone. The conversion of hydroquinone to quinone can take place via a number of alternative pathways consisting of combinations of acid dissociations, oxidations, or dehydrogenations. The free energy changes of all elementary steps (ten in total) are computed. The accuracy of the calculations is assessed by comparing the energies of different pathways for the same reaction (Hess's law) and by comparison to experiment. This two-sided test enables us to separate the errors related with the restrictions on length and time scales accessible to DFTMD from the errors introduced by the DFT approximation. It is found that the DFT approximation is the main source of error for oxidation free energies.
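    The step from a computed deprotonation free energy to a Brønsted acidity follows the standard thermodynamic relation pKa = ΔA_dp / (RT ln 10). A minimal sketch (the numeric free energies below are illustrative, not values from the paper):

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol K)

def pka_from_free_energy(dA_kj_mol, T=298.15):
    """pKa from a deprotonation free energy dA (kJ/mol):
    pKa = dA / (R T ln 10)."""
    return dA_kj_mol / (R * T * math.log(10))

# Illustrative only: at 298.15 K, RT ln 10 ~ 5.71 kJ/mol, so a
# deprotonation free energy of ~57 kJ/mol corresponds to pKa ~ 10,
# the right order of magnitude for hydroquinone's first dissociation.
```

The analogous conversion for oxidation free energies to electrode potentials additionally requires referencing against the normal hydrogen electrode, as the abstract notes.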

  9. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, Cecil E.; McKinney, Ira D.

    1990-01-01

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk.

  10. Method and apparatus for bistable optical information storage for erasable optical disks

    DOEpatents

    Land, C.E.; McKinney, I.D.

    1988-05-31

    A method and an optical device for bistable storage of optical information, together with reading and erasure of the optical information, using a photoactivated shift in a field dependent phase transition between a metastable or a bias-stabilized ferroelectric (FE) phase and a stable antiferroelectric (AFE) phase in a lead lanthanum zirconate titanate (PLZT). An optical disk contains the PLZT. Writing and erasing of optical information can be accomplished by a light beam normal to the disk. Reading of optical information can be accomplished by a light beam at an incidence angle of 15 to 60 degrees to the normal of the disk. 10 figs.

  11. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of altitude measurement information for small unmanned aerial rotorcraft during the landing process. To address the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high-frequency noise in the sensor output. Furthermore, to improve the altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate the measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is demonstrated by static tests, hovering flight tests, and autonomous landing flight tests. PMID:23201993
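    The idea of re-estimating the measurement noise covariance online can be sketched in one dimension with an innovation-based estimator (a simplified stand-in for the paper's MAP criterion; the constant-altitude model, window size, and noise figures are assumptions):

```python
import numpy as np

def adaptive_kf_1d(z, q=1e-4, r0=1.0, window=30):
    """1-D Kalman filter that re-estimates the measurement noise
    variance R from recent innovations. Uses a constant-altitude
    process model with process noise q."""
    x, p, r = float(z[0]), 1.0, r0
    innovations = []
    for zk in z[1:]:
        p += q                          # predict (constant-altitude model)
        nu = zk - x                     # innovation
        innovations.append(nu)
        if len(innovations) >= window:
            recent = np.array(innovations[-window:])
            # innovation variance = R + P  =>  R ~ mean(nu^2) - P
            r = max(float(np.mean(recent ** 2)) - p, 1e-6)
        k = p / (p + r)                 # Kalman gain
        x += k * nu                     # measurement update
        p *= (1.0 - k)
    return x, r

rng = np.random.default_rng(0)
true_alt, noise_std = 100.0, 2.0        # invented scenario
z = true_alt + noise_std * rng.standard_normal(400)
est, r_hat = adaptive_kf_1d(z)          # est near 100, r_hat near 4
```

The filter starts with a deliberately wrong r0 and converges toward the true measurement variance, which is the practical payoff of adaptive estimation when sensor noise statistics are poorly known.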

  12. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    PubMed

    Williamson, Ross S; Sahani, Maneesh; Pillow, Jonathan W

    2015-04-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
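    The equivalence claim rests on the Poisson LNP log-likelihood. The sketch below (exponential nonlinearity, simulated data, every parameter illustrative rather than from the paper) shows the quantity whose maximization underlies MID and checks that the generating filter scores higher than an unrelated direction:

```python
import numpy as np

def lnp_loglik(k, b, X, y):
    """Poisson log-likelihood (up to a constant in y) of spike counts y
    under an LNP model with linear filter k, bias b, and exponential
    nonlinearity: rate = exp(X k + b)."""
    rate = np.exp(X @ k + b)
    return float(np.sum(y * np.log(rate) - rate))

# Simulate an LNP neuron.
rng = np.random.default_rng(42)
X = rng.standard_normal((4000, 8))                 # stimulus matrix
k_true = np.array([1.0, -0.5, 0.3, 0.0, 0.0, 0.2, 0.0, 0.0])
b_true = -1.0
y = rng.poisson(np.exp(X @ k_true + b_true))       # Poisson spike counts
k_rand = rng.standard_normal(8)                    # unrelated direction

ll_true = lnp_loglik(k_true, b_true, X, y)
ll_rand = lnp_loglik(k_rand, b_true, X, y)         # expected to be lower
```

When spiking is not Poisson (e.g., Bernoulli counts in fine time bins), maximizing this objective no longer recovers the maximally informative dimensions, which is the paper's central caveat.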

  13. Literary pedagogy in nursing: a theory-based perspective.

    PubMed

    Sakalys, Jurate A

    2002-09-01

    Using fictional and autobiographical literature in nursing education is a primary way of understanding patients' lived experiences and fostering development of essential relational and reflective thinking skills. Application of literary theory to this pedagogic practice can expand conceptualization of teaching goals, inform specific teaching strategies, and potentially contribute to socially consequential educational outcomes. This article describes a theoretical schema that focuses on pedagogical goals in terms of the three related skills (i.e., reading, interpretation, criticism) of textual competence.

  14. Assessing Bayesian model averaging uncertainty of groundwater modeling based on information entropy method

    NASA Astrophysics Data System (ADS)

    Zeng, Xiankui; Wu, Jichun; Wang, Dong; Zhu, Xiaobin; Long, Yuqiao

    2016-07-01

    Because of uncertainty in groundwater conceptualization, multi-model methods are usually used, and the corresponding uncertainties are estimated by integrating Markov chain Monte Carlo (MCMC) and Bayesian model averaging (BMA) methods. Generally, the variance method is used to measure the uncertainty of BMA predictions: the total variance of the ensemble prediction is decomposed into within-model and between-model variances, which represent the uncertainties derived from the parameters and the conceptual model, respectively. However, the uncertainty of a probability distribution cannot be comprehensively quantified by variance alone. A new measuring method based on information entropy theory is proposed in this study. Because the actual BMA process can hardly meet the ideal mutually exclusive, collectively exhaustive condition, BMA predictive uncertainty is decomposed into parameter, conceptual model, and overlapped uncertainties, where overlapped uncertainty is induced by combining predictions from correlated model structures. In this paper, five simple analytical functions are first used to illustrate the feasibility of the variance and information entropy methods. A discrete distribution example shows that information entropy can be more appropriate than variance for describing between-model uncertainty. Two continuous distribution examples show that the two methods are consistent in measuring a normal distribution, and that information entropy is more appropriate than variance for describing a bimodal distribution. The two examples of BMA uncertainty decomposition demonstrate that the two methods are relatively consistent in assessing the uncertainty of unimodal BMA predictions, while information entropy is more informative in describing the uncertainty decomposition of bimodal BMA predictions. Then, based on a synthetic groundwater model, the variance and information entropy methods are used to assess the BMA uncertainty of groundwater modeling. 
The uncertainty assessments of
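    The bimodal argument can be illustrated numerically: a bimodal mixture (as might arise from averaging two disagreeing conceptual models) and a Gaussian with the same variance are indistinguishable to the variance measure, while a histogram estimate of differential entropy separates them. The distributions below are illustrative, not the study's models:

```python
import numpy as np

def hist_entropy(samples, bins=100):
    """Histogram estimate of differential entropy (nats)."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    w = np.diff(edges)
    mask = p > 0
    return float(-np.sum(p[mask] * np.log(p[mask]) * w[mask]))

rng = np.random.default_rng(7)
n = 200_000
# Bimodal mixture of two model predictions: N(-3, 1) and N(+3, 1)
bimodal = np.where(rng.random(n) < 0.5,
                   rng.normal(-3, 1, n), rng.normal(3, 1, n))
# Unimodal Gaussian matched to the mixture's variance (~10)
unimodal = rng.normal(0, np.sqrt(bimodal.var()), n)
# Same variance, but the Gaussian has higher entropy: variance alone
# cannot distinguish these two predictive uncertainties.
```

This reflects a general fact: among distributions with a fixed variance, the Gaussian maximizes differential entropy, so entropy carries information about shape that variance discards.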

  15. A beacon configuration optimization method based on Fisher information for Mars atmospheric entry

    NASA Astrophysics Data System (ADS)

    Zhao, Zeduan; Yu, Zhengshi; Cui, Pingyuan

    2017-04-01

    The navigation capability of the proposed Mars-network-based entry navigation system is directly related to the number of beacons and the relative configuration between the beacons and the entry vehicle. In this paper, a new beacon configuration optimization method is developed based on Fisher information theory; the method is suitable for any number of visible beacons. The proposed method can be used for navigation schemes based on range measurements provided by radio transceivers or other sensors during Mars entry. The observability of a specific state is defined as its Fisher information under the observation model. The overall navigation capability is improved by maximizing the minimum average Fisher information, even when the navigation system is not fully observable. In addition, when only one beacon is available for entry navigation and the observation information is relatively limited, the optimization method can be adapted to maximize the Fisher information of a specific state preferred by the guidance and control system, improving its estimation accuracy. Finally, navigation scenarios consisting of one to three beacons are tested to validate the effectiveness of the developed optimization method. The extended Kalman filter (EKF) is employed to derive the state estimation error covariance. The results also show that zero-Fisher-information situations should be avoided, especially when the dynamic system is highly nonlinear and the states change dramatically.
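    For range-only measurements with i.i.d. Gaussian noise, the position-state Fisher information has a simple outer-product form, J = Σ u_i u_iᵀ / σ², with u_i the unit line-of-sight vector to beacon i. The sketch below (geometry and noise figure invented) shows why one beacon leaves the minimum eigenvalue at zero while three well-spread beacons make it strictly positive:

```python
import numpy as np

def range_fim(vehicle, beacons, sigma=1.0):
    """Fisher information matrix of 3-D vehicle position for range-only
    measurements with i.i.d. Gaussian noise of std sigma."""
    J = np.zeros((3, 3))
    for b in np.atleast_2d(beacons):
        u = b - vehicle
        u = u / np.linalg.norm(u)       # unit line-of-sight vector
        J += np.outer(u, u) / sigma**2
    return J

vehicle = np.array([0.0, 0.0, 10.0])            # invented geometry
one_beacon = np.array([[5.0, 0.0, 0.0]])
three_beacons = np.array([[5.0, 0.0, 0.0],
                          [-4.0, 3.0, 0.0],
                          [0.0, -6.0, 1.0]])

f1 = np.linalg.eigvalsh(range_fim(vehicle, one_beacon)).min()
f3 = np.linalg.eigvalsh(range_fim(vehicle, three_beacons)).min()
# f1 ~ 0: a single range leaves two directions unobserved.
# f3 > 0: independent lines of sight make all position states observable.
```

Maximizing the minimum eigenvalue (or, as in the abstract, the minimum average Fisher information) is then a natural criterion for placing beacons.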

  16. Game theory-based mode cooperative selection mechanism for device-to-device visible light communication

    NASA Astrophysics Data System (ADS)

    Liu, Yuxin; Huang, Zhitong; Li, Wei; Ji, Yuefeng

    2016-03-01

    Various patterns of device-to-device (D2D) communication, from Bluetooth to Wi-Fi Direct, are emerging due to the increasing requirements of information sharing between mobile terminals. This paper presents an innovative pattern named device-to-device visible light communication (D2D-VLC) to alleviate the growing traffic problem. However, the occlusion problem is a difficulty in D2D-VLC. This paper proposes a game theory-based solution in which the best-response dynamics and best-response strategies are used to realize a mode-cooperative selection mechanism. This mechanism uses system capacity as the utility function to optimize system performance and selects the optimal communication mode for each active user from three candidate modes. Moreover, the simulation and experimental results show that the mechanism can attain a significant improvement in terms of effectiveness and energy saving compared with the cases where the users communicate via only the fixed transceivers (light-emitting diode and photo diode) or via only D2D.
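    Best-response dynamics for mode selection can be sketched with a toy two-user game; the payoff table below is invented for illustration and is not the paper's capacity model:

```python
MODES = ["fixed", "d2d", "relay"]

def capacity(me, other):
    """Toy per-user capacity (arbitrary units): D2D is attractive unless
    both users load the D2D channel, in which case fixed wins."""
    table = {
        ("d2d", "d2d"): 2.0, ("d2d", "fixed"): 5.0, ("d2d", "relay"): 4.0,
        ("fixed", "d2d"): 3.0, ("fixed", "fixed"): 3.0, ("fixed", "relay"): 3.0,
        ("relay", "d2d"): 2.5, ("relay", "fixed"): 4.0, ("relay", "relay"): 1.0,
    }
    return table[(me, other)]

def best_response(other):
    return max(MODES, key=lambda m: capacity(m, other))

# Best-response dynamics: update each user in turn until a fixed point.
a, b = "fixed", "fixed"
for _ in range(20):
    a2 = best_response(b)
    b2 = best_response(a2)
    if (a2, b2) == (a, b):
        break
    a, b = a2, b2

# At the fixed point, no unilateral deviation improves either capacity,
# i.e., (a, b) is a pure-strategy Nash equilibrium.
is_ne = (all(capacity(a, b) >= capacity(m, b) for m in MODES)
         and all(capacity(b, a) >= capacity(m, a) for m in MODES))
```

With many users and realistic capacity functions the same loop structure applies, which is the core of a best-response-based mode-cooperative selection mechanism.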

  17. Development of StopAdvisor: A theory-based interactive internet-based smoking cessation intervention.

    PubMed

    Michie, Susan; Brown, Jamie; Geraghty, Adam W A; Miller, Sascha; Yardley, Lucy; Gardner, Benjamin; Shahab, Lion; McEwen, Andy; Stapleton, John A; West, Robert

    2012-09-01

    Reviews of internet-based behaviour-change interventions have shown that they can be effective but there is considerable heterogeneity and effect sizes are generally small. In order to advance science and technology in this area, it is essential to be able to build on principles and evidence of behaviour change in an incremental manner. We report the development of an interactive smoking cessation website, StopAdvisor, designed to be attractive and effective across the social spectrum. It was informed by a broad motivational theory (PRIME), empirical evidence, web-design expertise, and user-testing. The intervention was developed using an open-source web-development platform, 'LifeGuide', designed to facilitate optimisation and collaboration. We identified 19 theoretical propositions, 33 evidence- or theory-based behaviour change techniques, 26 web-design principles and nine principles from user-testing. These were synthesised to create the website, 'StopAdvisor' (see http://www.lifeguideonline.org/player/play/stopadvisordemonstration). The systematic and transparent application of theory, evidence, web-design expertise and user-testing within an open-source development platform can provide a basis for multi-phase optimisation contributing to an 'incremental technology' of behaviour change.

  18. Development and validation of a theory-based multimedia application for educating Persian patients on hemodialysis.

    PubMed

    Feizalahzadeh, Hossein; Tafreshi, Mansoureh Zagheri; Moghaddasi, Hamid; Farahani, Mansoureh A; Khosrovshahi, Hamid Tayebi; Zareh, Zahra; Mortazavi, Fakhrsadat

    2014-05-01

    Although patients on hemodialysis require effective education for self-care, several issues associated with the process raise barriers that make learning difficult. Computer-based education can reduce these problems and improve the quality of education. This study aimed to develop and validate a theory-based multimedia application to educate Persian patients on hemodialysis. The study consisted of five phases: (1) content development, (2) prototype development 1, (3) evaluation by users, (4) evaluation by a multidisciplinary group of experts, and (5) prototype development 2. Data were collected through interviews and a literature review with open-ended questions and two survey forms using a five-level scale. Patients' self-care needs on hemodialysis and the related content were categorized into seven sections: kidney function and failure, hemodialysis, vascular access, nutrition, medication, physical activity, and living with hemodialysis. The resulting application includes seven modules consisting of user-controlled small multimedia units; while navigating the application, users are provided step-by-step information on self-care. Favorable scores were obtained from evaluations by users and experts. The researchers concluded that this application can facilitate the hemodialysis education and learning process for patients by focusing on their self-care needs and using multimedia design principles.

  19. Research on Methods of Processing Transit IC Card Information and Constructing Transit OD Matrix

    NASA Astrophysics Data System (ADS)

    Han, Xiuhua; Li, Jin; Peng, Han

    The transit OD matrix is of vital importance when planning an urban transit system. The traditional method of constructing a transit OD matrix requires a large spot-check survey, which is expensive and has a long information-processing cycle. Recently, transit IC card charging systems have been widely deployed in big cities. Processed appropriately, the transit passenger information stored in IC card databases can become an information resource, greatly reducing survey costs. This paper puts forward the concept of the transit trip chain and, based on the characteristics of closed transit trip chains, discusses how to process IC card information and construct the transit OD matrix. It also points out that an urban transit information platform and data warehouse should be constructed, and discusses how IC card information can be integrated into them.
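    A common closed-trip-chain heuristic can be sketched as follows (the stop names, the rule that a trip's destination is the next boarding stop, and the rule that the day's last trip closes the chain back to the first boarding are illustrative assumptions, not necessarily the paper's exact procedure):

```python
from collections import Counter

def od_matrix(card_boardings):
    """Build an OD matrix from IC-card boarding records.
    card_boardings maps a card ID to its ordered boarding stops for one
    day. Closed-trip-chain assumption: each trip's destination is the
    next boarding stop; the last trip returns to the first boarding."""
    od = Counter()
    for stops in card_boardings.values():
        if len(stops) < 2:
            continue  # a single boarding gives no inferable chain
        for origin, dest in zip(stops, stops[1:] + stops[:1]):
            od[(origin, dest)] += 1
    return od

# One day of boardings for three cards (stop IDs are illustrative):
records = {
    "card_A": ["home", "work"],               # commute out and back
    "card_B": ["dorm", "campus", "mall"],     # three-leg closed chain
    "card_C": ["plaza"],                      # skipped: chain not closed
}
od = od_matrix(records)
```

Aggregating such inferred origin-destination pairs over all cards and days yields the OD matrix without a manual spot-check survey, which is the cost saving the abstract describes.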

  20. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theory-based approach for achieving bootload-level trust from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers, and it also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of cloud users' applications to cloud service providers for segregating trust levels is achieved, so that time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of conventional methods and the proposed method, considering several metrics such as execution time, accuracy, error identification, and undecidability of the resources. PMID:26380365

  1. Development and Validation of a Theory Based Screening Process for Suicide Risk

    DTIC Science & Technology

    2014-09-01

    Award Number: W81XWH-11-1-0588. Title: Development and Validation of a Theory Based Screening Process for Suicide Risk. Distribution Unlimited. Abstract: The ultimate objective of this study is to assist in increasing the capacity of military-based

  2. An Information Theory-Based Approach to Assessing the Sustainability and Stability of an Island System

    EPA Science Inventory

    It is well-documented that a sustainable system is based on environmental stewardship, economic viability and social equity. What is often overlooked is the need for continuity such that desirable system behavior is maintained with mechanisms in place that facilitate the ability ...

  3. Interconnected but underprotected? Parents' methods and motivations for information seeking on digital safety issues.

    PubMed

    Davis, Vauna

    2012-12-01

    Parents need information and skills to meet the demands of mediating connected technology in their homes. Parents' methods and motivations for learning to protect children from digital risks were reported through a survey. This study explores relationships between information seeking, parents' concerns, risks children have experienced, and access to connected devices, in addition to the use of and satisfaction with various digital safety resources. Three types of information-seeking behavior were identified: (a) protective information seeking, to protect children from being confronted with harmful content; (b) problem-solving information seeking, to help children who have been negatively affected by connected technology; and (c) attentive learning, by attending to media resources passively encountered on this topic. Friends and family are the dominant source of digital safety information, followed by presentations and the Internet. Parents' top concerns for their children using connected technology were accidental exposure to pornography and sexual content in Internet-based entertainment. Higher numbers of risks experienced by children were positively associated with parents' problem-solving information seeking and level of attentive learning. Parents who were more concerned exhibited more problem-solving information seeking; but despite the high level of concern for children's safety online, 65 percent of parents seek information on this subject less than twice per year. Children have access to a mean of five connected devices at home; a higher number of devices was correlated with increased risks experienced by children but was not associated with increased parental concern or information seeking.

  4. Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.

    ERIC Educational Resources Information Center

    Riesenberg, Lou E.; Gor, Christopher Obel

    1989-01-01

    Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…

  5. Mixed Methods Research of Adult Family Care Home Residents and Informal Caregivers

    ERIC Educational Resources Information Center

    Jeanty, Guy C.; Hibel, James

    2011-01-01

    This article describes a mixed methods approach used to explore the experiences of adult family care home (AFCH) residents and informal caregivers (IC). A rationale is presented for using a mixed methods approach employing the sequential exploratory design with this poorly researched population. The unique challenges attendant to the sampling…

  6. A Method for the Analysis of Information Use in Source-Based Writing

    ERIC Educational Resources Information Center

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  7. A Qualitative Study about Performance Based Assesment Methods Used in Information Technologies Lesson

    ERIC Educational Resources Information Center

    Daghan, Gökhan; Akkoyunlu, Buket

    2014-01-01

    In this study, Information Technologies teachers' views on and use of performance-based assessment methods (PBAMs) are examined. The aim is to find out which of the PBAMs are used frequently or not used, the reasons these methods are preferred, and opinions about their applicability. The study is designed with the phenomenological design which…

  8. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    ERIC Educational Resources Information Center

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  9. Evaluation of Television as a Method of Disseminating Solar Energy Information.

    ERIC Educational Resources Information Center

    Edington, Everett D.; And Others

    This project included three separate studies undertaken to determine the effectiveness of television instruction as a method of effectively delivering information about solar energy systems to present and future workers in related industries, and as a method of delivery for adult continuing education instruction. All three studies used a series of…

  10. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and to conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into a Theory-based, structured Hip Fracture Prevention Web site group and a conventional Web sites group. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise, assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes; for calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group-by-time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention, and discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and support the use of the Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions.

  11. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is key to hospital information construction because of the complexity of the healthcare environment. Currently, during healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that utilizes business process model and notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.

  12. A Privacy-Preserved Analytical Method for eHealth Database with Minimized Information Loss

    PubMed Central

    Chen, Ya-Ling; Cheng, Bo-Chao; Chen, Hsueh-Lin; Lin, Chia-I; Liao, Guo-Tan; Hou, Bo-Yu; Hsu, Shih-Chun

    2012-01-01

    Digitizing medical information is an emerging trend that employs information and communication technology (ICT) to manage health records, diagnostic reports, and other medical data more effectively, in order to improve the overall quality of medical services. However, medical information is highly confidential and involves private information, so even legitimate access to the data raises privacy concerns. Medical records provide health information on an as-needed basis for diagnosis and treatment, and the information is also important for medical research and other health management applications. Traditional privacy risk management systems have focused on reducing re-identification risk, and they do not consider information loss. In addition, such systems cannot identify and isolate data that carry a high risk of privacy violations. This paper proposes the Hiatus Tailor (HT) system, which ensures low re-identification risk for medical records while providing more authenticated information to database users and identifying high-risk data in the database for better system management. The experimental results demonstrate that the HT system achieves much lower information loss than traditional risk management methods, with the same risk of re-identification. PMID:22969273
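    The risk/loss trade-off this record describes can be made concrete with a standard k-anonymity-style calculation. This is a generic sketch, not the HT system's actual algorithm; the record layout, the choice of quasi-identifiers, and the `generalize` callback below are illustrative assumptions.

```python
from collections import Counter

def reident_risk_and_loss(records, quasi_ids, generalize):
    """Trade-off between re-identification risk and information loss.

    Sketch under standard k-anonymity assumptions: the risk of a record is
    1 / (size of its equivalence class on the generalized quasi-identifiers),
    and information loss is the fraction of distinct quasi-identifier
    combinations collapsed by generalization.
    """
    raw_keys = [tuple(r[q] for q in quasi_ids) for r in records]
    gen_keys = [tuple(generalize(q, r[q]) for q in quasi_ids) for r in records]
    sizes = Counter(gen_keys)
    max_risk = max(1.0 / sizes[k] for k in gen_keys)       # worst-case record
    loss = 1.0 - len(set(gen_keys)) / len(set(raw_keys))   # collapsed detail
    return max_risk, loss
```

    Coarser generalization drives `max_risk` down (bigger equivalence classes) while driving `loss` up, which is exactly the tension the HT system is designed to manage.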

  13. [Review of data quality dimensions and applied methods in the evaluation of health information systems].

    PubMed

    Lima, Claudia Risso de Araujo; Schramm, Joyce Mendes de Andrade; Coeli, Claudia Medina; da Silva, Márcia Elizabeth Marinho

    2009-10-01

    In Brazil, quality monitoring of data from the various health information systems does not follow a regular evaluation plan. This paper reviews quality evaluation initiatives related to the Brazilian information systems, identifying the selected quality dimensions and the methods employed. The SciELO and LILACS databases were searched, as were the bibliographical references of the articles identified in the search. Of 375 articles initially identified, 78 remained after exclusions. The studies prioritized certain quality dimensions (reliability, validity, coverage, and completeness); the four most frequent dimensions accounted for approximately 90% of the analyses. Half of the studies were limited to data from Rio de Janeiro and São Paulo. The limited number of studies on some systems and their unequal distribution between regions of the country hinder a comprehensive quality assessment of Brazil's health information systems. The importance of accurate information highlights the need to implement a data management policy for health information systems in Brazil.

  14. A Theory-Based Exercise App to Enhance Exercise Adherence: A Pilot Study

    PubMed Central

    Voth, Elizabeth C; Oelke, Nelly D

    2016-01-01

    Background Use of mobile health (mHealth) technology is on an exponential rise. mHealth apps have the capability to reach a large number of individuals, but until now have lacked the integration of evidence-based theoretical constructs to increase exercise behavior in users. Objective The purpose of this study was to assess the effectiveness of a theory-based, self-monitoring app on exercise and self-monitoring behavior over 8 weeks. Methods A total of 56 adults (mean age 40 years, SD 13) were randomly assigned to either receive the mHealth app (experimental; n=28) or not to receive the app (control; n=28). All participants engaged in an exercise goal-setting session at baseline. Experimental condition participants received weekly short message service (SMS) text messages grounded in social cognitive theory and were encouraged to self-monitor exercise bouts on the app on a daily basis. Exercise behavior, frequency of self-monitoring exercise behavior, self-efficacy to self-monitor, and self-management of exercise behavior were collected at baseline and at postintervention. Results Engagement in exercise bouts was greater in the experimental condition (mean 7.24, SD 3.40) as compared to the control condition (mean 4.74, SD 3.70, P=.03, d=0.70) at week 8 postintervention. Frequency of self-monitoring increased significantly over the 8-week investigation between the experimental and control conditions (P<.001, partial η2=.599), with participants in the experimental condition self-monitoring significantly more at postintervention (mean 6.00, SD 0.93) in comparison to those in the control condition (mean 1.95, SD 2.58, P<.001, d=2.10). Self-efficacy to self-monitor and perceived self-management of exercise behavior were unaffected by this intervention. Conclusions The successful integration of social cognitive theory into an mHealth exercise self-monitoring app provides support for future research to feasibly integrate theoretical constructs into existing exercise apps.

  15. Measuring information flow in cellular networks by the systems biology method through microarray data.

    PubMed

    Chen, Bor-Sen; Li, Cheng-Wei

    2015-01-01

    In general, it is very difficult to measure the information flow in a cellular network directly. In this study, based on an information flow model and microarray data, we measured the information flow in cellular networks indirectly by using a systems biology method. First, we used a recursive least square parameter estimation algorithm to identify the system parameters of coupling signal transduction pathways and the cellular gene regulatory network (GRN). Then, based on the identified parameters and systems theory, we estimated the signal transductivities of the coupling signal transduction pathways from the extracellular signals to each downstream protein and the information transductivities of the GRN between transcription factors in response to environmental events. According to the proposed method, the information flow, which is characterized by signal transductivity in coupling signaling pathways and information transductivity in the GRN, can be estimated by microarray temporal data or microarray sample data. It can also be estimated by other high-throughput data such as next-generation sequencing or proteomic data. Finally, the information flows of the signal transduction pathways and the GRN in leukemia cancer cells and non-leukemia normal cells were also measured to analyze the systematic dysfunction in this cancer from microarray sample data. The results show that the signal transductivities of signal transduction pathways change substantially from normal cells to leukemia cancer cells.
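    The first identification step the authors describe, recursive least-squares parameter estimation, can be sketched generically. This is a textbook RLS update for a linear model y_t = φ_tᵀθ + noise, not the paper's actual pathway model; the forgetting factor and variable names are illustrative.

```python
import numpy as np

def rls_identify(phi_rows, y, lam=0.99, delta=1e3):
    """Recursive least-squares (RLS) estimation of theta in y_t = phi_t . theta + noise.

    `lam` is a forgetting factor that discounts old samples; `delta` sets a
    weak initial prior via a large starting covariance.
    """
    n = phi_rows.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)  # large initial covariance -> weak prior on theta
    for phi, yt in zip(phi_rows, y):
        k = P @ phi / (lam + phi @ P @ phi)      # gain vector
        theta = theta + k * (yt - phi @ theta)   # correct by prediction error
        P = (P - np.outer(k, phi @ P)) / lam     # covariance update
    return theta
```

    In the paper's setting, the regressors would be upstream protein or transcription-factor activities and the estimated coefficients would be interpreted as the coupling strengths from which signal transductivity is computed.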

  16. Measuring information flow in cellular networks by the systems biology method through microarray data

    PubMed Central

    Chen, Bor-Sen; Li, Cheng-Wei

    2015-01-01

    In general, it is very difficult to measure the information flow in a cellular network directly. In this study, based on an information flow model and microarray data, we measured the information flow in cellular networks indirectly by using a systems biology method. First, we used a recursive least square parameter estimation algorithm to identify the system parameters of coupling signal transduction pathways and the cellular gene regulatory network (GRN). Then, based on the identified parameters and systems theory, we estimated the signal transductivities of the coupling signal transduction pathways from the extracellular signals to each downstream protein and the information transductivities of the GRN between transcription factors in response to environmental events. According to the proposed method, the information flow, which is characterized by signal transductivity in coupling signaling pathways and information transductivity in the GRN, can be estimated by microarray temporal data or microarray sample data. It can also be estimated by other high-throughput data such as next-generation sequencing or proteomic data. Finally, the information flows of the signal transduction pathways and the GRN in leukemia cancer cells and non-leukemia normal cells were also measured to analyze the systematic dysfunction in this cancer from microarray sample data. The results show that the signal transductivities of signal transduction pathways change substantially from normal cells to leukemia cancer cells. PMID:26082788

  17. A method of building information extraction based on mathematical morphology and multiscale

    NASA Astrophysics Data System (ADS)

    Li, Jing-wen; Wang, Ke; Zhang, Zi-ping; Xue, Long-li; Yin, Shou-qiang; Zhou, Song

    2015-12-01

    To monitor changes in buildings on the Earth's surface, and drawing on the distribution characteristics of buildings in remote sensing imagery, this paper combines multi-scale image segmentation with the advantages of mathematical morphology to propose a segmentation method for high-resolution remote sensing images. It then uses a multiple fuzzy classification method together with a shadow-based auxiliary method to extract building information. Compared with k-means classification and the traditional maximum likelihood classification method, the experimental results show that the proposed object-based segmentation and extraction method can accurately extract building structure information and yield clearer classification data, providing a basis and theoretical support for intelligent Earth-observation monitoring.
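    A minimal version of the morphological step can be sketched with a binary opening, which removes objects smaller than the structuring element while preserving building-sized regions. This numpy-only sketch is illustrative; the paper's actual pipeline combines such operations with multi-scale segmentation, fuzzy classification, and shadow cues on real imagery.

```python
import numpy as np

def binary_opening(mask, k=3):
    """Morphological opening (erosion then dilation) with a k x k square
    structuring element: suppresses clutter smaller than k x k while
    restoring the shape of larger regions such as building footprints."""
    pad = k // 2

    def erode(m):
        p = np.pad(m, pad, constant_values=0)
        out = np.ones_like(m, dtype=bool)
        for dy in range(k):
            for dx in range(k):  # pixel survives only if its whole window is set
                out &= p[dy:dy + m.shape[0], dx:dx + m.shape[1]]
        return out

    def dilate(m):
        p = np.pad(m, pad, constant_values=0)
        out = np.zeros_like(m, dtype=bool)
        for dy in range(k):
            for dx in range(k):  # pixel is set if any neighbor in the window is set
                out |= p[dy:dy + m.shape[0], dx:dx + m.shape[1]]
        return out

    return dilate(erode(mask.astype(bool)))
```

    Applied to a thresholded building mask, the opening deletes isolated noise pixels while a 5x5 building block passes through unchanged.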

  18. Mixture theory-based poroelasticity as a model of interstitial tissue growth.

    PubMed

    Cowin, Stephen C; Cardoso, Luis

    2012-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues.

  19. Parallel implementation of multireference coupled-cluster theories based on the reference-level parallelism

    SciTech Connect

    Brabec, Jiri; Pittner, Jiri; van Dam, Hubertus JJ; Apra, Edoardo; Kowalski, Karol

    2012-02-01

    A novel algorithm for implementing a general type of multireference coupled-cluster (MRCC) theory based on the Jeziorski-Monkhorst exponential Ansatz [B. Jeziorski, H.J. Monkhorst, Phys. Rev. A 24, 1668 (1981)] is introduced. The proposed algorithm utilizes processor groups to calculate the equations for the MRCC amplitudes. In the basic formulation, each processor group constructs the equations related to a specific subset of references. By a flexible choice of processor groups and of the subset of reference-specific sufficiency conditions assigned to a given group, one can ensure optimum utilization of available computing resources. The performance of this algorithm is illustrated on the examples of the Brillouin-Wigner and Mukherjee MRCC methods with singles and doubles (BW-MRCCSD and Mk-MRCCSD). A significant improvement in scalability and a reduction of time to solution are reported with respect to a recently reported parallel implementation of the BW-MRCCSD formalism [J. Brabec, H.J.J. van Dam, K. Kowalski, J. Pittner, Chem. Phys. Lett. 514, 347 (2011)].

  20. The Effect of Theory Based Nutritional Education on Fat Intake, Weight and Blood Lipids

    PubMed Central

    Kamran, Aziz; Sharifirad, Gholamreza; Heydari, Heshmatolah; Sharifian, Elham

    2016-01-01

    Introduction Although nutrition plays a key role in the control of hypertension, it is often neglected in Iranian patients' diets. In fact, dietary behavior can be regarded as unsatisfactory among Iranian patients. This study aimed to assess the effectiveness of a theory-based educational intervention on fat intake, weight, and blood lipids among rural hypertensive patients. Methods This quasi-experimental study was conducted on 138 hypertensive patients who had been referred to Ardabil rural health centers during 2014. The nutritional education, based on DASH and the Health Promotion Model (HPM), was delivered over six sessions. Post-tests were administered two and six months after the pre-test. Data were analyzed using SPSS-18 with chi-square, independent-samples t-test, paired-samples t-test, and repeated-measures ANOVA. Results After the intervention, weight, dietary fat, LDL-C, total cholesterol, and systolic and diastolic blood pressure decreased significantly in the intervention group compared with the control group (p < 0.001). In contrast, HDL-C increased significantly in the intervention group. Conclusion The educational intervention, based on Pender's health promotion model, affected fat intake, blood lipids, and blood pressure, leading to their decrease. PMID:28163845

  1. Reducing sedentary behavior in minority girls via a theory-based, tailored classroom media intervention

    PubMed Central

    SPRUIJT-METZ, DONNA; NGUYEN-MICHEL, SELENA T.; GORAN, MICHAEL I.; CHOU, CHIH-PING; HUANG, TERRY T-K.

    2010-01-01

    Objective To develop, implement and test an innovative, theory-based classroom media intervention known as Get Moving! to increase physical activity and decrease sedentary behaviors in predominantly Latina middle school girls. Research methods and procedures School-based intervention on five to seven consecutive school days in seven schools (four intervention and three control) with high Latino populations (above 60%). Intervention schools were matched to control schools by ethnic makeup and socioeconomic status (SES). Measures conducted 3 months before and 3 months after intervention included height, weight, percentage body fat (bioimpedance analysis), physical activity and psychosocial aspects of activity by questionnaire. Subjects were middle school girls, mean age 12.5 years old, 73% Latina (N=459 girls). Results Get Moving! significantly reduced time spent on sedentary behavior (β± standard error, SE=−0.27±0.14, p<0.05) and significantly increased intrinsic motivation (β±SE=0.11±0.05, p<0.05). There was a trend for mediation effects of intrinsic motivation, but this did not reach significance. Discussion Get Moving! is a promising school-based approach that specifically targets physical activity and sedentary behavior in Latina girls, a population at high risk for obesity and related diseases. PMID:19023773

  2. Using a Marginal Structural Model to Design a Theory-Based Mass Media Campaign

    PubMed Central

    Taguri, Masataka; Ishikawa, Yoshiki

    2016-01-01

    Background The essential first step in the development of mass media health campaigns is to identify specific beliefs of the target audience. The challenge is to prioritize suitable beliefs derived from behavioral theory. The purpose of this study was to identify suitable beliefs to target in a mass media campaign to change behavior, using a new method to estimate the possible effect size of a small set of beliefs. Methods Data were drawn from the 2010 Japanese Young Female Smoker Survey (n = 500), conducted by the Japanese Ministry of Health, Labor and Welfare. Survey measures included intention to quit smoking, psychological beliefs (attitude, norms, and perceived control) based on the theory of planned behavior, and socioeconomic status (age, education, household income, and marital status). To identify suitable candidate beliefs for a mass media health campaign, we estimated the possible effect size required to change the intention to quit smoking among the population of young Japanese women, using the population attributable fraction from a marginal structural model. Results Thirteen percent of study participants intended to quit smoking. The marginal structural model estimated the population attributable fraction for each of 47 psychological beliefs (21 attitudes, 6 norms, and 19 perceived controls) after controlling for socioeconomic status. The belief, “I could quit smoking if my husband or significant other recommended it,” suggested a promising target for a mass media campaign (population attributable fraction = 0.12, 95% CI = 0.02–0.23). Messages targeting this belief could possibly improve intention rates by up to 12% among this population. The analysis also suggested the potential for regulatory action. Conclusions This study proposed a method by which campaign planners can develop theory-based mass communication strategies to change health behaviors at the population level. This method might contribute to improving the quality of future mass health
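    The core calculation, a counterfactual "what if everyone endorsed this belief" prevalence estimated with stabilized inverse-probability weights, can be sketched for a single binary confounder. This is a generic marginal-structural-model sketch, not the authors' exact estimator; the PAF convention used here (the share of achievable intention not realized in the observed population) and all variable names are illustrative assumptions.

```python
import numpy as np

def paf_msm(y, a, c):
    """Population attributable fraction from a simple marginal structural model.

    y: binary outcome (e.g. intention to quit), a: binary belief endorsement,
    c: binary confounder (e.g. socioeconomic indicator).
    Stabilized weights P(A=a) / P(A=a|C=c) remove confounding by c, giving a
    counterfactual prevalence under "everyone endorses the belief".
    """
    y, a, c = map(np.asarray, (y, a, c))
    p_a = a.mean()
    w = np.where(a == 1, p_a, 1 - p_a).astype(float)  # numerator: marginal P(A=a)
    for cv in (0, 1):
        m = c == cv
        p_a_c = a[m].mean()
        w[m] /= np.where(a[m] == 1, p_a_c, 1 - p_a_c)  # denominator: P(A=a|C=cv)
    p_do1 = np.average(y[a == 1], weights=w[a == 1])   # counterfactual prevalence
    p_obs = y.mean()
    return (p_do1 - p_obs) / p_do1
```

    Ranking beliefs by such a fraction is what lets campaign planners pick the small set of messages with the largest possible population-level effect.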

  3. A feature extraction method based on information theory for fault diagnosis of reciprocating machinery.

    PubMed

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to.
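    For context, the conventional Hilbert-transform envelope detection that the paper benchmarks against can be sketched via the FFT-based analytic signal. The variable names and the simulated defect frequency in the usage below are illustrative, not taken from the paper.

```python
import numpy as np

def envelope_spectrum(x, fs):
    """Envelope spectrum of a vibration signal via the analytic signal.

    A bearing defect amplitude-modulates the machine's resonance carrier;
    peaks in the spectrum of the envelope appear at the defect frequency.
    """
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)          # spectral mask that builds the analytic signal
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)
    env = np.abs(analytic)   # instantaneous amplitude (envelope)
    env = env - env.mean()   # drop the DC component
    spec = np.abs(np.fft.rfft(env)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, spec
```

    For an amplitude-modulated test signal, the peak of the envelope spectrum sits at the modulation (defect) frequency rather than at the carrier, which is what makes the envelope a useful baseline against which the paper's information-wave difference spectrum is compared.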

  4. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    PubMed Central

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  5. The Effect of Health Information Technology on Health Care Provider Communication: A Mixed-Method Protocol

    PubMed Central

    Adler-Milstein, Julia; Harrod, Molly; Sales, Anne; Hofer, Timothy P; Saint, Sanjay; Krein, Sarah L

    2015-01-01

    Background Communication failures between physicians and nurses are one of the most common causes of adverse events for hospitalized patients, as well as a major root cause of all sentinel events. Communication technology (ie, the electronic medical record, computerized provider order entry, email, and pagers), which is a component of health information technology (HIT), may help reduce some communication failures but increase others because of an inadequate understanding of how communication technology is used. Increasing use of health information and communication technologies is likely to affect communication between nurses and physicians. Objective The purpose of this study is to describe, in detail, how health information and communication technologies facilitate or hinder communication between nurses and physicians with the ultimate goal of identifying how we can optimize the use of these technologies to support effective communication. Effective communication is the process of developing shared understanding between communicators by establishing, testing, and maintaining relationships. Our theoretical model, based in communication and sociology theories, describes how health information and communication technologies affect communication through communication practices (ie, use of rich media; the location and availability of computers) and work relationships (ie, hierarchies and team stability). Therefore we seek to (1) identify the range of health information and communication technologies used in a national sample of medical-surgical acute care units, (2) describe communication practices and work relationships that may be influenced by health information and communication technologies in these same settings, and (3) explore how differences in health information and communication technologies, communication practices, and work relationships between physicians and nurses influence communication. Methods This 4-year study uses a sequential mixed-methods

  6. Moderators of Theory-Based Interventions to Promote Physical Activity in 77 Randomized Controlled Trials.

    PubMed

    Bernard, Paquito; Carayol, Marion; Gourlan, Mathieu; Boiché, Julie; Romain, Ahmed Jérôme; Bortolon, Catherine; Lareyre, Olivier; Ninot, Gregory

    2017-04-01

    A meta-analysis of randomized controlled trials (RCTs) recently showed that theory-based interventions designed to promote physical activity (PA) significantly increased PA behavior. The objective of the present study was to investigate the moderators of the efficacy of these theory-based interventions. Seventy-seven RCTs evaluating theory-based interventions were systematically identified. Sample, intervention, methodology, and theory implementation characteristics were extracted, coded by three duos of independent investigators, and tested as moderators of intervention effect in a multiple meta-regression model. Three moderators were negatively associated with the efficacy of theory-based interventions on PA behavior: intervention length (≥14 weeks; β = -.22, p = .004), number of experimental patients (β = -.10, p = .002), and global methodological quality score (β = -.08, p = .04). Our findings suggest that the efficacy of theory-based interventions to promote PA may be overestimated as a consequence of methodological weaknesses of the RCTs, and that interventions shorter than 14 weeks could maximize the increase in PA behavior.

  7. An overview of methods and applications to value informal care in economic evaluations of healthcare.

    PubMed

    Koopmanschap, Marc A; van Exel, Job N A; van den Berg, Bernard; Brouwer, Werner B F

    2008-01-01

    This paper compares several applied valuation methods for including informal care in economic evaluations of healthcare programmes: the proxy good method; the opportunity cost method; the contingent valuation method (CVM); conjoint measurement (CM); and valuation of health effects in terms of health-related quality of life (HR-QOL) and well-being. The comparison focuses on three questions: what outcome measures are available for including informal care in economic evaluations of healthcare programmes; whether these measures are compatible with the common types of economic evaluation; and, when applying these measures, whether all relevant aspects of informal care are incorporated. All types of economic evaluation can incorporate a monetary value of informal care (using the opportunity cost method, the proxy good method, CVM and CM) on the cost side of an analysis, but only when the relevant aspects of time costs have been valued. On the effect side of a cost-effectiveness or cost-utility analysis, the health effects (for the patient and/or caregiver) measured in natural units or QALYs can be combined with cost estimates based on the opportunity cost method or the proxy good method. One should be careful when incorporating CVM and CM in cost-minimization, cost-effectiveness and cost-utility analyses, as the health effects of patients receiving informal care and the carers themselves may also have been valued separately. One should determine whether the caregiver valuation exercise allows combination with other valuation techniques. In cost-benefit analyses, CVM and CM appear to be the best tools for the valuation of informal care. When researchers decide to use the well-being method, we recommend applying it in a cost-benefit analysis framework. This method values overall QOL (happiness); hence it is broader than just HR-QOL, which complicates inclusion in traditional health economic evaluations that normally define outcomes more narrowly. Using broader, non

  8. Development of a theory-based (PEN-3 and Health Belief Model), culturally relevant intervention on cervical cancer prevention among Latina immigrants using intervention mapping.

    PubMed

    Scarinci, Isabel C; Bandura, Lisa; Hidalgo, Bertha; Cherrington, Andrea

    2012-01-01

    The development of efficacious theory-based, culturally relevant interventions to promote cervical cancer prevention among underserved populations is crucial to the elimination of cancer disparities. The purpose of this article is to describe the development of a theory-based, culturally relevant intervention focusing on primary (sexual risk reduction) and secondary (Pap smear) prevention of cervical cancer among Latina immigrants using intervention mapping (IM). The PEN-3 and Health Belief Model provided theoretical guidance for the intervention development and implementation. IM provides a logical five-step framework in intervention development: delineating proximal program objectives, selecting theory-based intervention methods and strategies, developing a program plan, planning for adoption in implementation, and creating evaluation plans and instruments. We first conducted an extensive literature review and qualitatively examined the sociocultural factors associated with primary and secondary prevention of cervical cancer. We then proceeded to quantitatively validate the qualitative findings, which led to development matrices linking the theoretical constructs with intervention objectives and strategies as well as evaluation. IM was a helpful tool in the development of a theory-based, culturally relevant intervention addressing primary and secondary prevention among Latina immigrants.

  9. Development of a Theory-Based (PEN-3 and Health Belief Model), Culturally Relevant Intervention on Cervical Cancer Prevention Among Latina Immigrants Using Intervention Mapping

    PubMed Central

    Scarinci, Isabel C.; Bandura, Lisa; Hidalgo, Bertha; Cherrington, Andrea

    2014-01-01

    The development of efficacious theory-based, culturally relevant interventions to promote cervical cancer prevention among underserved populations is crucial to the elimination of cancer disparities. The purpose of this article is to describe the development of a theory-based, culturally relevant intervention focusing on primary (sexual risk reduction) and secondary (Pap smear) prevention of cervical cancer among Latina immigrants using intervention mapping (IM). The PEN-3 and Health Belief Model provided theoretical guidance for the intervention development and implementation. IM provides a logical five-step framework in intervention development: delineating proximal program objectives, selecting theory-based intervention methods and strategies, developing a program plan, planning for adoption in implementation, and creating evaluation plans and instruments. We first conducted an extensive literature review and qualitatively examined the socio-cultural factors associated with primary and secondary prevention of cervical cancer. We then proceeded to quantitatively validate the qualitative findings, which led to development matrices linking the theoretical constructs with intervention objectives and strategies as well as evaluation. IM was a helpful tool in the development of a theory-based, culturally relevant intervention addressing primary and secondary prevention among Latina immigrants. PMID:21422254

  10. A Comparison of Limited-Information and Full-Information Methods in M"plus" for Estimating Item Response Theory Parameters for Nonnormal Populations

    ERIC Educational Resources Information Center

    DeMars, Christine E.

    2012-01-01

    In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…

  11. Application of information-retrieval methods to the classification of physical data

    NASA Technical Reports Server (NTRS)

    Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.

    1975-01-01

    Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.

  12. Theory based design and optimization of materials for spintronics applications

    NASA Astrophysics Data System (ADS)

    Xu, Tianyi

    The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model that enables a practical way to calculate electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of particular interest. We propose a rational way to find half-metals based on the gap theorem. We then focus on the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and the transport properties of certain junctions are calculated with the help of Green's functions under the Landauer-Buttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on extended Huckel tight-binding theory (EHTB). The symmetry filtering effect is very helpful for finding materials for TMR junctions, and based on it we identify a good candidate material, MnAl, for TMR applications.

  13. Parenting Practices of Anxious and Nonanxious Mothers: A Multi-Method, Multi-Informant Approach

    ERIC Educational Resources Information Center

    Drake, Kelly L.; Ginsburg, Golda S.

    2011-01-01

    Anxious and nonanxious mothers were compared on theoretically derived parenting and family environment variables (i.e., overcontrol, warmth, criticism, anxious modeling) using multiple informants and methods. Mother-child dyads completed questionnaires about parenting and were observed during an interactional task. Findings reveal that, after…

  14. Method and Apparatus Providing Deception and/or Altered Operation in an Information System Operating System

    DOEpatents

    Cohen, Fred; Rogers, Deanna T.; Neagoe, Vicentiu

    2008-10-14

    A method and/or system and/or apparatus providing deception and/or execution alteration in an information system. In specific embodiments, deceptions and/or protections are provided by intercepting and/or modifying operation of one or more system calls of an operating system.

  15. Reduction in redundancy of multichannel telemetric information by the method of adaptive discretization with associative sorting

    NASA Technical Reports Server (NTRS)

    Kantor, A. V.; Timonin, V. G.; Azarova, Y. S.

    1974-01-01

    The method of adaptive discretization is the most promising for elimination of redundancy from telemetry messages characterized by signal shape. Adaptive discretization with associative sorting was considered as a way to avoid the shortcomings of adaptive discretization with buffer smoothing and adaptive discretization with logical switching in on-board information compression devices (OICD) in spacecraft. Mathematical investigations of OICD are presented.

  16. Genetically Informative Research on Adolescent Substance Use: Methods, Findings, and Challenges

    ERIC Educational Resources Information Center

    Lynskey, Michael T.; Agrawal, Arpana; Heath, Andrew C.

    2010-01-01

    Objective: To provide an overview of the genetic epidemiology of substance use and misuse in adolescents. Method: A selective review of genetically informative research strategies, their limitations, and key findings examining issues related to the heritability of substance use and substance use disorders in children and adolescents is presented.…

  17. A Method for Analyzing Volunteered Geographic Information to Visualize Community Valuation of Ecosystem Services

    EPA Science Inventory

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of s...

  18. Blended Lessons of Teaching Method for Information Studies in Which Students Produce a Learning Guidance Plan

    ERIC Educational Resources Information Center

    Miyaji, Isao

    2013-01-01

    Adopting exercise-making and evaluation activities, we conducted a teaching method of Information Studies which is a teaching-training course subject. We surveyed the learners' recognition rate of terms related to lessons at both the beginning and the end of lessons. Then we tested the significance of the differences between both rates. Those…

  19. An Inquiry-Based Approach to Teaching Research Methods in Information Studies

    ERIC Educational Resources Information Center

    Albright, Kendra; Petrulis, Robert; Vasconcelos, Ana; Wood, Jamie

    2012-01-01

    This paper presents the results of a project that aimed at restructuring the delivery of research methods training at the Information School at the University of Sheffield, UK, based on an Inquiry-Based Learning (IBL) approach. The purpose of this research was to implement inquiry-based learning that would allow customization of research methods…

  20. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  1. A Proposed Method of Measuring the Utility of Individual Information Retrieval Tools.

    ERIC Educational Resources Information Center

    Meadow, Charles T.

    1996-01-01

    Proposes a method of evaluating information retrieval systems by concentrating on individual tools (commands, their menus or graphic interface equivalents, or a move/stratagem). A user would assess the relative success of a small part of a search, and every tool used in that part would be credited with a contribution to the result. Cumulative…

  2. Development of the CODER System: A Testbed for Artificial Intelligence Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Fox, Edward A.

    1987-01-01

    Discusses the CODER system, which was developed to investigate the application of artificial intelligence methods to increase the effectiveness of information retrieval systems, particularly those involving heterogeneous documents. Highlights include the use of PROLOG programing, blackboard-based designs, knowledge engineering, lexicological…

  3. Preferred Methods for Delivery of Technological Information by the North Carolina Agricultural Extension Service: Opinions of Agricultural Producers Who Use Extension Information.

    ERIC Educational Resources Information Center

    Richardson, John G.; Mustian, R. David

    The findings of a questionnaire survey of 702 North Carolina agricultural producers indicated that communication methods historically used by the North Carolina Agricultural Extension Service for information dissemination are accepted by state farmers and continue to be popular. Information delivery methods most frequently preferred are…

  4. An innovative method for extracting isotopic information from low-resolution gamma spectra

    SciTech Connect

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-12-01

    A method is described for the extraction of isotopic information from attenuated gamma ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low-burnup Pu, ¹³⁷Cs, and ¹³³Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS method results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied.
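    For a binary combination, the abstract notes the fit is nonlinear in a single variable (the mixing fraction), so a one-dimensional line optimization suffices. The sketch below illustrates that idea with golden-section search on entirely invented two-isotope "basis" spectra; the spectra, channel counts, and 0.7 mixing fraction are placeholders, not data from the paper:

```python
import math

def golden_section_min(f, lo, hi, tol=1e-6):
    """Minimize a unimodal one-variable function on [lo, hi] by golden-section search."""
    inv_phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Invented per-channel "basis" spectra for two isotopes (counts per channel).
basis_a = [120.0, 300.0, 80.0, 20.0]
basis_b = [40.0, 90.0, 260.0, 150.0]

# Simulated measurement: a 70% / 30% mixture of the two basis spectra.
measured = [0.7 * a + 0.3 * b for a, b in zip(basis_a, basis_b)]

def misfit(frac):
    """Sum of squared residuals for the trial mixture frac*A + (1 - frac)*B."""
    return sum((frac * a + (1 - frac) * b - m) ** 2
               for a, b, m in zip(basis_a, basis_b, measured))

best_frac = golden_section_min(misfit, 0.0, 1.0)  # converges to the 0.7 fraction
```

    The same one-variable structure is what makes the binary case cheap: no gradient information is needed, only a bracketing interval.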

  5. Improvements in recall and food choices using a graphical method to deliver information of select nutrients.

    PubMed

    Pratt, Nathan S; Ellison, Brenna D; Benjamin, Aaron S; Nakamura, Manabu T

    2016-01-01

    Consumers have difficulty using nutrition information. We hypothesized that graphically delivering information of select nutrients relative to a target would allow individuals to process information in time-constrained settings more effectively than numerical information. Objectives of the study were to determine the efficacy of the graphical method in (1) improving memory of nutrient information and (2) improving consumer purchasing behavior in a restaurant. Values of fiber and protein per calorie were 2-dimensionally plotted alongside a target box. First, a randomized cued recall experiment was conducted (n=63). Recall accuracy of nutrition information improved by up to 43% when shown graphically instead of numerically. Second, the impact of graphical nutrition signposting on diner choices was tested in a cafeteria. Saturated fat and sodium information was also presented using color coding. Nutrient content of meals (n=362) was compared between 3 signposting phases: graphical, nutrition facts panels (NFP), or no nutrition label. Graphical signposting improved nutrient content of purchases in the intended direction, whereas NFP had no effect compared with the baseline. Calories ordered from total meals, entrées, and sides were significantly lower during graphical signposting than during the no-label and NFP periods. For total meals and entrées, protein per calorie purchased was significantly higher and saturated fat significantly lower during graphical signposting than during the other phases. Graphical signposting remained a predictor of calories and protein per calorie purchased in regression modeling. These findings demonstrate that graphically presenting nutrition information makes that information more available for decision making and influences behavior change in a realistic setting.

  6. “Please Don’t Send Us Spam!” A Participative, Theory-Based Methodology for Developing an mHealth Intervention

    PubMed Central

    2016-01-01

    Background Mobile health solutions have the potential of reducing burdens on health systems and empowering patients with important information. However, there is a lack of theory-based mHealth interventions. Objective The purpose of our study was to develop a participative, theory-based, mobile phone, audio messaging intervention attractive to recently circumcised men at voluntary medical male circumcision (VMMC) clinics in the Cape Town area in South Africa. We aimed to shift some of the tasks related to postoperative counselling on wound management and goal setting on safe sex. We place an emphasis on describing the full method of message generation to allow for replication. Methods We developed an mHealth intervention using a staggered qualitative methodology: (1) focus group discussions with 52 recently circumcised men and their partners to develop initial voice messages they felt were relevant and appropriate, (2) thematic analysis and expert consultation to select the final messages for pilot testing, and (3) cognitive interviews with 12 recent VMMC patients to judge message comprehension and rank the messages. Message content and phasing were guided by the theory of planned behavior and the health action process approach. Results Patients and their partners came up with 245 messages they thought would help men during the wound-healing period. Thematic analysis revealed 42 different themes. Expert review and cognitive interviews with more patients resulted in 42 messages with a clear division in terms of needs and expectations between the initial wound-healing recovery phase (weeks 1–3) and the adjustment phase (weeks 4–6). Discussions with patients also revealed potential barriers to voice messaging, such as lack of technical knowledge of mobile phones and concerns about the invasive nature of the intervention. Patients’ own suggested messages confirmed Ajzen’s theory of planned behavior that if a health promotion intervention can build trust and be

  7. Exploring racial/ethnic differences in substance use: a preliminary theory-based investigation with juvenile justice-involved youth

    PubMed Central

    2011-01-01

    Background Racial/ethnic differences in representation, substance use, and its correlates may be linked to differential long-term health outcomes for justice-involved youth. Determining the nature of these differences is critical to informing more efficacious health prevention and intervention efforts. In this study, we employed a theory-based approach to evaluate the nature of these potential differences. Specifically, we hypothesized that (1) racial/ethnic minority youth would be comparatively overrepresented in the juvenile justice system, (2) the rates of substance use would differ across racial/ethnic groups, and (3) individual-level risk factors would be better predictors of substance use for Caucasian youth than for youth of other racial/ethnic groups. Methods To evaluate these hypotheses, we recruited a large, diverse sample of justice-involved youth in the southwest (N = 651; M age = 15.7, SD = 1.05, range = 14-18 years; 66% male; 41% Hispanic, 24% African American, 15% Caucasian, 11% American Indian/Alaska Native). All youth were queried about their substance use behavior (alcohol, marijuana, tobacco, illicit hard drug use) and individual-level risk factors (school involvement, employment, self-esteem, level of externalizing behaviors). Results As predicted, racial/ethnic minority youth were significantly overrepresented in the juvenile justice system. Additionally, Caucasian youth reported the greatest rates of substance use and substance-related individual-level risk factors. In contrast, African American youth showed the lowest rates of substance use and individual risk factors. Contrary to predictions, a racial/ethnic group by risk factor interaction emerged for only one risk factor and one substance use category. Conclusions This research highlights the importance of more closely examining racial/ethnic differences in justice populations, as there are likely to be differing health needs, and subsequent treatment approaches, by racial/ethnic group.

  8. Identification of depth information with stereoscopic mammography using different display methods

    NASA Astrophysics Data System (ADS)

    Morikawa, Takamitsu; Kodera, Yoshie

    2013-03-01

    Stereoscopy in radiography was widely used in the late 1980s because it could capture complex structures in the human body, proving beneficial for diagnosis and screening. When radiologists observed images stereoscopically, they usually needed to train their eyes in order to perceive the stereoscopic effect. With the development of three-dimensional (3D) monitors and their use in the medical field, however, such eye training is no longer required. The question then arises as to whether there is any difference in recognizing depth information between conventional viewing methods and a 3D monitor. We constructed a phantom and evaluated the difference in the capacity to identify depth information between the two methods. The phantom consists of acrylic steps with 3-mm-diameter acrylic pillars on the top and bottom of each step. Seven observers viewed these images stereoscopically using the two display methods and were asked to judge the direction of the pillar on top. We compared the judged directions with the actual directions of the pillars arranged on top and calculated the percentage of correct answers (PCA). The results showed that the PCA obtained using the 3D monitor method was about 5% higher than that obtained using the naked-eye method. This indicates that observers can view images stereoscopically more precisely using a 3D monitor than with conventional methods such as crossed or parallel eye viewing.

  9. [Complex method of the diagnosis of the mandibular injury based of informational technologies].

    PubMed

    Korotkikh, N G; Bakhmet'ev, V I; Shalaev, O Iu; Antimenko, O O

    2004-01-01

    A special method for the complex diagnosis of mandibular injury, based on information technologies, is described. The mechanisms of injury formation were studied in 109 patients of the hospital's cranio-maxillofacial surgery department. Two main types of maxillofacial injury were revealed: the first, a fall from standing height; the second, a blow from a blunt object. The use of new algorithms based on information technologies was noted to decrease the number of inflammatory complications of jaw fractures.

  10. An organizational model to distinguish between and integrate research and evaluation activities in a theory based evaluation.

    PubMed

    Sample McMeeking, Laura B; Basile, Carole; Brian Cobb, R

    2012-11-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its well-implemented use, probably owing to the time and money required for planning and to confusion over the distinction between research and evaluation functions and roles. In this paper, we describe the development of a theory-based evaluation design in a Math and Science Partnership (MSP) research project funded by the U.S. National Science Foundation (NSF). Through this work we developed an organizational model distinguishing between and integrating evaluation and research functions, explicating personnel roles and responsibilities, and highlighting connections between research and evaluation work. Although the research and evaluation components operated with independent budgeting, staffing, and implementation activities, we were able to combine datasets across activities, allowing us to assess the integrity of the program theory, not just the hypothesized connections within it. This model has since been used for proposal development and has been invaluable, as it creates a research and evaluation plan that is seamless from the beginning.

  11. A nonparametric statistical method for image segmentation using information theory and curve evolution.

    PubMed

    Kim, Junmo; Fisher, John W; Yezzi, Anthony; Cetin, Müjdat; Willsky, Alan S

    2005-10-01

    In this paper, we present a new information-theoretic approach to image segmentation. We cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities, subject to a constraint on the total length of the region boundaries. We assume that the probability densities associated with the image pixel intensities within each region are completely unknown a priori, and we formulate the problem based on nonparametric density estimates. Due to the nonparametric structure, our method does not require the image regions to have a particular type of probability distribution and does not require the extraction and use of a particular statistic. We solve the information-theoretic optimization problem by deriving the associated gradient flows and applying curve evolution techniques. We use level-set methods to implement the resulting evolution. The experimental results based on both synthetic and real images demonstrate that the proposed technique can solve a variety of challenging image segmentation problems. Furthermore, our method, which does not require any training, performs as well as methods based on training.
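    The objective being maximized — mutual information between region labels and pixel intensities — can be estimated nonparametrically from joint counts. The toy sketch below uses a hypothetical 1-D "image" with binned intensities and two candidate labelings to show that the labeling aligned with the true edge scores higher; the full method additionally evolves the region boundary with level sets under a length constraint:

```python
import math
from collections import Counter

def mutual_information(labels, intensities):
    """Estimate I(L; X) in nats from empirical joint counts of (label, intensity-bin) pairs."""
    n = len(labels)
    joint = Counter(zip(labels, intensities))
    p_l = Counter(labels)
    p_x = Counter(intensities)
    mi = 0.0
    for (l, x), c in joint.items():
        p_lx = c / n
        mi += p_lx * math.log(p_lx / ((p_l[l] / n) * (p_x[x] / n)))
    return mi

# Toy 1-D "image": a dark region then a bright region (binned intensities).
image = [0, 0, 1, 0, 0, 1, 7, 8, 8, 7, 8, 7]

true_seg = [0] * 6 + [1] * 6   # boundary at the actual edge
shifted  = [0] * 4 + [1] * 8   # boundary misplaced by two pixels

# The correctly placed boundary yields the higher mutual information.
better = mutual_information(true_seg, image) > mutual_information(shifted, image)
```

    For the correctly placed boundary the label is a deterministic function of the intensity bin, so the estimate attains its ceiling of H(L) = ln 2 here; any misplacement mixes the bins across labels and lowers it.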

  12. Closing the digital divide in HIV/AIDS care: development of a theory-based intervention to increase Internet access.

    PubMed

    Kalichman, S C; Weinhardt, L; Benotsch, E; Cherry, C

    2002-08-01

    Advances in information technology are revolutionizing medical patient education and the Internet is becoming a major source of information for people with chronic medical conditions, including HIV/AIDS. However, many AIDS patients do not have equal access to the Internet and are therefore at an information disadvantage, particularly minorities, persons of low-income levels and individuals with limited education. This paper describes the development and pilot testing of a workshop-style intervention designed to close the digital divide in AIDS care. Grounded in the Information-Motivation-Behavioral Skills (IMB) model of health behaviour change, we developed an intervention for persons with no prior history of using the Internet. The intervention included instruction in using hardware and search engines, motivational enhancement to increase interest and perceived relevance of the Internet, and skills for critically evaluating and using health information accessed via the Internet. Participants were also introduced to communication and support functions of the Internet including e-mail, newsgroups and chat groups. Pilot testing demonstrated feasibility, acceptability and promise for closing the digital divide in HIV/AIDS care using a relatively brief and intensive theory-based intervention that could be implemented in community settings.

  13. Extracting important information from Chinese Operation Notes with natural language processing methods.

    PubMed

    Wang, Hui; Zhang, Weide; Zeng, Qiang; Li, Zuofeng; Feng, Kaiyan; Liu, Lei

    2014-04-01

    Extracting information from unstructured clinical narratives is valuable for many clinical applications. Although Natural Language Processing (NLP) methods have been studied extensively for electronic medical records (EMR), few studies have explored NLP for extracting information from Chinese clinical narratives. In this study, we report the development and evaluation of methods for extracting tumor-related information from operation notes of hepatic carcinomas written in Chinese. Using 86 operation notes manually annotated by physicians as the training set, we explored both rule-based and supervised machine-learning approaches. Evaluated on 29 unseen operation notes, our best approach yielded 69.6% precision, 58.3% recall, and a 63.5% F-score.

  14. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs
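    The core idea of side-information NLM — computing patch-similarity weights on a clean anatomical image and using them to average the noisy emission image — can be sketched in 1-D. All signals below are invented placeholders (a piecewise-constant "CT" edge and a noisy "SPECT" profile), not the paper's CT-B/M/S/H variants:

```python
import math

def nlm_1d(noisy, side, patch=1, search=5, h=0.5):
    """1-D non-local means: weights come from patch distances in `side`,
    and the weighted averaging is applied to `noisy`."""
    n = len(noisy)
    out = []
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            # Squared patch distance measured on the side image (edge clamped).
            d = 0.0
            for k in range(-patch, patch + 1):
                a = side[min(max(i + k, 0), n - 1)]
                b = side[min(max(j + k, 0), n - 1)]
                d += (a - b) ** 2
            w = math.exp(-d / (h * h))
            num += w * noisy[j]
            den += w
        out.append(num / den)
    return out

# Hypothetical piecewise-constant activity with an edge at index 8, a clean
# "CT" side signal sharing that edge, and zero-mean noise on the activity.
side  = [0.0] * 8 + [1.0] * 8
clean = [10.0] * 8 + [20.0] * 8
noise = [0.8, -0.5, 0.3, -0.9, 0.6, -0.2, 0.4, -0.7,
         0.5, -0.4, 0.9, -0.6, 0.2, -0.8, 0.7, -0.3]
noisy = [c + e for c, e in zip(clean, noise)]

filtered = nlm_1d(noisy, side)
```

    Because cross-edge patches in the side signal are dissimilar, their weights are tiny: noise is averaged away within each region while the activity edge survives, which is the behavior motivating the RC/RMSE trade-off studied in the paper.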

  15. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT.

    PubMed

    Chun, Se Young; Fessler, Jeffrey A; Dewaraja, Yuni K

    2013-09-07

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose–response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation–maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved −2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower

  16. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products is presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) select the information products which match or support the accepted processes and product of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
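    Steps 3-5 above amount to chained lookups from a quality-needs profile to criteria and then to accepted processes. A minimal sketch with hypothetical tables (the paper does not specify these mappings; every need, criterion, and process named below is a placeholder):

```python
# Hypothetical lookup tables linking quality needs -> criteria -> processes.
needs_to_criteria = {
    "high reliability": ["fault tolerance", "testability"],
    "ease of change": ["modularity"],
}
criteria_to_processes = {
    "fault tolerance": ["design reviews", "failure-mode analysis"],
    "testability": ["unit test plans"],
    "modularity": ["architecture workshops"],
}

def tailor(quality_needs):
    """Steps 3-5: translate a needs profile into criteria, then into accepted processes."""
    criteria = sorted({c for need in quality_needs
                         for c in needs_to_criteria.get(need, [])})
    processes = sorted({p for c in criteria
                          for p in criteria_to_processes.get(c, [])})
    return criteria, processes

criteria, processes = tailor(["high reliability", "ease of change"])
```

    Steps 1-2 (interviews and profile generation) and step 6 (methodology selection) would sit on either side of this mapping; the point of the sketch is only that the tailoring is a deterministic traversal once the profiles exist.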

  18. NIST method for determining model-independent structural information by X-ray reflectometry

    SciTech Connect

    Windover, D.; Cline, J. P.; Henins, A.; Gil, D. L.; Armstrong, N.; Hung, P. Y.; Song, S. C.; Jammy, R.; Diebold, A.

    2007-09-26

    This work provides a method for determining when X-ray reflectometry (XRR) data provide useful information about structural model parameters. State-of-the-art analysis approaches for XRR data emphasize fitting measured data to a single structural model using fast optimization methods, such as genetic algorithms (GA). Though such optimization may find the best solution for a given model, it does not adequately map the parameter space to provide uncertainty estimates or test structural model validity. We present here two approaches for determining which structural parameters convey accurate information about the physical reality. First, using GA refinement, we repeatedly fit the data to several structural models. By comparing the maximum-likelihood estimates of the parameters in each model, we identify model-independent information. Second, we perform a Markov chain Monte Carlo (MCMC) analysis using the most self-consistent structural model to provide uncertainty estimates for the structural parameters. This two-step approach uses fast, optimized refinement to search a range of models to locate structural information, and a more detailed MCMC sampling to estimate parameter uncertainties. Here we present an example of this approach on a ZrN/TiN/Si structure, concentrating on thickness.
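    The second step — MCMC sampling of a structural parameter's posterior to obtain uncertainty estimates — can be sketched with a Metropolis chain over a single hypothetical "thickness" parameter and an invented model curve; none of the specifics below (the decay model, prior bounds, noise level) come from the NIST method:

```python
import math
import random

random.seed(0)

# Invented one-parameter "reflectivity" model: a smooth decay in q.
def model(thickness, q):
    return math.exp(-thickness * q)

def log_likelihood(thickness, data, sigma=0.1):
    """Gaussian log-likelihood of the measured curve for a trial thickness."""
    return -sum((r - model(thickness, q)) ** 2 for q, r in data) / (2 * sigma ** 2)

true_t = 5.0
data = [(0.1 * k, model(true_t, 0.1 * k)) for k in range(1, 40)]  # synthetic, noise-free

# Metropolis sampling of the thickness posterior (flat prior on [1, 10]).
t = 4.0
ll = log_likelihood(t, data)
samples = []
for _ in range(10000):
    prop = t + random.gauss(0, 0.2)          # random-walk proposal
    if 1.0 <= prop <= 10.0:                  # reject outside the prior support
        ll_prop = log_likelihood(prop, data)
        if math.log(random.random()) < ll_prop - ll:
            t, ll = prop, ll_prop
    samples.append(t)

burn = samples[2000:]                        # discard burn-in
mean = sum(burn) / len(burn)
std = (sum((s - mean) ** 2 for s in burn) / len(burn)) ** 0.5
```

    The posterior mean and standard deviation summarize the parameter and its uncertainty, which is exactly the information a single best-fit optimum (the GA step alone) does not provide.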

  19. The method providing fault-tolerance for information and control systems of the industrial mechatronic objects

    NASA Astrophysics Data System (ADS)

    Melnik, E. V.; Klimenko, A. B.; Korobkin, V. V.

    2017-02-01

    The paper deals with providing fault tolerance in information and control systems. Nowadays, many industrial mechatronic objects operate in hazardous environments where humans cannot be present, so the design and development of fault-tolerant information and control systems has become a cornerstone for a large number of industrial mechatronic objects. In this paper, a new complex method of providing fault tolerance for reconfigurable systems is presented. It is based on the principles of performance redundancy and decentralized dispatching. The key term in the method is the ‘configuration’, so a model of the configuration-forming problem is also presented, and simulation results are given and discussed briefly.

  20. Liminality in cultural transition: applying ID-EA to advance a concept into theory-based practice.

    PubMed

    Baird, Martha B; Reed, Pamela G

    2015-01-01

    As global migration increases worldwide, nursing interventions are needed to address the effects of migration on health. The concept of liminality emerged as a pivotal concept in the situation-specific theory of well-being in refugee women experiencing cultural transition. As a relatively new concept in the discipline of nursing, liminality is explored using a method, called ID-EA, which we developed to advance a theoretical concept for application to nursing practice. Liminality in the context of cultural transition is further developed using the five steps of inquiry of the ID-EA method. The five steps are as follows: (1) inductive inquiry: qualitative research, (2) deductive inquiry: literature review, (3) synthesis of inductive and deductive inquiry, (4) evaluation inquiry, and (5) application-to-practice inquiry. The overall goal of this particular work was to develop situation-specific, theory-based interventions that facilitate cultural transitions for immigrants and refugees.

  1. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  2. A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.

    ERIC Educational Resources Information Center

    Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.

    Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…

  3. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  4. Schema Theory-Based Pre-Reading Tasks: A Neglected Essential in the ESL Reading Class.

    ERIC Educational Resources Information Center

    Ajideh, Parviz

    2003-01-01

    Describes a study in which an English-as-a-Second-Language reading instructor worked with a group of intermediate students that focused on schema theory-based pre-reading activities. Highlights the students' impressions on the strategies covered during the term. (Author/VWL)

  5. Ninter-Networked Interaction: Theory-based Cases in Teaching and Learning.

    ERIC Educational Resources Information Center

    Saarenkunnas, Maarit; Jarvela, Sanna; Hakkinen, Paivi; Kuure, Leena; Taalas, Peppi; Kunelius, Esa

    2000-01-01

    Describes the pedagogical framework of an interdisciplinary, international project entitled NINTER (Networked Interaction: Theory-Based Cases in Teaching and Learning). Discusses a pedagogical model for teacher and staff development programs in a networked environment; distributed cognition; cognitive apprenticeship; challenges for educational…

  6. Effects of a Theory-Based, Peer-Focused Drug Education Course.

    ERIC Educational Resources Information Center

    Gonzalez, Gerardo M.

    1990-01-01

    Describes innovative, theory-based, peer-focused college drug education academic course and its effect on perceived levels of risk associated with the use of alcohol, marijuana, and cocaine. Evaluation of the effects of the course indicated the significant effect on perceived risk of cocaine, but not alcohol or marijuana. (Author/ABL)

  7. Advancing the Development and Application of Theory-Based Evaluation in the Practice of Public Health.

    ERIC Educational Resources Information Center

    Cole, Galen E.

    1999-01-01

    Provides strategies for constructing theories of theory-based evaluation and provides examples in the field of public health. Techniques are designed to systematize and bring objectivity to the process of theory construction. Also introduces a framework of program theory. (SLD)

  8. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  9. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  10. Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course

    ERIC Educational Resources Information Center

    McGowan, Ian S.

    2016-01-01

    Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…

  11. Lessons Learnt from Employing van Hiele Theory Based Instruction in Senior Secondary School Geometry Classrooms

    ERIC Educational Resources Information Center

    Alex, Jogymol Kalariparambil; Mammen, Kuttickattu John

    2016-01-01

    This paper reports on a part of a study which was conducted to determine the effect of van Hiele theory based instruction in the teaching of geometry to Grade 10 learners. The sample consisted of 359 participants from five conveniently selected schools from Mthatha District in the Eastern Cape Province in South Africa. There were 195 learners in…

  12. Assessing Instructional Reform in San Diego: A Theory-Based Approach

    ERIC Educational Resources Information Center

    O'Day, Jennifer; Quick, Heather E.

    2009-01-01

    This article provides an overview of the approach, methodology, and key findings from a theory-based evaluation of the district-led instructional reform effort in San Diego City Schools, under the leadership of Alan Bersin and Anthony Alvarado, that began in 1998. Beginning with an analysis of the achievement trends in San Diego relative to other…

  13. Genetic algorithm and graph theory based matrix factorization method for online friend recommendation.

    PubMed

    Li, Qu; Yao, Min; Yang, Jianhua; Xu, Ning

    2014-01-01

    Online friend recommendation is a fast-developing topic in web mining. In this paper, we used SVD matrix factorization to model user and item feature vectors and stochastic gradient descent to update the parameters and improve accuracy. To tackle the cold-start problem and data sparsity, we used a KNN model to influence the user feature vectors. At the same time, we used graph theory to partition communities with fairly low time and space complexity. Moreover, matrix factorization can combine online and offline recommendation. Experiments showed that the hybrid recommendation algorithm is able to recommend online friends with good accuracy.
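    The SGD-based factorization step described in the abstract can be sketched in a few lines. This is a minimal, generic illustration; the function names, rank `k`, and learning-rate/regularization values are our assumptions, and the paper's KNN cold-start handling and community partitioning are not reproduced:

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02, epochs=500, seed=0):
    """Plain SGD matrix factorization: approximate R ~= P * Q^T.

    ratings: list of (user, item, value) observations.
    Returns the latent factor tables P (users) and Q (items).
    """
    rng = random.Random(seed)
    P = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)  # SGD step on the user factor
                Q[i][f] += lr * (err * pu - reg * qi)  # SGD step on the item factor
    return P, Q

def predict(P, Q, u, i):
    """Predicted affinity of user u for item i (here: a candidate friend)."""
    return sum(pf * qf for pf, qf in zip(P[u], Q[i]))
```

    On a toy rating table the learned factors reproduce the observed preference ordering, which is all a sketch of this kind can demonstrate.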

  14. Evaluation of optimization methods for nonrigid medical image registration using mutual information and B-splines.

    PubMed

    Klein, Stefan; Staring, Marius; Pluim, Josien P W

    2007-12-01

    A popular technique for nonrigid registration of medical images is based on the maximization of their mutual information, in combination with a deformation field parameterized by cubic B-splines. The coordinate mapping that relates the two images is found using an iterative optimization procedure. This work compares the performance of eight optimization methods: gradient descent (with two different step size selection algorithms), quasi-Newton, nonlinear conjugate gradient, Kiefer-Wolfowitz, simultaneous perturbation, Robbins-Monro, and evolution strategy. Special attention is paid to computation time reduction by using fewer voxels to calculate the cost function and its derivatives. The optimization methods are tested on manually deformed CT images of the heart, on follow-up CT chest scans, and on MR scans of the prostate acquired using a BFFE, T1, and T2 protocol. Registration accuracy is assessed by computing the overlap of segmented edges. Precision and convergence properties are studied by comparing deformation fields. The results show that the Robbins-Monro method is the best choice in most applications. With this approach, the computation time per iteration can be lowered approximately 500 times without affecting the rate of convergence by using a small subset of the image, randomly selected in every iteration, to compute the derivative of the mutual information. From the other methods the quasi-Newton and the nonlinear conjugate gradient method achieve a slightly higher precision, at the price of larger computation times.
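    The key speedup reported, computing the cost-function derivative from a small random voxel subset redrawn every iteration with a Robbins-Monro decaying gain, can be illustrated on a deliberately simplified problem. The sketch below estimates a global intensity offset by least squares rather than maximizing mutual information, purely to keep the example self-contained; all names and gain constants are assumptions:

```python
import random

def robbins_monro_offset(fixed, moving, iters=300, subset=8, a=1.0, A=10.0, seed=1):
    """Estimate the intensity offset c minimizing sum_i (moving[i] + c - fixed[i])^2.

    Each iteration draws a fresh small random subset of "voxels" to compute a
    stochastic derivative, and applies a Robbins-Monro decaying gain a/(k + A).
    """
    rng = random.Random(seed)
    idx = range(len(fixed))
    c = 0.0
    for k in range(iters):
        sample = rng.sample(idx, subset)   # new random subset every iteration
        grad = sum(2.0 * (moving[i] + c - fixed[i]) for i in sample) / subset
        c -= (a / (k + A)) * grad          # decaying step size
    return c
```

    Because the gain sequence decays, the iterate settles down even though each derivative is computed from only a handful of samples, which is the mechanism behind the large per-iteration cost reduction the abstract reports.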

  15. ROI-preserving 3D video compression method utilizing depth information

    NASA Astrophysics Data System (ADS)

    Ti, Chunli; Xu, Guodong; Guan, Yudong; Teng, Yidan

    2015-09-01

    Efficiently transmitting the extra information of three-dimensional (3D) video is becoming a key issue in the development of 3DTV. The 2D-plus-depth format not only occupies less bandwidth and is compatible with transmission over existing channels, but can also, to some extent, support advanced 3D video compression. This paper proposes an ROI-preserving compression scheme to further improve visual quality at a limited bit rate. Based on the connection between the focus of the human visual system (HVS) and depth information, the region of interest (ROI) can be selected automatically via depth map processing. The main improvement over common methods is that a mean-shift-based segmentation is applied to the depth map before foreground ROI selection to preserve the integrity of the scene. In addition, the sensitive areas along edges are also protected. Spatio-temporal filtering adapted to H.264 is applied to the non-ROI regions of both the 2D video and the depth map before compression. Experiments indicate that the ROI extracted by this method is better preserved and more consistent with subjective perception, and that the proposed method keeps the key high-frequency information more effectively while reducing the bit rate.

  16. Aphasic speech with and without SentenceShaper: Two methods for assessing informativeness.

    PubMed

    Fink, Ruth B; Bartlett, Megan R; Lowery, Jennifer S; Linebarger, Marcia C; Schwartz, Myrna F

    2008-01-01

    BACKGROUND: SentenceShaper® (SSR) is a computer program that is for speech what a word-processing program is for written text; it allows the user to record words and phrases, play them back, and manipulate them on-screen to build sentences and narratives. A recent study demonstrated that when listeners rated the informativeness of functional narratives produced by chronic aphasic speakers with and without the program, they gave higher informativeness ratings to the language produced with the aid of the program (Bartlett, Fink, Schwartz, & Linebarger, 2007). Bartlett et al. (2007) also compared unaided (spontaneous) narratives produced before and after the aided version of the narrative was obtained. In a subset of comparisons, the sample created afterwards was judged to be more informative; they called this "topic-specific carryover". AIMS: (1) To determine whether the differences in informativeness that Bartlett et al.'s listeners perceived are also revealed by Correct Information Unit (CIU) analysis (Nicholas & Brookshire, 1993), a well-studied, objective method for measuring informativeness; and (2) to demonstrate the usefulness of CIU analysis for samples of this type. METHODS & PROCEDURES: A modified version of the CIU analysis was applied to the speech samples obtained by Bartlett et al. (2007). They had asked five individuals with chronic aphasia to create functional narratives on two topics, under three conditions: Unaided ("U"), Aided ("SSR"), and Post-SSR Unaided ("Post-U"). Here, these samples were analysed for differences in % CIUs across conditions. Linear associations between listener judgements and CIU measures were evaluated with bivariate correlations and multiple regression analysis. OUTCOMES & RESULTS: (1) The aided effect was confirmed: samples produced with SentenceShaper had higher % CIUs, in most cases exceeding 90%.
(2) There was little CONCLUSIONS: That the percentage of CIUs was higher in SSR-aided samples than in

  17. 30 CFR 48.23 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...; information required; time for approval; method for disapproval; commencement of training; approval of... filed; information required; time for approval; method for disapproval; commencement of training... miners as a normal method of operation by the operator. The operator to be so excepted shall maintain...

  18. Optical methods for molecular sensing: Supplementing imaging of tissue microstructure with molecular information

    NASA Astrophysics Data System (ADS)

    Winkler, Amy Marie

    More and more researchers and clinicians are looking to molecular sensing to predict how cells will behave, seeking the answers to questions like "Will these tumor cells become malignant?" or "How will these cells respond to chemotherapy?" Optical methods are attractive for answering these questions because optical radiation is safer and less expensive than alternative modalities such as CT, which uses X-ray radiation; PET/SPECT, which use gamma radiation; or MRI, which is expensive and only available in a hospital setting. In this dissertation, three distinct optical methods for detection at the molecular level are explored: optical coherence tomography (OCT), laser-induced fluorescence (LIF), and optical polarimetry. OCT can simultaneously capture anatomical information and molecular information using targeted contrast agents such as gold nanoshells. LIF is less useful for capturing anatomical information, but it can achieve significantly better molecular sensitivity with the use of targeted fluorescent dyes. Optical polarimetry has the potential to detect the concentration of helical molecules, such as glucose. All of these methods are noninvasive or minimally invasive. The work is organized into four specific aims. The first is the design and implementation of a fast, high-resolution, endoscopic OCT system to facilitate minimally invasive mouse colon imaging. The second aim is to demonstrate the utility of this system for automatically identifying tumor lesions based on tissue microstructure. The third is to demonstrate the use of contrast agents to detect molecular expression using OCT and LIF. The last aim is to demonstrate a new method based on optical polarimetry for noninvasive glucose sensing.

  19. A Method to Quantify Visual Information Processing in Children Using Eye Tracking

    PubMed Central

    Kooiker, Marlou J.G.; Pel, Johan J.M.; van der Steen-Kant, Sanny P.; van der Steen, Johannes

    2016-01-01

    Visual problems that occur early in life can have major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently to enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii), construct an individual visual profile for each child. PMID:27500922
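    A minimal sketch of how reaction time and fixation duration might be derived from quadrant-labeled gaze samples of the kind described above. The data layout and function name are our assumptions, not the authors' software:

```python
def gaze_metrics(samples, target_quadrant):
    """Reaction time and fixation duration for one stimulus presentation.

    samples: chronological (time_ms, quadrant) gaze samples from the eye tracker.
    Returns (time of the first sample on the target quadrant, total time spent
    on it), or (None, 0) if the target quadrant was never fixated.
    """
    rt = None
    duration = 0
    prev_t = None
    for t, q in samples:
        if q == target_quadrant:
            if rt is None:
                rt = t                  # first look at the stimulus quadrant
            if prev_t is not None:
                duration += t - prev_t  # extend the ongoing fixation
            prev_t = t
        else:
            prev_t = None               # gaze left the target quadrant
    return rt, duration
```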

  20. Aircraft target onboard detecting technology via Circular Information Matching method for remote sensing satellite

    NASA Astrophysics Data System (ADS)

    Xiao, Huachao; Zhou, Quan; Li, Li

    2015-10-01

    Onboard image information processing is one of the important technologies for rapidly extracting intelligence on remote sensing satellites. As a typical target, onboard aircraft detection has been receiving increasing attention. In this paper, we propose an efficient aircraft detection method for onboard processing on remote sensing satellites. According to how aircraft appear in remote sensing images, the detection algorithm consists of two steps. First, Salient Object Detection (SOD) is employed to reduce the amount of computation on large remote sensing images. SOD uses Gabor filtering and a simple binary test between pixels in the filtered image; white points are connected into regions, and aircraft candidate regions are screened from the white regions by the area, length, and width of each connected region. Next, a new algorithm, called the Circumferential Information Matching method, is used to detect aircraft in the candidate regions. Tests show that the circumference curve around the plane's center is a stable shape, so candidate regions can be detected accurately with this feature. For rotation invariance, we use a circular matched filter to detect the target, and the discrete fast Fourier transform (DFFT) is used to accelerate and reduce the computation. Experiments show that the detection accuracy of the proposed algorithm is 90% with less than 0.5 s of processing time. In addition, quantitative analysis shows that the computational cost of the proposed method is very small. Experimental results and theoretical analysis show that the proposed method is reasonable and highly efficient.
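    The rotation-invariant matching idea, comparing a circumferential intensity profile against a template over all rotations, can be sketched as a normalized circular cross-correlation. A direct O(n²) loop is shown for clarity where the paper uses a DFFT to accelerate it; the names and profile format are assumptions:

```python
import math

def circular_match(profile, template):
    """Maximum normalized circular cross-correlation over all rotations.

    profile/template: intensity samples taken around a circle centred on the
    candidate region. A rotation of the target only shifts the profile
    circularly, so taking the maximum over all shifts gives rotation invariance.
    """
    n = len(profile)
    mp = sum(profile) / n
    mt = sum(template) / n
    p = [v - mp for v in profile]
    t = [v - mt for v in template]
    norm = math.sqrt(sum(v * v for v in p)) * math.sqrt(sum(v * v for v in t))
    if norm == 0:
        return 0.0
    # direct correlation; an FFT turns this O(n^2) loop into O(n log n)
    return max(sum(p[(k + s) % n] * t[k] for k in range(n)) / norm
               for s in range(n))
```

    A rotated copy of a profile scores near 1.0 while an unrelated pattern scores much lower, which is the thresholding basis such a detector would rely on.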

  1. Sex education and contraceptive methods: knowledge and sources of information among the Estonian population.

    PubMed

    Kalda, R; Sarapuu, H; Pikk, A; Lember, M

    1998-06-01

    A survey on sex education and contraceptive methods was carried out within a monthly EMOR Omnibus Survey. By using a questionnaire, the knowledge and attitudes, as well as the main sources of information on contraceptive methods and sex education, of the Estonian adult population (n = 618) were investigated. Of the respondents, 68% were female and 32% were male; the mean age was 34 years. Almost all respondents expressed the opinion that sex education should start at school and that education on contraceptive methods would reduce the number of abortions. The majority of the respondents believed that it would be more convenient to visit a family doctor than a gynecologist for family planning. The main sources of information on contraception were literature, doctors, and journals, as rated by females; and literature, partners, and television, as rated by males. The roles of the school nurse, father, and siblings were rated as comparatively small. The respondents' level of knowledge of contraceptive methods was only moderate. It is concluded that the prerequisites for changing sexual behavior and knowledge over a short time are wider use of mass media and better sex education at schools. Also, it is necessary to prepare family doctors to offer family planning services to their patients.

  2. Data Delivery Method Based on Neighbor Nodes' Information in a Mobile Ad Hoc Network

    PubMed Central

    Hayashi, Takuma; Taenaka, Yuzo; Okuda, Takeshi; Yamaguchi, Suguru

    2014-01-01

    This paper proposes a data delivery method based on neighbor nodes' information to achieve reliable communication in a mobile ad hoc network (MANET). In a MANET, it is difficult to deliver data reliably due to instabilities in network topology and wireless network condition which result from node movement. To overcome such unstable communication, opportunistic routing and network coding schemes have lately attracted considerable attention. Although an existing method that employs such schemes, MAC-independent opportunistic routing and encoding (MORE), Chachulski et al. (2007), improves the efficiency of data delivery in an unstable wireless mesh network, it does not address node movement. To efficiently deliver data in a MANET, the method proposed in this paper thus first employs the same opportunistic routing and network coding used in MORE and also uses the location information and transmission probabilities of neighbor nodes to adapt to changeable network topology and wireless network condition. The simulation experiments showed that the proposed method can achieve efficient data delivery with low network load when the movement speed is relatively slow. PMID:24672371

  3. Improved method for calculating the respiratory line length in the Concealed Information Test.

    PubMed

    Matsuda, Izumi; Ogawa, Tokihiro

    2011-08-01

    The Concealed Information Test (CIT) assesses an examinee's knowledge about a crime based on response differences between crime-relevant and crime-irrelevant items. One effective measure in the CIT is the respiration line length, which is the average of the moving distances of the respiration curve in a specified time interval after the item onset. However, the moving distance differs between parts of a respiratory cycle. As a result, the calculated respiration line length is biased by how the parts of the respiratory cycles are included in the time interval. To resolve this problem, we propose a weighted average method, which calculates the respiration line length per cycle and weights it with the proportion that the cycle occupies in the time interval. Simulation results indicated that the weighted average method removes the bias of respiration line lengths compared to the original method. The results of experimental CIT data demonstrated that the weighted average method significantly increased the discrimination performance as compared with the original method. The weighted average method is a promising method for assessing respiration changes in response to question items more accurately, which improves the respiration-based discrimination performance of the CIT.
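    The proposed weighted average can be sketched directly from its description: compute the line length per respiratory cycle, then weight each cycle by the proportion of the analysis window it occupies. The per-cycle data layout and the exact weighting normalization are our reading of the abstract, not the authors' code:

```python
def weighted_line_length(cycles, window_start, window_end):
    """Weighted-average respiration line length over an analysis window.

    cycles: (start_t, end_t, line_length) per respiratory cycle. Each cycle's
    line length is weighted by the proportion of the window that the cycle
    occupies, so partially included cycles no longer bias the average.
    """
    window = window_end - window_start
    acc = total_w = 0.0
    for s, e, length in cycles:
        overlap = max(0.0, min(e, window_end) - max(s, window_start))
        if overlap > 0:
            w = overlap / window   # proportion of the window this cycle covers
            acc += w * length
            total_w += w
    return acc / total_w if total_w else 0.0
```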

  4. Support of Wheelchairs Using Pheromone Information with Two Types of Communication Methods

    NASA Astrophysics Data System (ADS)

    Yamamoto, Koji; Nitta, Katsumi

    In this paper, we propose a communication framework that combines two types of communication among wheelchairs and mobile devices. Owing to their restricted range of activity, wheelchair users tend to shut themselves up in their houses. We developed a navigational wheelchair that carries a system displaying information on a map through the WWW; however, this wheelchair is expensive because it needs a solid PC, a precise GPS, a battery, and so on. We therefore introduce mobile devices and use this framework to provide information to wheelchair users and to encourage them to go out. When a user encounters other users, they exchange the messages they hold via short-distance wireless communication. Once a message is delivered to a navigational wheelchair, the wheelchair uploads the message to the system. We use two types of pheromone information, representing trends of users' movement and the existence of a crowd of users. First, when users gather, a 'crowd of people' pheromone is emitted virtually; users do not deposit these pheromones in the environment but carry them. If the density exceeds a threshold, messages stating that 'people gathered' are generated automatically. The other pheromone, the 'movement trend' pheromone, is used to improve the probability of successful transmissions. From the results of experiments, we conclude that our method can deliver the information that wheelchair users gathered to other wheelchairs.
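    The 'crowd of people' pheromone logic, accumulation on encounters, decay over time, and automatic message generation above a density threshold, might be sketched as follows. All names, the decay/emission constants, and the update order are assumptions:

```python
def step(levels, encounters, decay=0.5, emit=1.0, threshold=2.0):
    """One update of the 'crowd of people' pheromone carried by each device.

    levels: current pheromone level per device. encounters: groups of devices
    that met during this step. Returns the new levels and the devices whose
    density exceeds the threshold (for which 'people gathered' messages would
    be generated automatically).
    """
    levels = {d: v * decay for d, v in levels.items()}   # pheromone evaporates
    for group in encounters:
        for d in group:
            # each participant picks up pheromone from the others it met
            levels[d] = levels.get(d, 0.0) + emit * (len(group) - 1)
    alerts = [d for d, v in levels.items() if v > threshold]
    return levels, alerts
```

    With these constants a single meeting of three devices is not enough to trigger a message, but a sustained gathering is, which matches the intent of thresholding on density rather than on single encounters.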

  5. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    PubMed Central

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computing of building construction projects. PMID:23533352
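    The information-entropy part of the index weighting can be illustrated with the standard entropy weight method, in which indexes that discriminate more between schemes receive larger weights. This generic sketch does not reproduce the paper's reliability-theory integration; all names are assumptions:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: matrix[i][j] is the (positive) score of scheme i
    on index j. Indexes whose scores vary more across schemes carry more
    information, so they receive larger weights; the weights sum to 1."""
    m, n = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    diversification = []
    for j in range(n):
        e = 0.0
        for row in matrix:
            p = row[j] / col_sums[j]
            if p > 0:
                e -= p * math.log(p)
        diversification.append(1.0 - e / math.log(m))  # 1 - normalized entropy
    s = sum(diversification)
    return [d / s for d in diversification]

def synthesis_score(row, weights):
    """Weighted synthesis score of one scheme, used to sort the schemes."""
    return sum(v * w for v, w in zip(row, weights))
```

    An index on which every scheme scores identically carries no information and gets zero weight, which is the property that makes entropy weighting attractive for multi-index scheme comparison.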

  6. A robust medical image segmentation method using KL distance and local neighborhood information.

    PubMed

    Zheng, Qian; Lu, Zhentai; Yang, Wei; Zhang, Minghui; Feng, Qianjin; Chen, Wufan

    2013-06-01

    In this paper, we propose an improved Chan-Vese (CV) model that uses Kullback-Leibler (KL) distances and local neighborhood information (LNI). Due to the effects of heterogeneity and complex constructions, the performance of level set segmentation is subject to confounding by the presence of nearby structures of similar intensity, preventing it from discerning the exact boundary of the object. Moreover, the CV model cannot usually obtain accurate results in medical image segmentation in cases of optimal configuration of controlling parameters, which requires substantial manual intervention. To overcome the above deficiency, we improve the segmentation accuracy by the usage of KL distance and LNI, thereby introducing the image local characteristics. Performance evaluation of the present method was achieved through experiments on the synthetic images and a series of real medical images. The extensive experimental results showed the superior performance of the proposed method over the state-of-the-art methods, in terms of both robustness and efficiency.
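    The KL distance ingredient can be illustrated as a symmetrized Kullback-Leibler divergence between normalized intensity histograms, e.g. of the regions inside and outside the evolving contour. This is a minimal sketch under that assumption; the paper's exact formulation and its local neighborhood information term are not reproduced:

```python
import math

def kl_distance(p, q, eps=1e-12):
    """Symmetrized Kullback-Leibler distance between two normalized intensity
    histograms. eps guards against empty histogram bins."""
    def kl(a, b):
        return sum(ai * math.log((ai + eps) / (bi + eps)) for ai, bi in zip(a, b))
    return kl(p, q) + kl(q, p)
```

    Unlike a plain mean-intensity comparison, this distance grows with the dissimilarity of the full distributions, which helps separate nearby structures of similar average intensity.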

  7. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  8. Bilateral Teleoperation Method Using an Autonomous Control Based on Information on Contact Environment

    NASA Astrophysics Data System (ADS)

    Taguchi, Keiichi; Ohnishi, Kouhei

    In procedures that involve remote control, such as remote surgery, it is necessary to operate a robot in a remote location in a sensitive environment; the treatment of internal organs is an example of such a procedure. In this paper, we propose a method for autonomous hazard avoidance control that is based on information on the contact environment. The proposed method involves the use of bilateral control. During safe operations, systems are controlled by bilateral control. During dangerous operations, a slave system is controlled autonomously so as to avoid dangerous operations. In order to determine the degree of operation risk, fuzzy set theory is applied to the force exerted on the environment. Further, variable compliance control based on the force exerted on the environment is utilized to avoid the risk. The effectiveness of the proposed method is confirmed by experimental results.

  9. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We consider the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four aspects are analyzed: the social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for the co-design.

  10. Method for Detecting Core Malware Sites Related to Biomedical Information Systems

    PubMed Central

    Kim, Dohoon; Choi, Donghee; Jin, Jonghyun

    2015-01-01

    Most advanced persistent threat attacks target web users through malicious code within landing (exploit) or distribution sites. There is an urgent need to block the affected websites. Attacks on biomedical information systems are no exception to this issue. In this paper, we present a method for locating malicious websites that attempt to attack biomedical information systems. Our approach uses malicious code crawling to rearrange websites in the order of their risk index by analyzing the centrality between malware sites and proactively eliminates the root of these sites by finding the core-hub node, thereby reducing unnecessary security policies. In particular, we dynamically estimate the risk index of the affected websites by analyzing various centrality measures and converting them into a single quantified vector. On average, the proactive elimination of core malicious websites results in an average improvement in zero-day attack detection of more than 20%. PMID:25821511
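    The core idea, scoring sites by combining several centrality measures into a single quantified risk index and then eliminating the core-hub node, can be sketched on a toy link graph. Degree and closeness centrality with equal weights are our stand-ins for the paper's actual measures and weighting:

```python
from collections import deque

def closeness(adj, u):
    """Closeness centrality of node u via breadth-first shortest paths."""
    dist, queue = {u: 0}, deque([u])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    total = sum(dist.values())
    return (len(adj) - 1) / total if total else 0.0

def risk_index(adj, w_deg=0.5, w_close=0.5):
    """Combine two centrality measures into a single quantified risk score."""
    n = len(adj) - 1
    return {u: w_deg * (len(adj[u]) / n) + w_close * closeness(adj, u)
            for u in adj}

def core_hub(adj):
    """The highest-risk node: the core-hub whose elimination is most effective."""
    risk = risk_index(adj)
    return max(risk, key=risk.get)
```

    On a star-shaped link graph the hub dominates both measures, so it is correctly ranked as the node to eliminate first.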

  11. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
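
The local-then-global term-statistics flow described in this patent abstract can be illustrated with a small sketch. The use of a thread pool, `Counter` objects, and the toy documents are assumptions for illustration, not the patented implementation.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def local_stats(doc_set):
    """Term statistics for one distinct set of documents,
    computed independently of the other sets."""
    counts = Counter()
    for doc in doc_set:
        counts.update(doc.lower().split())
    return counts

def global_stats(doc_sets):
    """Process each distinct document set in parallel, then contribute
    the local sets of term statistics to one global set."""
    with ThreadPoolExecutor() as pool:
        local_sets = list(pool.map(local_stats, doc_sets))
    total = Counter()
    for counts in local_sets:
        total.update(counts)
    return total

doc_sets = [["big data analysis", "data mining"], ["data systems"]]
stats = global_stats(doc_sets)   # stats["data"] == 3
```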

  12. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    NASA Astrophysics Data System (ADS)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data on the financial effects and burden of public works, through the websites of national and local governments, has made it possible to discuss macroscopic financial trends. However, it is still difficult to grasp, nationwide, how each location has been changed by public works. The purpose of this research is to collect, at reasonable cost, the road update information provided by various road managers, in order to enable efficient updating of maps such as car navigation maps. In particular, we develop a system that automatically extracts the relevant public works from the public work order outlooks released by each local government and registers summaries, including position information, in a database, combining several web mining technologies. Finally, we collect and register several tens of thousands of records from websites all over Japan and confirm the feasibility of our method.

  13. Laser pulse design using optimal control theory-based adaptive simulated annealing technique: vibrational transitions and photo-dissociation

    NASA Astrophysics Data System (ADS)

    Nath, Bikram; Mondal, Chandan Kumar

    2014-08-01

    We have designed and optimised a combined laser pulse using an optimal control theory-based adaptive simulated annealing technique for selective vibrational excitation and photo-dissociation. Since a proper choice of pulses for specific excitation and dissociation phenomena is very difficult, we have designed a linearly combined pulse for such processes and optimised the different parameters involved so as to obtain an efficient combined pulse. The technique frees us from choosing any arbitrary type of pulse and provides a basis for checking pulse suitability. We have also emphasised how the performance of the simulated annealing technique can be improved by introducing an adaptive step length for the different variables during the optimisation process. Finally, we have shown how the initial temperature for the optimisation can be chosen by introducing a heating/cooling step that reduces the number of annealing steps, making the method cost-effective.
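
A minimal illustration of simulated annealing with an adaptive step length, the idea this abstract emphasises. The acceptance rule, the multiplicative step-adaptation factors, and the one-dimensional toy objective are assumptions for the sketch, not the authors' scheme for pulse parameters.

```python
import math
import random

def adaptive_sa(f, x0, t0=1.0, cooling=0.95, steps=2000, seed=1):
    """Minimise f by simulated annealing with an adaptive step length:
    grow the step on acceptance (explore), shrink it on rejection
    (refine). Illustrative adaptation rule only."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    t, step = t0, 1.0
    for _ in range(steps):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        # Metropolis criterion: always accept improvements, sometimes
        # accept uphill moves while the temperature is still high.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            step *= 1.1   # accepted: widen the search
        else:
            step *= 0.9   # rejected: narrow the search
        t *= cooling
    return x, fx

# Toy objective with minimum at x = 3.
xmin, fmin = adaptive_sa(lambda x: (x - 3.0) ** 2, x0=-5.0)
```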

  14. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods for this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Study of the results obtained in solving BPs of different levels of complexity has allowed us to reveal strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method for solving BP in many cases. The efficacy of the LANNIA method is also shown when it is applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters-21578), typically used for label categorization.
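
The bars problem used as the benchmark above can be generated in a few lines: each horizontal and vertical bar is a hidden Boolean factor, and a pixel is the Boolean OR of the bars covering it. The image size, bar probability, and list-of-lists representation below are illustrative choices, not the paper's exact setup.

```python
import random

def bars_images(n_images, size=8, p=0.125, seed=0):
    """Generate bars-problem images: each of the 2*size horizontal and
    vertical bars is switched on independently with probability p, and a
    pixel is True iff some bar through it is on (Boolean OR mixing)."""
    rng = random.Random(seed)
    images, factors = [], []
    for _ in range(n_images):
        rows = [rng.random() < p for _ in range(size)]  # horizontal bars
        cols = [rng.random() < p for _ in range(size)]  # vertical bars
        images.append([[rows[r] or cols[c] for c in range(size)]
                       for r in range(size)])
        factors.append((rows, cols))
    return images, factors

imgs, facs = bars_images(3, size=4, p=0.3, seed=1)
```

A BFA method is then judged by how well it recovers the `rows`/`cols` factors from `imgs` alone, e.g. via information gain.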

  15. A Novel Method of Multi-Information Acquisition for Electromagnetic Flow Meters

    PubMed Central

    Cui, Wenhua; Li, Bin; Chen, Jie; Li, Xinwei

    2015-01-01

    In this paper, a novel method is proposed for multi-information acquisition from the electromagnetic flow meter, using magnetic excitation to measure the fluid velocity and electrochemical impedance spectroscopy (EIS) for both the fluid quality and the contamination level of the transducer. The impedance spectra of the transducer are measured with an additional electrical stimulus in series with the electrode measurement loop. The series connection mode, instead of the parallel one, improves the signal-to-noise ratio (SNR) of the fluid velocity measurement and offers a wide range of impedance measurements by using a sample capacitance. In addition, a multi-frequency synchronous excitation source is synthesized based on the method of dual-base power sequences for fast EIS measurement. The conductivity measurements in the range of 1.7 μS/cm–2 mS/cm showed a relatively high accuracy, with a measurement error of 5%, and electrode adhesion detection on electrodes with and without a coating showed that the electrode adhesion state can be determined qualitatively, which validated the feasibility of the multi-information acquisition method for the electromagnetic flow meter (EMFM). PMID:26712762

  16. Geographic Information System Software to Remodel Population Data Using Dasymetric Mapping Methods

    USGS Publications Warehouse

    Sleeter, Rachel; Gould, Michael

    2007-01-01

    The U.S. Census Bureau provides decadal demographic data collected at the household level and aggregated to larger enumeration units for anonymity purposes. Although this system is appropriate for the dissemination of large amounts of national demographic data, the boundaries of the enumeration units often do not reflect the distribution of the underlying statistical phenomena. Conventional mapping methods, such as choropleth mapping, are primarily employed due to their ease of use. However, the analytical drawbacks of choropleth methods are well known, ranging from (1) the artificial transition of population at the boundaries of mapping units to (2) the assumption that the phenomenon is evenly distributed across the enumeration unit (when in actuality there can be significant variation). Many methods to map population distribution have been practiced in the geographic information systems (GIS) and remote sensing fields. Many cartographers prefer dasymetric mapping for population because of its ability to more accurately distribute data over geographic space. Similar to choropleth maps, a dasymetric map utilizes standardized data (for example, census data). However, rather than using arbitrary enumeration zones to symbolize population distribution, a dasymetric approach introduces ancillary information to redistribute the standardized data into zones relative to land use and land cover (LULC), taking into consideration the actual changing densities within the boundaries of the enumeration unit. Thus, new zones are created that correlate to the function of the map, capturing spatial variations in population density. The transfer of data from census enumeration units to ancillary-driven homogeneous zones is performed by a process called areal interpolation.
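
The areal interpolation step can be sketched as a proportional redistribution of one enumeration unit's population across its LULC zones. The zone areas and relative density weights below are invented for illustration; real dasymetric work derives them from ancillary land-cover data.

```python
def dasymetric_redistribute(unit_pop, zones):
    """Redistribute one enumeration unit's population to its LULC zones.
    zones: list of (area, relative_density_weight) pairs; each zone gets a
    share of the population proportional to area * weight."""
    shares = [area * w for area, w in zones]
    total = sum(shares)
    return [unit_pop * s / total for s in shares]

# Hypothetical tract of 10,000 people split into urban (2 km^2, weight 0.7),
# agricultural (5 km^2, weight 0.25), and water (3 km^2, weight 0.0) zones.
pops = dasymetric_redistribute(10_000, [(2, 0.7), (5, 0.25), (3, 0.0)])
```

Note the key dasymetric properties: the total population is preserved, and uninhabitable zones (water) receive zero population instead of a uniform share.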

  17. A Novel Method of Multi-Information Acquisition for Electromagnetic Flow Meters.

    PubMed

    Cui, Wenhua; Li, Bin; Chen, Jie; Li, Xinwei

    2015-12-26

    In this paper, a novel method is proposed for multi-information acquisition from the electromagnetic flow meter, using magnetic excitation to measure the fluid velocity and electrochemical impedance spectroscopy (EIS) for both the fluid quality and the contamination level of the transducer. The impedance spectra of the transducer are measured with an additional electrical stimulus in series with the electrode measurement loop. The series connection mode, instead of the parallel one, improves the signal-to-noise ratio (SNR) of the fluid velocity measurement and offers a wide range of impedance measurements by using a sample capacitance. In addition, a multi-frequency synchronous excitation source is synthesized based on the method of dual-base power sequences for fast EIS measurement. The conductivity measurements in the range of 1.7 μS/cm–2 mS/cm showed a relatively high accuracy, with a measurement error of 5%, and electrode adhesion detection on electrodes with and without a coating showed that the electrode adhesion state can be determined qualitatively, which validated the feasibility of the multi-information acquisition method for the electromagnetic flow meter (EMFM).

  18. Development of a theory-based sexual and reproductive health promotion and HIV prevention program for Chinese early adolescents.

    PubMed

    Hong, Jingfang; Fongkaew, Warunee; Senaratana, Wilawan; Tonmukayakul, Ouyporn

    2010-09-01

    The purpose of this study was to develop a theory-based program for Chinese early adolescents in order to promote their sexual and reproductive health and to prevent HIV infection. The program was designed based on the Information-Motivation-Behavioral skills model and a needs assessment among the stakeholders. A technical collaborative action research approach was applied. The study's participants were 102 early adolescents in a public middle school in mainland China, with the involvement of other key stakeholders, including 15 teachers and 12 parents. The results revealed a statistically significant improvement in the scores of sexual and reproductive health promotion and HIV prevention information, motivation, and behavioral skills after the program's implementation. Meanwhile, qualitative data from the early adolescents' reflection indicated that the content was useful and comprehensive, the trainers were friendly and knowledgeable, and participatory learning with an "edutainment" style was especially impressive. Additionally, the early adolescents expressed that they could apply the knowledge and skills in their daily life, which would benefit themselves and their family and peers. The Information-Motivation-Behavioral skills model could be explored in a non-Western context and the program was shown to be acceptable for use in a Chinese middle school setting.

  19. Inventing the future of physicians and information technology: methods and results of the 1997 Lafayette Parish Medical Society Information Systems and Information Technology Project.

    PubMed

    Caillouet, L P; Lipstate, J; Carroll, D J

    1999-06-01

    This paper challenges physicians to consider how to best invent a future in which they can personally leverage emerging information and communication technologies to maximize their effectiveness and efficiency as care givers. One Louisiana State Medical Society component medical society has already posed this challenge to its members. The paper describes the 1997 Lafayette Parish Medical Society Physicians' Information Systems and Information Technology Project, conducted on behalf of the society by faculty of the Healthcare Administration MBA Program at the University of Southwestern Louisiana. Specific recommendations for the application of health care information technologies by physicians and by health care institutions, based on the findings and conclusions of the project, are highlighted.

  20. Comparison of Information Dissemination Methods in Inle Lake: A Lesson for Reconsidering Framework for Environmental Education Strategies

    ERIC Educational Resources Information Center

    Oo, Htun Naing; Sutheerawatthana, Pitch; Minato, Takayuki

    2010-01-01

    This article analyzes the practice of information dissemination regarding pesticide usage in floating gardening in a rural area. The analysis reveals reasons why the current information dissemination methods employed by relevant stakeholders do not work. It then puts forward a proposition that information sharing within organizations of and among…

  1. Benchmarking Clinical Speech Recognition and Information Extraction: New Data, Methods, and Evaluations

    PubMed Central

    Zhou, Liyuan; Hanlen, Leif; Ferraro, Gabriela

    2015-01-01

    Background Over a tenth of preventable adverse events in health care are caused by failures in information flow. These failures are tangible in clinical handover; regardless of good verbal handover, from two-thirds to all of this information is lost after 3-5 shifts if notes are taken by hand, or not at all. Speech recognition and information extraction provide a way to fill out a handover form for clinical proofing and sign-off. Objective The objective of the study was to provide a recorded spoken handover, annotated verbatim transcriptions, and evaluations to support research in spoken and written natural language processing for filling out a clinical handover form. This dataset is based on synthetic patient profiles, thereby avoiding ethical and legal restrictions, while maintaining efficacy for research in speech-to-text conversion and information extraction, based on realistic clinical scenarios. We also introduce a Web app to demonstrate the system design and workflow. Methods We experiment with Dragon Medical 11.0 for speech recognition and CRF++ for information extraction. To compute features for information extraction, we also apply CoreNLP, MetaMap, and Ontoserver. Our evaluation uses cross-validation techniques to measure processing correctness. Results The data provided were a simulation of nursing handover, as recorded using a mobile device, built from simulated patient records and handover scripts, spoken by an Australian registered nurse. Speech recognition recognized 5276 of 7277 words in our 100 test documents correctly. We considered 50 mutually exclusive categories in information extraction and achieved the F1 (ie, the harmonic mean of Precision and Recall) of 0.86 in the category for irrelevant text and the macro-averaged F1 of 0.70 over the remaining 35 nonempty categories of the form in our 101 test documents. Conclusions The significance of this study hinges on opening our data, together with the related performance benchmarks and some

  2. The method of earthquake landslide information extraction with high-resolution remote sensing

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Chen, Peng; Liu, Yaolin; Wang, Jing

    2014-05-01

    As a kind of secondary geological disaster caused by strong earthquakes, the earthquake-induced landslide has drawn much attention worldwide due to the severe hazard it poses. High-resolution remote sensing, as a new technology for investigation and monitoring, has been widely applied in landslide susceptibility and hazard mapping. The Ms 8.0 Wenchuan earthquake, which occurred on 12 May 2008, caused many buildings to collapse and left half a million people injured. Meanwhile, damage caused by earthquake-induced landslides, collapses, and debris flows accounted for a major part of the total losses. By analyzing the properties of the Zipingpu landslide triggered by the Wenchuan earthquake, the present study proposes a quick and effective way to extract landslides based on NDVI and slope information, and the results were validated with pixel-oriented and object-oriented methods. The main advantage of the idea is that it does not require much professional knowledge or data, such as crustal movement, geological structure, or fracture zones, so researchers can provide landslide monitoring information for earthquake relief as soon as possible. In the pixel-oriented approach, the NDVI-differential image as well as the slope image was analyzed and segmented to extract landslide information. In the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. The spectral, textural, shape, location, and contextual information of individual object classes, together with Grey Level Co-occurrence Matrix (GLCM) homogeneity, shape index, etc., were extracted and used to establish the fuzzy decision rule system of each layer for earthquake landslide extraction. Comparison of the results generated by the two methods showed that the object-oriented method successfully avoided the NDVI-differential bright noise caused by the spectral diversity of high-resolution remote sensing data and achieved a better result with an overall
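
The pixel-oriented step (thresholding the NDVI difference jointly with slope) can be sketched as follows. The threshold values and the tiny toy grids are assumptions for illustration, not the study's calibrated parameters.

```python
def landslide_mask(ndvi_pre, ndvi_post, slope_deg,
                   dndvi_thresh=-0.2, slope_thresh=15.0):
    """Flag a pixel as a candidate landslide where a vegetation loss
    (post - pre NDVI drop) coincides with steep terrain. Thresholds are
    illustrative assumptions, not calibrated values."""
    rows, cols = len(ndvi_pre), len(ndvi_pre[0])
    return [[(ndvi_post[r][c] - ndvi_pre[r][c]) <= dndvi_thresh
             and slope_deg[r][c] >= slope_thresh
             for c in range(cols)]
            for r in range(rows)]

# 2x2 toy scene: only the top-left pixel loses vegetation on a steep slope.
pre = [[0.6, 0.6], [0.6, 0.2]]
post = [[0.1, 0.55], [0.1, 0.15]]
slope = [[30.0, 30.0], [5.0, 30.0]]
mask = landslide_mask(pre, post, slope)
```

The joint condition is what suppresses false positives: bare steep rock (no NDVI drop) and flat cleared fields (no slope) are both excluded.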

  3. A Method for Developing 3D User Interfaces of Information Systems

    NASA Astrophysics Data System (ADS)

    Calleros, Juan Manuel González; Vanderdonckt, Jean; Arteaga, Jaime Muñoz

    A transformational method for developing three-dimensional user interfaces of interactive information systems is presented that starts from a task model and a domain model to progressively derive a final user interface. This method consists of three steps: deriving one or many abstract user interfaces from a task model and a domain model, deriving one or many concrete user interfaces from each abstract interface, and producing the code of the final user interfaces corresponding to each concrete interface. To carry out the first two steps, transformations are encoded as graph transformations performed on the involved models expressed in their graph equivalents. In addition, a graph grammar gathers the relevant graph transformations for accomplishing the sub-steps involved in each step. Once a concrete user interface results from these first two steps, it is converted into a development environment for 3D user interfaces, where it can be edited for fine tuning and personalization. From this environment, the user interface code is automatically generated. The method is defined by its steps and input/output, and it is exemplified on a case study. By expressing the steps of the method through transformations between models, the method adheres to the Model-Driven Engineering paradigm, where models and transformations are explicitly defined and used.

  4. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks.

    PubMed

    Salim, Shelly; Moh, Sangman

    2016-06-30

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead.

  5. An Energy-Efficient Game-Theory-Based Spectrum Decision Scheme for Cognitive Radio Sensor Networks

    PubMed Central

    Salim, Shelly; Moh, Sangman

    2016-01-01

    A cognitive radio sensor network (CRSN) is a wireless sensor network in which sensor nodes are equipped with cognitive radio. In this paper, we propose an energy-efficient game-theory-based spectrum decision (EGSD) scheme for CRSNs to prolong the network lifetime. Note that energy efficiency is the most important design consideration in CRSNs because it determines the network lifetime. The central part of the EGSD scheme consists of two spectrum selection algorithms: random selection and game-theory-based selection. The EGSD scheme also includes a clustering algorithm, spectrum characterization with a Markov chain, and cluster member coordination. Our performance study shows that EGSD outperforms the existing popular framework in terms of network lifetime and coordination overhead. PMID:27376290

  6. Split operator method for fluorescence diffuse optical tomography using anisotropic diffusion regularisation with prior anatomical information

    PubMed Central

    Correia, Teresa; Aguirre, Juan; Sisniega, Alejandro; Chamorro-Servent, Judit; Abascal, Juan; Vaquero, Juan J.; Desco, Manuel; Kolehmainen, Ville; Arridge, Simon

    2011-01-01

    Fluorescence diffuse optical tomography (fDOT) is an imaging modality that provides images of the fluorochrome distribution within the object of study. The image reconstruction problem is ill-posed and highly underdetermined and, therefore, regularisation techniques need to be used. In this paper we use a nonlinear anisotropic diffusion regularisation term that incorporates anatomical prior information. We introduce a split operator method that reduces the nonlinear inverse problem to two simpler problems, allowing fast and efficient solution of the fDOT problem. We tested our method using simulated, phantom and ex-vivo mouse data, and found that it provides reconstructions with better spatial localisation and size of fluorochrome inclusions than using the standard Tikhonov penalty term. PMID:22091447

  7. A method to unmix multiple fluorophores in microscopy images with minimal a priori information.

    PubMed

    Schlachter, S; Schwedler, S; Esposito, A; Kaminski Schierle, G S; Moggridge, G D; Kaminski, C F

    2009-12-07

    The ability to quantify the fluorescence signals from multiply labelled biological samples is highly desirable in the life sciences but often difficult, because of spectral overlap between fluorescent species and the presence of autofluorescence. Several so-called unmixing algorithms have been developed to address this problem. Here, we present a novel algorithm that combines measurements of lifetime and spectrum to achieve unmixing without a priori information on the spectral properties of the fluorophore labels. The only assumption made is that the lifetimes of the fluorophores differ. Our method combines global analysis for a measurement of lifetime distributions with singular value decomposition to recover individual fluorescence spectra. We demonstrate the technique on simulated datasets and subsequently by an experiment on a biological sample. The method is computationally efficient and straightforward to implement. Applications range from histopathology of complex and multiply labelled samples to functional imaging in live cells.

  8. Hybrid modelling framework by using mathematics-based and information-based methods

    NASA Astrophysics Data System (ADS)

    Ghaboussi, J.; Kim, J.; Elnashai, A.

    2010-06-01

    Mathematics-based computational mechanics involves idealization in going from the observed behaviour of a system to mathematical equations representing the underlying mechanics of that behaviour. Idealization may lead to mathematical models that exclude certain aspects of the complex behaviour that may be significant. An alternative approach is data-centric modelling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. However, purely data-centric methods often fail for infrequent events and large state changes. In this article, a new hybrid modelling framework is proposed to improve accuracy in the simulation of real-world systems. In the hybrid framework, a mathematical model is complemented by information-based components. The role of the informational components is to model the aspects that the mathematical model leaves out. The missing aspects are extracted and identified through Autoprogressive Algorithms. The proposed hybrid modelling framework has a wide range of potential applications for natural and engineered systems. The potential of the hybrid methodology is illustrated through modelling the highly pinched hysteretic behaviour of beam-to-column connections in steel frames.

  9. 30 CFR 48.3 - Training plans; time of submission; where filed; information required; time for approval; method...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... will be given for each course. (5) A description of the teaching methods and the course materials which...; information required; time for approval; method for disapproval; commencement of training; approval of....3 Training plans; time of submission; where filed; information required; time for approval;...

  10. A simplified orthotropic formulation of the viscoplasticity theory based on overstress

    NASA Technical Reports Server (NTRS)

    Sutcu, M.; Krempl, E.

    1988-01-01

    An orthotropic, small strain viscoplasticity theory based on overstress is presented. In each preferred direction the stress is composed of time (rate) independent (or plastic) and viscous (or rate dependent) contributions. Tension-compression asymmetry can depend on direction and is included in the model. Upon a proper choice of a material constant one preferred direction can exhibit linear elastic response while the other two deform in a viscoplastic manner.

  11. Towards a theory-based multi-dimensional framework for assessment in mathematics: The "SEA" framework

    NASA Astrophysics Data System (ADS)

    Anku, Sitsofe E.

    1997-09-01

    Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.

  12. A Quantitative Quasispecies Theory-Based Model of Virus Escape Mutation Under Immune Selection

    DTIC Science & Technology

    2012-01-01

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although...response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral

  13. High-speed readout method of ID information on a large amount of electronic tags

    NASA Astrophysics Data System (ADS)

    Nagate, Wataru; Sasabe, Masahiro; Nakano, Hirotaka

    2006-10-01

    An electronic tag such as an RFID tag is expected to create new services that cannot be achieved with the traditional bar code. Specifically, in a distribution system, a method for the simultaneous readout of a large number of electronic tags embedded in products is required to reduce costs and time. In this paper, we propose a novel method, called Response Probability Control (RPC), to meet this requirement. In RPC, a reader first sends an ID request to the electronic tags in its access area. It succeeds in reading the information on a tag only if no other tags respond. To improve the readout efficiency, the reader appropriately controls the response probability in accordance with the number of tags. However, this approach cannot entirely avoid collisions of multiple responses. When a collision occurs, ID information is lost. To reduce the amount of lost data, we divide the ID registration process into two steps. The reader first gathers the former part of the original ID, called the temporal ID, according to the above method. After obtaining the temporal ID, it sequentially collects the latter part of the ID, called the remaining ID, based on the temporal ID. Note that we determine the number of bits of a temporal ID in accordance with the number of tags in the access area so that each tag can be distinguished. Through simulation experiments, we evaluate RPC in terms of readout efficiency. Simulation results show that RPC achieves a readout efficiency 1.17 times higher than the traditional method when there are a thousand electronic tags whose IDs are 128 bits.
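
A toy simulation of the response-probability idea: each slot, every unread tag responds with some probability, and the reader succeeds only when exactly one tag answers. The 1/n response probability is an assumed control rule, and the actual RPC protocol (temporal and remaining IDs, the probability control law) is richer than this sketch.

```python
import random

def rpc_readout_slots(n_tags, seed=0):
    """Count the slots needed to read n_tags when, in each slot, every
    unread tag responds with probability 1/remaining and a slot succeeds
    only if exactly one tag responds (collisions lose the slot)."""
    rng = random.Random(seed)
    remaining, slots = n_tags, 0
    while remaining > 0:
        slots += 1
        responders = sum(rng.random() < 1.0 / remaining
                         for _ in range(remaining))
        if responders == 1:
            remaining -= 1
    return slots

slots = rpc_readout_slots(100)
efficiency = 100 / slots   # successful reads per slot
```

With the 1/n rule, the single-responder probability approaches 1/e for large tag counts, so the efficiency settles near 0.37 reads per slot, which is the baseline the paper's two-step ID registration improves upon.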

  14. Information Accessibility of the Charcoal Burning Suicide Method in Mainland China

    PubMed Central

    Cheng, Qijin; Chang, Shu-Sen; Guo, Yingqi; Yip, Paul S. F.

    2015-01-01

    Background There has been a marked rise in suicide by charcoal burning (CB) in some East Asian countries but little is known about its incidence in mainland China. We examined media-reported CB suicides and the availability of online information about the method in mainland China. Methods We extracted and analyzed data for i) the characteristics and trends of fatal and nonfatal CB suicides reported by mainland Chinese newspapers (1998–2014); ii) trends and geographic variations in online searches using keywords relating to CB suicide (2011–2014); and iii) the content of Internet search results. Results 109 CB suicide attempts (89 fatal and 20 nonfatal) were reported by newspapers in 13 out of the 31 provinces or provincial-level-municipalities in mainland China. There were increasing trends in the incidence of reported CB suicides and in online searches using CB-related keywords. The province-level search intensities were correlated with CB suicide rates (Spearman’s correlation coefficient = 0.43 [95% confidence interval: 0.08–0.68]). Two-thirds of the web links retrieved using the search engine contained detailed information about the CB suicide method, of which 15% showed pro-suicide attitudes, and the majority (86%) did not encourage people to seek help. Limitations The incidence of CB suicide was based on newspaper reports and likely to be underestimated. Conclusions Mental health and suicide prevention professionals in mainland China should be alert to the increased use of this highly lethal suicide method. Better surveillance and intervention strategies need to be developed and implemented. PMID:26474297
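
The province-level association reported above uses Spearman's rank correlation. For readers unfamiliar with it, a minimal pure-Python version (no tie correction, for illustration only; the paper presumably used standard statistical software) is:

```python
def spearman(x, y):
    """Spearman's rank correlation for samples without ties:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the
    difference between the ranks of x_i and y_i."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

Because it operates on ranks, the coefficient captures any monotone relationship between search intensity and suicide rate, not just a linear one.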

  15. Separation of Stochastic and Deterministic Information from Seismological Time Series with Nonlinear Dynamics and Maximum Entropy Methods

    SciTech Connect

    Gutierrez, Rafael M.; Useche, Gina M.; Buitrago, Elias

    2007-11-13

    We present a procedure developed to detect stochastic and deterministic information contained in empirical time series, useful for characterizing and modelling different aspects of the complex phenomena represented by such data. This procedure is applied to a seismological time series to obtain new information for studying and understanding geological phenomena. We use concepts and methods from nonlinear dynamics and maximum entropy. The method allows an optimal analysis of the available information.

  16. An inversion method for retrieving soil moisture information from satellite altimetry observations

    NASA Astrophysics Data System (ADS)

    Uebbing, Bernd; Forootan, Ehsan; Kusche, Jürgen; Braakmann-Folgmann, Anne

    2016-04-01

    Soil moisture represents an important component of the terrestrial water cycle that controls evapotranspiration and vegetation growth. Consequently, knowledge of soil moisture variability is essential to understand the interactions between land and atmosphere. Yet, terrestrial measurements are sparse and their information content is limited due to the large spatial variability of soil moisture. Therefore, over the last two decades, several active and passive radar and satellite missions such as ERS/SCAT, AMSR, SMOS, or SMAP have been providing backscatter information that can be used to estimate surface conditions, including soil moisture, which is proportional to the dielectric constant of the upper few centimetres of the soil. Another source of soil moisture information is satellite radar altimetry, originally designed to measure sea surface height over the oceans. Measurements of Jason-1/2 (Ku- and C-band) or Envisat (Ku- and S-band) nadir radar backscatter provide high-resolution along-track information (~300 m along-track resolution) on backscatter every ~10 days (Jason-1/2) or ~35 days (Envisat). Recent studies found good correlation between backscatter and soil moisture in upper layers, especially in arid and semi-arid regions, indicating the potential of satellite altimetry both to reconstruct and to monitor soil moisture variability. However, measuring soil moisture using altimetry has some drawbacks, which include: (1) the noisy behavior of the altimetry-derived backscatter (due to, e.g., the existence of surface water in the radar footprint), (2) the strong assumptions needed for converting altimetry backscatter to soil moisture storage changes, and (3) the need for interpolating between the tracks. In this study, we suggest a new inversion framework that allows us to retrieve soil moisture information from along-track Jason-2 and Envisat satellite altimetry data, and we test this scheme over the Australian arid and semi-arid regions. Our method consists of: (i

  17. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements.

    PubMed

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-02-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix.

  18. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements

    PubMed Central

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-01-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix. PMID:28157149

  19. A Novel Group Decision-Making Method Based on Sensor Data and Fuzzy Information

    PubMed Central

    Bai, Yu-Ting; Zhang, Bai-Hai; Wang, Xiao-Yi; Jin, Xue-Bo; Xu, Ji-Ping; Su, Ting-Li; Wang, Zhao-Yang

    2016-01-01

    Algal bloom is a typical phenomenon of the eutrophication of rivers and lakes and makes the water dirty and smelly. It is a serious threat to water security and public health. Most research on this pollution has studied the principles of remediation approaches, but little has addressed the decision-making involved in selecting among them. Existing research uses simplex decision-making information, which is highly subjective, and makes little use of the data from water quality sensors. To utilize these data and solve the rational decision-making problem, a novel group decision-making method is proposed that combines the sensor data with fuzzy evaluation information. Firstly, the optimal similarity aggregation model of group opinions is built based on a modified similarity measurement of Vague values. Secondly, each approach’s ability to improve the water quality indexes is expressed using Vague evaluation methods. Thirdly, the water quality sensor data are analyzed to match the features of the alternative approaches using grey relational degrees. This allows the remediation approach best suited to the current water status to be selected. Finally, the selection model is applied to the remediation of algal bloom in lakes. The results show the method’s rationality and feasibility when using different data from different sources. PMID:27801827
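The grey relational matching step mentioned in the abstract can be sketched in a few lines. This is a generic grey relational analysis computation (distinguishing coefficient rho = 0.5 by convention, min/max deltas taken over all candidates), not the authors' exact implementation, and the sensor values below are invented:

```python
def grey_relational_degrees(reference, candidates, rho=0.5):
    """Grey relational degree of each candidate sequence against a
    reference sequence. rho is the distinguishing coefficient,
    conventionally 0.5; min/max deltas are taken over all candidates."""
    deltas = [[abs(r - c) for r, c in zip(reference, cand)] for cand in candidates]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    if d_max == 0:  # every candidate identical to the reference
        return [1.0] * len(candidates)
    return [sum((d_min + rho * d_max) / (d + rho * d_max) for d in row) / len(row)
            for row in deltas]

# Hypothetical normalized water-quality indexes and two approach profiles.
sensor_state = [0.8, 0.6, 0.9]
approaches = {"approach_A": [0.7, 0.5, 0.85], "approach_B": [0.2, 0.9, 0.3]}
degrees = grey_relational_degrees(sensor_state, list(approaches.values()))
best = max(zip(approaches, degrees), key=lambda kv: kv[1])[0]
```

The candidate whose feature profile tracks the current sensor state most closely receives the highest degree and is selected.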

  20. Adaptive broadcasting method using neighbor type information in wireless sensor networks.

    PubMed

    Jeong, Hyocheol; Kim, Jeonghyun; Yoo, Younghwan

    2011-01-01

    Flooding is the simplest and most effective way to disseminate a packet to all nodes in a wireless sensor network (WSN). However, basic flooding makes all nodes transmit the packet at least once, resulting in the broadcast storm problem in the worst case, which severely wastes network resources. Power is the most valuable resource of WSNs, as nodes are powered by batteries, so the energy wasted by basic flooding shortens the lifetime of the network. To address the broadcast storm problem, this paper proposes a dynamic probabilistic flooding that utilizes neighbor information, namely the numbers of child and sibling nodes. In general, the more sibling nodes there are, the higher the probability that a broadcast packet is sent by one of the sibling nodes, in which case the node itself need not retransmit the packet. Meanwhile, if a node has many child nodes, its retransmission probability should be high to achieve a high packet delivery ratio. Therefore, these two terms, the numbers of child and sibling nodes, are adopted in the proposed method to attain more reliable flooding. The proposed method also adopts a back-off delay scheme to avoid collisions between close neighbors. Simulation results show that the proposed method outperforms previous flooding methods with respect to the number of duplicate packets and the packet delivery ratio.
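The rebroadcast decision described above might look like the following sketch. The formula and the clamping bounds are illustrative assumptions, since the abstract does not give the authors' exact expression:

```python
import random

def rebroadcast_probability(num_children, num_siblings, p_min=0.3, p_max=1.0):
    """Heuristic in the spirit of the abstract: many child nodes push the
    probability up (for delivery ratio), many siblings push it down (one of
    them will likely cover the shared neighborhood). The formula and the
    clamping bounds are illustrative, not taken from the paper."""
    if num_children == 0:
        return p_min  # leaf node: siblings can usually cover its neighbors
    share = num_children / (num_children + num_siblings)
    return max(p_min, min(p_max, share))

def decide_rebroadcast(num_children, num_siblings, rng=random.random):
    """Bernoulli trial; in practice a back-off delay would precede this so
    that close neighbors do not transmit simultaneously."""
    return rng() < rebroadcast_probability(num_children, num_siblings)
```

A node with many children almost always forwards, while a node surrounded by siblings backs off toward the floor probability.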

  1. Applications of geographic information systems (GIS) data and methods in obesity-related research.

    PubMed

    Jia, P; Cheng, X; Xue, H; Wang, Y

    2017-04-01

    Geographic information systems (GIS) data and methods hold considerable promise for public health programs, including obesity-related research. This study systematically examined their applications and identified gaps and limitations in current obesity-related research. A systematic search of PubMed for studies published before 20 May 2016, utilizing synonyms for GIS in combination with synonyms for obesity as search terms, identified 121 studies that met our inclusion criteria. We found that the primary applications of GIS data and methods in obesity-related research included (i) visualization of the spatial distribution of obesity and obesity-related phenomena and of basic obesogenic environmental features, and (ii) construction of advanced obesogenic environmental indicators. We found high spatial heterogeneity in obesity prevalence/risk and obesogenic environmental factors. Also, study design and characteristics varied considerably across studies because of a lack of established guidance and protocols in the field, which may also have contributed to the mixed findings about environmental impacts on obesity. Existing findings regarding the built environment are more robust than those regarding the food environment. Applications of GIS data and methods in obesity research are still limited, and related research faces many challenges. More and better GIS data, and more user-friendly analysis methods, are needed to expand future GIS applications in obesity-related research.

  2. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers

    PubMed Central

    Branscum, Paul; Lora, Karina R.

    2016-01-01

    Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study’s purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of a child’s consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine the nutrition-related behaviors that mothers found most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (consumption of fruits and vegetables, and of SSB). Twenty semi-structured interviews with mothers were then conducted to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis) and internal consistency reliability (Cronbach’s alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers’ consumption of fruits and vegetables and SSB. PMID:27271643
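Internal consistency reliability as reported here is conventionally computed with Cronbach's alpha. A minimal pure-Python version of the standard formula, not tied to the study's data, is:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list per scale item, each holding all respondents' scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = len(items)
    n = len(items[0])
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(variance(col) for col in items) / variance(totals))
```

Perfectly correlated items yield alpha = 1; values around 0.7 or higher are commonly read as acceptable internal consistency for a scale.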

  3. Evaluation of Statistical Rainfall Disaggregation Methods Using Rain-Gauge Information for West-Central Florida

    SciTech Connect

    Murch, Renee Rokicki; Zhang, Jing; Ross, Mark; Ganguly, Auroop R; Nachabe, Mahmood

    2008-01-01

    Rainfall disaggregation in time can be useful for the simulation of hydrologic systems and the prediction of floods and flash floods. Disaggregation of rainfall to timescales of less than 1 h can be especially useful for studies of small urbanized watersheds, for continuous hydrologic simulations, and when Hortonian or saturation-excess runoff dominates. However, the majority of rain gauges in any region record rainfall in daily time steps or, very often, hourly records have extensive missing data. Also, the convective nature of the rainfall can result in significant differences in the measured rainfall at nearby gauges. This study evaluates several statistical approaches for rainfall disaggregation applicable to data from West-Central Florida, specifically from 1 h observations to 15 min records, and proposes new methodologies that have the potential to outperform existing approaches. Four approaches are examined. The first is an existing direct scaling method that utilizes observed 15 min rainfall at secondary rain gauges to disaggregate observed 1 h rainfall at the more numerous primary rain gauges. The second is an extension of an existing method for continuous rainfall disaggregation through statistical distributional assumptions. The third relies on artificial neural networks for the disaggregation process without sorting, and the fourth extends the neural network methods through statistical preprocessing via new sorting and desorting schemes. The applicability and performance of these methods were evaluated using information from a fairly dense rain gauge network in West-Central Florida. Of the four methods compared, the sorted neural networks and the direct scaling method predicted peak rainfall magnitudes significantly better than the remaining techniques. The study also suggests that desorting algorithms would be useful for randomly repositioning the artificial hyetograph within a rainfall period.
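The first (direct scaling) approach can be illustrated as follows: the hourly total at a primary gauge is split according to the sub-hourly temporal pattern observed at a nearby secondary gauge. The fallback to a uniform split when the secondary gauge is dry is an assumption of this sketch, not necessarily the authors' choice:

```python
def disaggregate_hour(primary_hour_mm, secondary_15min_mm):
    """Direct-scaling sketch: distribute one hourly rainfall total from a
    primary gauge across four 15-min intervals, using the pattern of the
    four 15-min observations at a secondary gauge."""
    total = sum(secondary_15min_mm)
    if total == 0:
        # Secondary gauge recorded no rain: fall back to a uniform split.
        return [primary_hour_mm / 4.0] * 4
    return [primary_hour_mm * v / total for v in secondary_15min_mm]
```

By construction the four disaggregated values always sum back to the observed hourly total, so mass is conserved regardless of the secondary pattern.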

  4. WE-G-207-07: Iterative CT Shading Correction Method with No Prior Information

    SciTech Connect

    Wu, P; Mao, T; Niu, T; Xie, S; Sheng, K

    2015-06-15

    Purpose: Shading artifacts are caused by scatter contamination, beam hardening effects and other non-ideal imaging conditions. Our purpose is to propose a novel and general correction framework to eliminate low-frequency shading artifacts in CT imaging (e.g., cone-beam CT, low-kVp CT) without relying on prior information. Methods: Our method applies the general knowledge that the CT number distribution within one tissue component is relatively uniform. Image segmentation is applied to construct a template image in which each structure is filled with the CT number of that specific tissue. By subtracting the ideal template from the CT image, the residual from the various error sources is generated. Since forward projection is an integration process, the non-continuous low-frequency shading artifacts in the image become continuous low-frequency signals in the line integral. The residual image is thus forward projected and its line integral is filtered using a Savitzky-Golay filter to estimate the error. A compensation map is reconstructed from the error using the standard FDK algorithm and added to the original image to obtain the shading-corrected one. Since segmentation is not accurate on a shaded CT image, the proposed scheme is iterated until the variation of the residual image is minimized. Results: The proposed method is evaluated on a Catphan600 phantom, a pelvic patient and a CT angiography scan for carotid artery assessment. Compared to the uncorrected images, our method reduces the overall CT number error from >200 HU to <35 HU and increases the spatial uniformity by a factor of 1.4. Conclusion: We propose an effective iterative algorithm for shading correction in CT imaging. Unlike existing algorithms, our method is assisted only by general anatomical and physical information in CT imaging, without relying on prior knowledge. Our method is thus practical and attractive as a general solution to CT shading correction.
This work is supported by the National Science
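The core of the iteration (segment, subtract template, keep only the low-frequency part of the residual, compensate, repeat) can be shown in a one-dimensional toy. This sketch uses a plain moving average as a crude stand-in for the Savitzky-Golay filtering and omits the forward-projection and FDK reconstruction steps of the real method entirely:

```python
def moving_average(signal, window):
    """Crude low-pass stand-in for the Savitzky-Golay filtering step."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def correct_shading_1d(shaded, template, window=5, iterations=3):
    """1-D toy of the iterative loop: residual = image - template, keep its
    low-frequency component as the shading estimate, subtract, repeat."""
    corrected = list(shaded)
    for _ in range(iterations):
        residual = [c - t for c, t in zip(corrected, template)]
        shading = moving_average(residual, window)
        corrected = [c - s for c, s in zip(corrected, shading)]
    return corrected
```

On a synthetic uniform "tissue" with a linear shading ramp, the loop drives the low-frequency error toward zero while leaving high-frequency structure (which the template subtraction preserves) untouched.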

  5. Systems and methods for supplemental weather information presentation on a display

    NASA Technical Reports Server (NTRS)

    Bunch, Brian (Inventor)

    2010-01-01

    An embodiment of the supplemental weather display system presents supplemental weather information on a display in a craft. An exemplary embodiment receives the supplemental weather information from a remote source, determines a location of the supplemental weather information relative to the craft, receives weather information from an on-board radar system, and integrates the supplemental weather information with the weather information received from the on-board radar system.

  6. A defocus-information-free autostereoscopic three-dimensional (3D) digital reconstruction method using direct extraction of disparity information (DEDI)

    NASA Astrophysics Data System (ADS)

    Li, Da; Cheung, Chifai; Zhao, Xing; Ren, Mingjun; Zhang, Juan; Zhou, Liqiu

    2016-10-01

    Autostereoscopy-based three-dimensional (3D) digital reconstruction has been widely applied in medical science, entertainment, design, industrial manufacture, precision measurement and many other areas. The 3D digital model of the target can be reconstructed from the series of two-dimensional (2D) information acquired by the autostereoscopic system, which consists of multiple lenses and can provide information about the target from multiple angles. This paper presents a generalized and precise autostereoscopic 3D digital reconstruction method based on Direct Extraction of Disparity Information (DEDI), which can be applied to any type of autostereoscopic system and provides accurate 3D reconstruction results through an error-elimination process based on statistical analysis. The feasibility of the DEDI method has been successfully verified through a series of optical 3D digital reconstruction experiments on different autostereoscopic systems. The method is highly efficient in performing direct full 3D digital model construction through a tomography-like operation upon every depth plane, with defocused information excluded. Using only the focused information retained by the DEDI method, the 3D digital model of the target can be directly and precisely formed along the axial direction with the depth information.
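Disparity-based reconstruction ultimately rests on the standard triangulation relation Z = f * B / d. A minimal sketch of that relation (the underlying geometry, not the DEDI algorithm itself):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_mm):
    """Triangulation: Z = f * B / d for a rectified lens pair with focal
    length f (pixels), baseline B, and disparity d (pixels). A larger
    disparity corresponds to a nearer point."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px
```

With a multi-lens autostereoscopic system the same relation applies pairwise, and redundant pairs give the statistical leverage for the error-elimination step the abstract describes.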

  7. Information needs assessment of medical equipment offices based on Critical Success Factors (CSF) and Business System Planning (BSP) methods.

    PubMed

    Khorrami, F; Ahmadi, M; Alizadeh, A; Roozbeh, N; Mohseni, S

    2015-01-01

    Introduction: Given the ever-increasing importance and value of information, providing management with a reliable information system, which can facilitate decision-making regarding planning, organization and control, is vitally important. This study aimed to analyze and evaluate the information needs of medical equipment offices. Methods: This descriptive applied cross-sectional study was carried out in 2010. The population of the study included the managers of statistics and medical records at the offices of the vice-chancellor for treatment in 39 medical universities in Iran. Data were collected using structured questionnaires. With regard to the different kinds of information system design, sampling was done by two methods: BSP (based on processes in job descriptions) and CSF (based on critical success factors). The data were analyzed with SPSS-16. Results: Our study showed that 41% of information needs were critical success factors of the office managers. The first priority of managers was "the number of beds and bed occupancy in hospitals". Of 29 identified information needs, 62% were initial information needs of managers (from the viewpoint of the managers). Of all the information needs, 4% were met through forms, 14% through both forms and databases, 11% through web sites, and 71% had no sources (forms, databases, or web sites). Conclusion: Since 71% of the information needs of medical equipment office managers had no information sources, the development of an information system in these offices seems necessary. Despite the important role of users in designing information systems (identifying 62% of the information needs), other scientific methods also need to be utilized in designing such systems.

  8. Information needs assessment of medical equipment offices based on Critical Success Factors (CSF) and Business System Planning (BSP) methods

    PubMed Central

    Khorrami, F; Ahmadi, M; Alizadeh, A; Roozbeh, N; Mohseni, S

    2015-01-01

    Introduction: Given the ever-increasing importance and value of information, providing management with a reliable information system, which can facilitate decision-making regarding planning, organization and control, is vitally important. This study aimed to analyze and evaluate the information needs of medical equipment offices. Methods: This descriptive applied cross-sectional study was carried out in 2010. The population of the study included the managers of statistics and medical records at the offices of the vice-chancellor for treatment in 39 medical universities in Iran. Data were collected using structured questionnaires. With regard to the different kinds of information system design, sampling was done by two methods: BSP (based on processes in job descriptions) and CSF (based on critical success factors). The data were analyzed with SPSS-16. Results: Our study showed that 41% of information needs were critical success factors of the office managers. The first priority of managers was “the number of beds and bed occupancy in hospitals”. Of 29 identified information needs, 62% were initial information needs of managers (from the viewpoint of the managers). Of all the information needs, 4% were met through forms, 14% through both forms and databases, 11% through web sites, and 71% had no sources (forms, databases, or web sites). Conclusion: Since 71% of the information needs of medical equipment office managers had no information sources, the development of an information system in these offices seems necessary. Despite the important role of users in designing information systems (identifying 62% of the information needs), other scientific methods also need to be utilized in designing such systems.

  9. Spectral-spatial classification combined with diffusion theory based inverse modeling of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paluchowski, Lukasz A.; Bjorgan, Asgeir; Nordgaard, Håvard B.; Randeberg, Lise L.

    2016-02-01

    Hyperspectral imagery opens a new perspective for biomedical diagnostics and tissue characterization. High spectral resolution can give insight into the optical properties of skin tissue. At the same time, however, the amount of collected data represents a challenge when it comes to decomposition into clusters and extraction of useful diagnostic information. In this study, spectral-spatial classification and inverse diffusion modeling were applied to hyperspectral images obtained from a porcine burn model using a hyperspectral push-broom camera. The implemented method takes advantage of spatial and spectral information simultaneously, and provides information about the average optical properties within each cluster. The implemented algorithm allows mapping of the spectral and spatial heterogeneity of the burn injury as well as of dynamic changes of spectral properties within the burn area. The combination of statistical and physics-informed tools allowed initial separation of different burn wounds and further detailed characterization of the injuries at short post-injury times.

  10. Safety margins estimation method considering uncertainties within the risk-informed decision-making framework

    SciTech Connect

    Martorell, S.; Nebot, Y.; Vilanueva, J. F.; Carlos, S.; Serradell, V.

    2006-07-01

    The adoption by regulators of the risk-informed decision-making philosophy has opened the debate on the roles of the deterministic and probabilistic approaches in supporting regulatory matters of concern to NPP safety (e.g. safety margins, core damage frequency, etc.). However, the typical separation of the application fields does not imply that the two methods cannot benefit from each other. On the contrary, there is growing interest nowadays in developing methods for feeding probabilistic safety analysis results into the requirements and assumptions of deterministic analysis, and vice versa. It is thus an interesting challenge for the technical community to combine best-estimate thermal-hydraulic codes with probabilistic techniques to produce an effective and feasible technology, which should provide a more realistic, complete and logical measure of reactor safety. This paper proposes a new unified framework to estimate safety margins using a best-estimate thermal-hydraulic code with the help of data and models from a Level 1 LPSA (low power and shutdown probabilistic safety assessment), considering simultaneously the uncertainty associated with both the probabilistic and the thermal-hydraulic codes. An application example is also presented that demonstrates the performance and significance of the method and the relevance of the results achieved to the safety of nuclear power plants. (authors)

  11. A review of data quality assessment methods for public health information systems.

    PubMed

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-05-14

    High quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and the data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and on well-known institutional websites. We found that the data dimension was the most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and the data collection process, inconsistency in the definition of attributes of data quality, failure to address data users' concerns and a lack of systematic procedures in data quality assessment. This review is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research effort should be given to assessing the quality of data use and the quality of the data collection process.

  12. A Review of Data Quality Assessment Methods for Public Health Information Systems

    PubMed Central

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-01-01

    High quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and the data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and on well-known institutional websites. We found that the data dimension was the most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and the data collection process, inconsistency in the definition of attributes of data quality, failure to address data users’ concerns and a lack of systematic procedures in data quality assessment. This review is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research effort should be given to assessing the quality of data use and the quality of the data collection process. PMID:24830450

  13. The Model and Control Methods of Access to Information and Technology Resources of Automated Control Systems in Water Supply Industry

    NASA Astrophysics Data System (ADS)

    Rytov, M. Yu; Spichyack, S. A.; Fedorov, V. P.; Petreshin, D. I.

    2017-01-01

    The paper describes a formalized model of access control for the information and technological resources of automated control systems at water supply enterprises. The model takes into account the presence of various communication links with information systems and technological equipment. Methods of controlling access to the information and technological resources of such automated control systems are also studied. On the basis of the formalized model and the corresponding methods, a software-hardware complex for rapid access to the information and technological resources of automated control systems was developed, comprising an automated workplace for the administrator and for end users.

  14. Information content and analysis methods for Multi-Modal High-Throughput Biomedical Data

    NASA Astrophysics Data System (ADS)

    Ray, Bisakha; Henaff, Mikael; Ma, Sisi; Efstathiadis, Efstratios; Peskin, Eric R.; Picone, Marco; Poli, Tito; Aliferis, Constantin F.; Statnikov, Alexander

    2014-03-01

    The spectrum of modern molecular high-throughput assaying includes diverse technologies such as microarray gene expression, miRNA expression, proteomics, DNA methylation, among many others. Now that these technologies have matured and become increasingly accessible, the next frontier is to collect "multi-modal" data for the same set of subjects and conduct integrative, multi-level analyses. While multi-modal data does contain distinct biological information that can be useful for answering complex biology questions, its value for predicting clinical phenotypes and the contributions of each type of input remain unknown. We obtained 47 datasets/predictive tasks that in total span over 9 data modalities and executed analytic experiments for predicting various clinical phenotypes and outcomes. First, we analyzed each modality separately using uni-modal approaches based on several state-of-the-art supervised classification and feature selection methods. Then, we applied integrative multi-modal classification techniques. We have found that gene expression is the most predictively informative modality. Other modalities such as protein expression, miRNA expression, and DNA methylation also provide highly predictive results, which are often statistically comparable but not superior to gene expression data. Integrative multi-modal analyses generally do not increase predictive signal compared to gene expression data.

  15. A method for 3D scene recognition using shadow information and a single fixed viewpoint

    NASA Astrophysics Data System (ADS)

    Bamber, David C.; Rogers, Jeremy D.; Page, Scott F.

    2012-05-01

    The ability to passively reconstruct a scene in 3D provides significant benefit to Situational Awareness systems employed in security and surveillance applications. Traditionally, passive 3D scene modelling techniques, such as Shape from Silhouette, require images from multiple sensor viewpoints, acquired either through the motion of a single sensor or from multiple sensors. As a result, the application of these techniques often attracts high costs, and presents numerous practical challenges. This paper presents a 3D scene reconstruction approach based on exploiting scene shadows, which only requires information from a single static sensor. This paper demonstrates that a large amount of 3D information about a scene can be interpreted from shadows; shadows reveal the shape of objects as viewed from a solar perspective, and additional perspectives are gained as the sun arcs across the sky. The approach has been tested on synthetic and real data and is shown to be capable of reconstructing 3D scene objects where traditional 3D imaging methods fail. Provided the shadows within a scene are discernible, the proposed technique is able to reconstruct 3D objects that are camouflaged, obscured or even outside of the sensor's Field of View. The proposed approach can be applied in a range of applications, for example urban surveillance, checkpoint and border control, critical infrastructure protection and for identifying concealed or suspicious objects or persons which would normally be hidden from the sensor viewpoint.
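The core solar-geometry relation such approaches exploit: an object of height h under solar elevation angle theta casts a shadow of length L = h / tan(theta), so h = L * tan(theta). A minimal sketch of that relation (not the paper's full reconstruction pipeline):

```python
import math

def height_from_shadow(shadow_length_m, solar_elevation_deg):
    """Recover object height from its shadow length and the solar
    elevation angle at acquisition time: h = L * tan(theta)."""
    return shadow_length_m * math.tan(math.radians(solar_elevation_deg))
```

Because the elevation and azimuth of the sun are known from the acquisition time and location, each observation constrains the object's silhouette from a different "solar viewpoint", which is what allows shape to accumulate as the sun arcs across the sky.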

  16. Text Messaging as a Method for Health Ministry Leaders to Disseminate Cancer Information.

    PubMed

    Schoenberger, Yu-Mei M; Phillips, Janice M; Mohiuddin, M Omar

    2015-12-01

    Mobile phone-based interventions can play a significant role in decreasing health disparities by enhancing population and individual health. The purpose of this study was to explore health ministry leaders' (HMLs) and congregation members' communication technology usage and to assess the acceptability of mobile technology for the delivery of cancer information. Six focus groups were conducted in two urban African-American churches with trained HMLs (n=7) and congregation members (n=37) to determine mobile phone technology usage and identify barriers and facilitators to a mobile phone intervention. All participants were African-American, the majority were female (80% of HMLs; 73% of congregation members), and the mean age was 54 (HMLs) and 41 (congregation members). All of the HMLs and 95% of congregation members indicated owning a mobile phone. All HMLs reported sending/receiving text messages, whereas of the congregation members, 85% sent and 91% received text messages. The facilitators of a text messaging system mentioned by participants included an alternative form of communication, a quick method for disseminating information, and accessibility. The main barriers to using mobile technology reported by both groups included receiving multiple messages, difficulty texting, and cost. Ways to overcome these barriers were explored with participants, and education was the most frequently proposed solution. The findings from this study indicate that HMLs and congregation members are interested in receiving text messages to promote healthy lifestyles and cancer awareness. These findings represent the first step in the development of a mobile phone-based program designed to enhance the work of health ministry leaders.

  17. Clinical simulation: A method for development and evaluation of clinical information systems.

    PubMed

    Jensen, Sanne; Kushniruk, Andre W; Nøhr, Christian

    2015-04-01

    Use of clinical simulation in the design and evaluation of eHealth systems and applications has increased during the last decade. This paper describes a methodological approach for using clinical simulations in the design and evaluation of clinical information systems. The method is based on experiences from more than 20 clinical simulation studies conducted at the ITX-lab in the Capital Region of Denmark during the last 5 years. A ten-step approach to conducting simulations is presented in this paper. To illustrate the approach, a clinical simulation study concerning implementation of Digital Clinical Practice Guidelines in a prototype planning and coordination module is presented. In the case study, potential benefits were assessed in a full-scale simulation test involving 18 health care professionals. The results showed that health care professionals can benefit from such a module. Unintended consequences concerning terminology and changes in the division of responsibility amongst healthcare professionals were also identified, and questions were raised concerning future workflow across sector borders. Furthermore, unexpected potential benefits emerged during testing concerning improved communication, the content of information in discharge letters, and quality management. In addition, new potential groups of users were identified. The case study is used to demonstrate the potential of using the clinical simulation approach described in the paper.

  18. Structure analysis of the Polish academic information society using MDS method

    NASA Astrophysics Data System (ADS)

    Kaliczynska, Malgorzata

    2006-03-01

    The article presents the methodology of webometrics research and analysis aimed at determining similar features of objects belonging to the Polish information society, which uses the Internet and its WWW resources for communication purposes. In particular, the analysis applies to selected Polish technical universities. The research was carried out in several phases, on different data groups, with regard to changes in Internet space over time. The results have been presented in the form of two- and three-dimensional topography maps. For the purposes of this analysis, computer methods of multidimensional scaling (MDS) were used. The research will be continued for a selected group of objects over a longer time frame. Its next stage will be research on more diversified objects, also in a multinational context.
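Classical (Torgerson) multidimensional scaling, the simplest member of the MDS family used here, double-centres a squared-distance matrix and keeps the top eigenvectors. A minimal sketch assuming NumPy is available; the 3x3 distance matrix is a made-up stand-in for the article's webometric similarity data.

```python
import numpy as np

def classical_mds(D, dim=2):
    # Double-centre the squared distances: B = -1/2 * J D^2 J
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    # Embed using the top `dim` eigenpairs of B
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]
    scale = np.sqrt(np.clip(vals[idx], 0, None))
    return vecs[:, idx] * scale

# three objects whose pairwise "distances" are exactly embeddable on a line
D = np.array([[0.0, 1.0, 3.0],
              [1.0, 0.0, 2.0],
              [3.0, 2.0, 0.0]])
X = classical_mds(D)  # 2-D coordinates reproducing the input distances
```

Iterative (stress-minimising) MDS variants handle noisy, non-Euclidean dissimilarities better, but the classical form shows the core map-making step.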

  19. Municipal Solid Waste Management using Geographical Information System aided methods: a mini review.

    PubMed

    Khan, Debishree; Samadder, Sukha Ranjan

    2014-11-01

    Municipal Solid Waste Management (MSWM) is one of the major environmental challenges in developing countries. Many efforts to reduce and recover the wastes have been made, but land disposal of solid wastes remains the most common option. Finding an environmentally sound landfill site is a challenging task. This paper presents a mini review of various aspects of MSWM (suitable landfill site selection, route optimization, and public acceptance) using Geographical Information Systems (GIS) coupled with other tools. The salient features of each of the tools integrated with GIS are discussed in this paper. The paper also addresses how GIS can help optimize routes for the collection of solid wastes from transfer stations to disposal sites to reduce the overall cost of solid waste management. A detailed approach to performing a public acceptance study of a proposed landfill site is presented in this study. The study will help municipal authorities identify the most effective method of MSWM.
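Route optimization of the kind GIS tools perform ultimately rests on shortest-path search over a road network. A stdlib-only Dijkstra sketch on a hypothetical network; the node names and edge weights are invented for illustration, not drawn from the review.

```python
import heapq

def dijkstra(graph, start, goal):
    # classic shortest path with a binary-heap priority queue
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return dist[goal], path[::-1]

# hypothetical road network: edge weights could encode distance or haul cost
roads = {
    "station": [("A", 4.0), ("B", 2.0)],
    "A": [("landfill", 5.0)],
    "B": [("A", 1.0), ("landfill", 8.0)],
    "landfill": [],
}
cost, path = dijkstra(roads, "station", "landfill")
```

GIS packages wrap exactly this kind of search behind network-analysis toolboxes, with travel time or fuel cost as the edge weight.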

  20. Development and Usability of REACH: A Tailored Theory-Based Text Messaging Intervention for Disadvantaged Adults With Type 2 Diabetes

    PubMed Central

    Nelson, Lyndsay A; Mayberry, Lindsay S; Wallston, Kenneth; Kripalani, Sunil; Bergner, Erin M

    2016-01-01

    Background Among adults with type 2 diabetes mellitus (T2DM), adherence to recommended self-care activities is suboptimal, especially among racial and ethnic minorities with low income. Self-care nonadherence is associated with having worse glycemic control and diabetes complications. Text messaging interventions are improving the self-care of adults with T2DM, but few have been tested with disadvantaged populations. Objective To develop Rapid Education/Encouragement And Communications for Health (REACH), a tailored, text messaging intervention to support the self-care adherence of disadvantaged patients with T2DM, based on the Information-Motivation-Behavioral skills model. We then tested REACH’s usability to make improvements before evaluating its effects. Methods We developed REACH’s content and functionality using an empirical and theory-based approach, findings from a previously pilot-tested intervention, and the expertise of our interdisciplinary research team. We recruited 36 adults with T2DM from Federally Qualified Health Centers to participate in 1 of 3 rounds of usability testing. For 2 weeks, participants received daily text messages assessing and promoting self-care, including tailored messages addressing users’ unique barriers to adherence, and weekly text messages with adherence feedback. We analyzed quantitative and qualitative user feedback and system-collected data to improve REACH. Results Participants were, on average, 52.4 (SD 9.5) years old, 56% (20/36) female, 63% (22/35) were a racial or ethnic minority, and 67% (22/33) had an income less than US $35,000. About half were taking insulin, and average hemoglobin A1c level was 8.2% (SD 2.2%). We identified issues (eg, user concerns with message phrasing, technical restrictions with responding to assessment messages) and made improvements between testing rounds. Overall, participants favorably rated the ease of understanding (mean 9.6, SD 0.7) and helpfulness (mean 9.3, SD 1.4) of self

  1. Retrieval of Aerosol information from UV measurement by using optimal estimation method

    NASA Astrophysics Data System (ADS)

    KIM, M.; Kim, J.; Jeong, U.; Kim, W. V.; Kim, S. K.; Lee, S. D.; Moon, K. J.

    2014-12-01

    An algorithm to retrieve aerosol optical depth (AOD), single scattering albedo (SSA), and aerosol loading height is developed for GEMS (Geostationary Environment Monitoring Spectrometer) measurements. The GEMS is planned to be launched into geostationary orbit in 2018, and employs hyper-spectral imaging with 0.6 nm resolution to observe solar backscatter radiation in the UV and visible range. In the UV range, the low surface contribution to the backscattered radiation and the strong interaction between aerosol absorption and molecular scattering are advantageous for retrieving aerosol information such as AOD and SSA [Torres et al., 2007; Torres et al., 2013; Ahn et al., 2014]. However, the large contribution of atmospheric scattering increases the sensitivity of the backscattered radiance to aerosol loading height. Thus, the assumed aerosol loading height becomes an important issue for obtaining accurate results. Accordingly, this study focused on the simultaneous retrieval of aerosol loading height together with AOD and SSA by utilizing the optimal estimation method. For the RTM simulation, the aerosol optical properties were analyzed from AERONET inversion data (level 2.0) at 46 AERONET sites over Asia. Also, a two-channel inversion method is applied to estimate a priori values of the aerosol information for solving the Levenberg-Marquardt equation. The GEMS aerosol algorithm is tested with the OMI level-1B dataset, used as provisional data for GEMS measurements, and the result is compared with the OMI standard aerosol product and AERONET values. The retrieved AOD and SSA show reasonable distributions compared with the OMI products, and are well correlated with the values measured by AERONET. However, the retrieval uncertainty in aerosol loading height is relatively larger than that of the other retrieved quantities.
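For a linear forward model the optimal estimation (maximum a posteriori) solution has a closed form; the nonlinear case iterates it, which is where a Levenberg-Marquardt scheme comes in. A toy NumPy sketch with an invented 4x3 Jacobian, prior, and covariances, not the GEMS radiative transfer model:

```python
import numpy as np

# Invented 4-measurement, 3-parameter linear forward model y = K x.
K = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.4],
              [0.1, 0.2, 1.0],
              [0.6, 0.1, 0.3]])
x_true = np.array([0.8, 0.9, 2.0])  # e.g. AOD, SSA proxy, layer height (illustrative)
y = K @ x_true                      # noise-free synthetic measurement

xa = np.array([0.5, 0.5, 1.0])      # a priori state
Sa_inv = np.eye(3)                  # inverse of a priori covariance
Se_inv = np.eye(4) / 0.01           # inverse of measurement covariance

# MAP estimate: x = xa + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K xa)
A = K.T @ Se_inv @ K + Sa_inv
x_hat = xa + np.linalg.solve(A, K.T @ Se_inv @ (y - K @ xa))
```

With accurate measurements the estimate is pulled close to the true state; a noisier measurement covariance drags it back toward the prior, which is exactly the behaviour that makes the a priori aerosol height assumption matter.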

  2. Obstetric fistula in Southern Sudan: situational analysis and Key Informant Method to estimate prevalence

    PubMed Central

    2013-01-01

    Background Obstetric fistula is a severe condition which can have devastating consequences for a woman’s life. Despite a considerable literature, very little is known about its prevalence. This project was conducted to carry out a situational analysis of fistula services in South Sudan and to pilot test the Key Informant Method (KIM) to estimate the prevalence of fistula in a region of South Sudan. Methods Key stakeholder interviews, document reviews and fistula surgery record reviews were undertaken. A KIM survey was conducted in a district of Western Bahr-el-Ghazal in January 2012. One hundred sixty-six community-based distributors, traditional birth attendants and village midwives were trained as key informants to identify women with fistula in the community. Women identified were subsequently examined by an obstetrician and nurse to verify whether they had a fistula. Results There were limited fistula repair services in South Sudan. Approximately 50–80 women per year attend periodic campaigns, with around half having a fistula and receiving a repair. On average, a further 5 women a year received fistula repair from hospital services. Ten women with potential fistula were identified via KIM; all were confirmed by the obstetrician. Of these, three were from the survey area, which had 8,865 women of reproductive age (15–49 years). This gives a minimum estimated prevalence of 30 fistulas per 100,000 women of reproductive age (95% CI 10–100). Conclusions The routine fistula repair services available do not meet the population’s needs. The pilot study suggests that KIM can be used to identify women with fistula in the community. Data on fistula are generally poor; the KIM methodology we used in South Sudan yielded a lower fistula prevalence than estimates reported previously in the region. PMID:23497241

  3. A practical method for skin dose estimation in interventional cardiology based on fluorographic DICOM information.

    PubMed

    Matthews, Lucy; Dixon, Matthew; Rowles, Nick; Stevens, Greg

    2016-03-01

    A practical method for skin dose estimation for interventional cardiology patients has been developed to inform pre-procedure planning and post-procedure patient management. Absorbed dose to the patient skin for certain interventional radiology procedures can exceed thresholds for deterministic skin injury, requiring documentation within the patient notes and appropriate patient follow-up. The primary objective was to reduce uncertainty associated with current methods, particularly surrounding field overlap. This was achieved by considering rectangular field geometry incident on a spherical patient model in a polar coordinate system. The angular size of each field was quantified at the surface of the sphere, i.e. the skin surface. Computer-assisted design software enabled the modelling of a sufficient dataset that was subsequently validated with radiochromic film. Modelled overlap was found to agree with overlap measured using film to within 2.2° ± 2.0°, showing that the overall error associated with the model was < 1 %. Mathematical comparison against exposure data extracted from procedural Digital Imaging and Communications in Medicine (DICOM) files was used to generate a graphical skin dose map, demonstrating the dose distribution over a sphere centred at the interventional reference point. Dosimetric accuracy of the software was measured as between 3.5 % and 17 % for different variables.
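The core simplification, rectangular beam fields expressed as angular intervals on a spherical patient model, can be sketched in a few lines. A stdlib-only illustration; the field width, sphere radius, and centre angles are invented numbers, not the paper's validated geometry:

```python
import math

# Angular size of a field of given arc width on a sphere: angle = arc / radius.
def angular_extent_deg(field_width_cm, sphere_radius_cm):
    return math.degrees(field_width_cm / sphere_radius_cm)

# Overlap of two angular intervals (centres c1, c2 and widths w1, w2, degrees);
# overlapping fields are where skin dose accumulates across exposures.
def interval_overlap_deg(c1, w1, c2, w2):
    lo = max(c1 - w1 / 2, c2 - w2 / 2)
    hi = min(c1 + w1 / 2, c2 + w2 / 2)
    return max(0.0, hi - lo)

w = angular_extent_deg(10.0, 15.0)               # 10 cm field on a 15 cm sphere
overlap = interval_overlap_deg(0.0, w, 20.0, w)  # two fields 20 degrees apart
```

Summing dose over such overlaps per solid-angle bin is what produces the kind of spherical skin dose map the paper describes.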

  4. Generalized Cross Entropy Method for estimating joint distribution from incomplete information

    NASA Astrophysics Data System (ADS)

    Xu, Hai-Yan; Kuo, Shyh-Hao; Li, Guoqi; Legara, Erika Fille T.; Zhao, Daxuan; Monterola, Christopher P.

    2016-07-01

    Obtaining a full joint distribution from individual marginal distributions with incomplete information is a non-trivial task that continues to challenge researchers from various domains including economics, demography, and statistics. In this work, we develop a new methodology referred to as the "Generalized Cross Entropy Method" (GCEM), aimed at addressing this issue. The objective function is proposed to be a weighted sum of divergences between joint distributions and various references. We show that the solution of the GCEM is unique and globally optimal. Furthermore, we illustrate the applicability and validity of the method by utilizing it to recover the joint distribution of a household profile of a given administrative region. In particular, we estimate the joint distribution of the household size, household dwelling type, and household home ownership in Singapore. Results show a high-accuracy estimation of the full joint distribution of the household profile under study. Finally, the impact of constraints and weights on the estimation of the joint distribution is explored.
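The simplest member of the family the paper generalizes is minimum cross-entropy fitting of a joint table to fixed marginals with a single reference (seed) distribution, solvable by iterative proportional fitting. A stdlib-only sketch with invented marginals, not the paper's weighted multi-reference objective:

```python
# Iterative proportional fitting: alternately rescale rows and columns of a
# seed joint distribution until both target marginals are matched.  The fixed
# point minimises KL divergence from the seed subject to the marginal constraints.
def ipf(seed, row_marg, col_marg, iters=100):
    P = [row[:] for row in seed]
    for _ in range(iters):
        for i, r in enumerate(row_marg):          # match row marginals
            s = sum(P[i])
            P[i] = [v * r / s for v in P[i]]
        for j, c in enumerate(col_marg):          # match column marginals
            s = sum(P[i][j] for i in range(len(P)))
            for i in range(len(P)):
                P[i][j] *= c / s
    return P

# illustrative: e.g. household size (rows) x home ownership (columns)
seed = [[0.25, 0.25], [0.25, 0.25]]
P = ipf(seed, row_marg=[0.6, 0.4], col_marg=[0.7, 0.3])
```

With a uniform seed the result is just the product of the marginals; an informative seed (e.g. a prior survey) shifts the fitted joint toward its correlation structure, which is the role the paper's reference distributions play.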

  5. A Rapid Monitoring and Evaluation Method of Schistosomiasis Based on Spatial Information Technology.

    PubMed

    Wang, Yong; Zhuang, Dafang

    2015-12-12

    As Spatial Information Technologies (SITs) such as Remote Sensing (RS) and Geographical Information Systems (GIS) are quickly developed and updated, SITs are being used more widely in the public health field. The use of SITs to study the characteristics of the temporal and spatial distribution of Schistosoma japonicum and to assess the risk of infection, thereby providing methods for the control and prevention of schistosomiasis japonica, has gradually become a hot topic in the field. The purpose of the present paper was to use RS and GIS technology to develop an efficient method of prediction and assessment of the risk of schistosomiasis japonica. We chose the Yueyang region, close to the east DongTing Lake (Hunan Province, China), as the study area, where a recent serious outbreak of schistosomiasis japonica took place. We monitored and evaluated the transmission risk of schistosomiasis japonica in the region using SITs. Water distribution data were extracted from RS images. The ground temperature, ground humidity and vegetation index were calculated based on RS images. Additionally, the density of oncomelania snails, which are the Schistosoma japonicum intermediate host, was calculated on the basis of RS data and field measurements. The spatial distribution of oncomelania snails was explored using SITs in order to estimate the area surrounding residents at transmission risk of schistosomiasis japonica. Our research results demonstrated that: (1) the risk factors for the transmission of schistosomiasis japonica were closely related to the living environment of oncomelania snails; key factors such as water distribution, ground temperature, ground humidity and vegetation index can be quickly obtained and calculated from RS images; (2) using GIS technology and an RS deduction technique along with statistical regression models, the density distribution model of oncomelania snails could be quickly built; (3) using SITs and analysis with overlaying population

  6. A Rapid Monitoring and Evaluation Method of Schistosomiasis Based on Spatial Information Technology

    PubMed Central

    Wang, Yong; Zhuang, Dafang

    2015-01-01

    As Spatial Information Technologies (SITs) such as Remote Sensing (RS) and Geographical Information Systems (GIS) are quickly developed and updated, SITs are being used more widely in the public health field. The use of SITs to study the characteristics of the temporal and spatial distribution of Schistosoma japonicum and to assess the risk of infection, thereby providing methods for the control and prevention of schistosomiasis japonica, has gradually become a hot topic in the field. The purpose of the present paper was to use RS and GIS technology to develop an efficient method of prediction and assessment of the risk of schistosomiasis japonica. We chose the Yueyang region, close to the east DongTing Lake (Hunan Province, China), as the study area, where a recent serious outbreak of schistosomiasis japonica took place. We monitored and evaluated the transmission risk of schistosomiasis japonica in the region using SITs. Water distribution data were extracted from RS images. The ground temperature, ground humidity and vegetation index were calculated based on RS images. Additionally, the density of oncomelania snails, which are the Schistosoma japonicum intermediate host, was calculated on the basis of RS data and field measurements. The spatial distribution of oncomelania snails was explored using SITs in order to estimate the area surrounding residents at transmission risk of schistosomiasis japonica. Our research results demonstrated that: (1) the risk factors for the transmission of schistosomiasis japonica were closely related to the living environment of oncomelania snails; key factors such as water distribution, ground temperature, ground humidity and vegetation index can be quickly obtained and calculated from RS images; (2) using GIS technology and an RS deduction technique along with statistical regression models, the density distribution model of oncomelania snails could be quickly built; (3) using SITs and analysis with overlaying population
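Of the RS covariates listed, the vegetation index is the most self-contained to illustrate: NDVI is simple band algebra on near-infrared and red reflectances. A hedged sketch with invented per-pixel values, not actual imagery from the study area:

```python
# NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense vegetation,
# one of the habitat covariates fed into a snail-density regression.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

# illustrative per-pixel (NIR, Red) reflectances, not real imagery
pixels = [(0.45, 0.05), (0.30, 0.20), (0.10, 0.09)]
values = [round(ndvi(n, r), 3) for n, r in pixels]
```

In practice the same band arithmetic is applied raster-wide, and the resulting layer is overlaid in GIS with temperature, humidity, and water-distribution layers.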

  7. A theory-based intervention to improve breast cancer awareness and screening in Jamaica.

    PubMed

    Anakwenze, Chidinma P; Coronado-Interis, Evelyn; Aung, Maung; Jolly, Pauline E

    2015-05-01

    Despite declines in breast cancer mortality rates in developed countries, mortality rates remain high in Jamaica due to low levels of screening and lack of early detection. We hypothesized that a theory-based health educational intervention would increase awareness of breast cancer and intention to screen among women in Western Jamaica. Two hundred and forty six women attending hospitals or clinics were enrolled in an educational intervention consisting of a pretest, breast cancer presentation, and posttest if they had never been screened or had not been screened in 5 years or more. The questionnaires assessed attitudes and knowledge of risk factors and symptoms related to breast cancer. Participants were followed approximately 6 months after the intervention to determine whether they accessed breast cancer screening. There were statistically significant increases (p < 0.0001) in the percentage of correct knowledge responses and in participants' intention to screen from pretest to posttest. The greatest posttest improvements were among items measuring knowledge of breast cancer screening tests and risk factors. Of the 134 women who were reached by phone for post-intervention follow-up, 30 women (22.4 %) were screened for breast cancer and 104 women (77.6 %) had not been screened. The use of a theory-based educational intervention positively influenced knowledge of breast cancer risk factors, symptoms, and types of screening and increased screening rates in screening-naïve women. This theory-based educational intervention may be replicated to promote awareness of breast cancer and further increase screening rates in other areas of Jamaica and other developing countries.

  8. Branes in Extended Spacetime: Brane Worldvolume Theory Based on Duality Symmetry.

    PubMed

    Sakatani, Yuho; Uehara, Shozo

    2016-11-04

    We propose a novel approach to the brane worldvolume theory based on the geometry of extended field theories: double field theory and exceptional field theory. We demonstrate the effectiveness of this approach by showing that one can reproduce the conventional bosonic string and membrane actions, and the M5-brane action in the weak-field approximation. At a glance, the proposed 5-brane action without approximation looks different from the known M5-brane actions, but it is consistent with the known nonlinear self-duality relation, and it may provide a new formulation of a single M5-brane action. Actions for exotic branes are also discussed.

  9. Theory-based scaling of the SOL width in circular limited tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Halpern, F. D.; Ricci, P.; Labit, B.; Furno, I.; Jolliet, S.; Loizu, J.; Mosetto, A.; Arnoux, G.; Gunn, J. P.; Horacek, J.; Kočan, M.; LaBombard, B.; Silva, C.; Contributors, JET-EFDA

    2013-12-01

    A theory-based scaling for the characteristic length of a circular, limited tokamak scrape-off layer (SOL) is obtained by considering the balance between parallel losses and non-linearly saturated resistive ballooning mode turbulence driving anomalous perpendicular transport. The SOL size increases with plasma size, resistivity, and safety factor q. The scaling is verified against flux-driven non-linear turbulence simulations, which reveal good agreement within a wide range of dimensionless parameters, including parameters closely matching the TCV tokamak. An initial comparison of the theory against experimental data from several tokamaks also yields good agreement.

  10. Where is information quality lost at clinical level? A mixed-method study on information systems and data quality in three urban Kenyan ANC clinics.

    PubMed

    Hahn, Daniel; Wanjala, Pepela; Marx, Michael

    2013-01-01

    Background Well-functioning health information systems are considered vital, with the quality of health data ranked of highest importance for decision making at patient care and policy levels. In particular, health facilities play an important role, since they are not only the entry point for the national health information system but also use health data, primarily for patient care. Design A multiple case study was carried out between March and August 2012 at the antenatal care (ANC) clinics of two private and one public Kenyan hospital to describe clinical information systems and assess the quality of information. The following methods were developed and employed in an iterative process: workplace walkthroughs, structured and in-depth interviews with staff members, and a quantitative assessment of data quality (completeness and accurate transmission of clinical information and reports in ANC). Views of staff and management on the quality of the employed information systems, data quality, and influencing factors were captured qualitatively. Results Staff rated the quality of information higher in the private hospitals, which employ computers, than in the public hospital, which relies on paper forms. Several potential threats to data quality were reported. Limitations in data quality were common at all study sites, including wrong test results, missing registers, and inconsistencies in reports. Feedback on the content or quality of reports was seldom given, and usage of data beyond individual patient care was low. Conclusions We argue that the limited data quality has to be seen in the broader perspective of the information systems in which it is produced and used. The combination of different methods has proven to be useful for this. To improve the effectiveness and capabilities of these systems, combined measures are needed which include technical and organizational aspects (e.g. regular feedback to health workers) as well as individual skills and motivation.

  11. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case…

  12. 2002 Carolyn Sherif Award Address: Gender, Race, and Generation in a Midwest High School: Using Ethnographically Informed Methods in Psychology.

    ERIC Educational Resources Information Center

    Stewart, Abigail J.

    2003-01-01

    Suggests the value of ethnographically informed methods in the psychology of women, emphasizing the role of generation in psychology. Examines evidence from an ongoing, ethnographically informed study of high school graduates in the mid-1950s and late-1960s. The two generations of graduates have distinctive accounts of their experiences, with the…

  13. Healthcare information systems: data mining methods in the creation of a clinical recommender system

    NASA Astrophysics Data System (ADS)

    Duan, L.; Street, W. N.; Xu, E.

    2011-05-01

    Recommender systems have been extensively studied to present items, such as movies, music and books, that are likely of interest to the user. Researchers have indicated that integrated medical information systems are becoming an essential part of modern healthcare systems. Such systems have evolved into integrated enterprise-wide systems. In particular, such systems are considered a type of enterprise information system, or ERP system, addressing the needs of the healthcare industry sector. As part of these efforts, nursing care plan recommender systems can provide clinical decision support, nursing education, clinical quality control, and serve as a complement to existing practice guidelines. We propose to use correlations among nursing diagnoses, outcomes and interventions to create a recommender system for constructing nursing care plans. In the current study, we used nursing diagnosis data to develop the methodology. Our system utilises a prefix-tree structure common in itemset mining to construct a ranked list of suggested care plan items based on previously-entered items. Unlike common commercial systems, our system makes sequential recommendations based on user interaction, modifying a ranked list of suggested items at each step in care plan construction. We rank items based on traditional association-rule measures such as support and confidence, as well as a novel measure that anticipates which selections might improve the quality of future rankings. Since the multi-step nature of our recommendations presents problems for traditional evaluation measures, we also present a new evaluation method based on average ranking position and use it to test the effectiveness of different recommendation strategies.
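The ranking idea, scoring candidate items by association-rule confidence against previously entered items, can be sketched without the prefix-tree machinery (which only accelerates the counting). The care-plan items and history below are invented for illustration:

```python
# Toy care-plan history (each set is one completed plan); invented items.
plans = [
    {"pain", "mobility", "education"},
    {"pain", "education"},
    {"pain", "mobility"},
    {"pain", "education", "rest"},
]

def recommend(current, plans):
    # conf(current -> item) = support(current plus item) / support(current)
    base = sum(1 for p in plans if current <= p)
    counts = {}
    for p in plans:
        if current <= p:
            for item in p - current:
                counts[item] = counts.get(item, 0) + 1
    return sorted(((n / base, item) for item, n in counts.items()), reverse=True)

recs = recommend({"pain"}, plans)  # ranked (confidence, item) pairs
```

A sequential system re-runs this ranking after every user selection, which is exactly the interaction-driven behaviour the authors describe.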

  14. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs

    PubMed Central

    Bayliss, Elizabeth A.; Powers, J. David; Ellis, Jennifer L.; Barrow, Jennifer C.; Strobel, MaryJo; Beck, Arne

    2016-01-01

    Purpose: Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. Methods: We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ) to develop a predictive model for being in the top 25 percent of cost of care, then applied cluster analytic techniques to identify different high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. Results: BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0–4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. The largest possible range of predicted probabilities of being in the top 25 percent of cost was 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included those with high morbidity, prior utilization, depression risk, and financial constraints; previously uninsured individuals with high morbidity and few financial constraints; and relatively healthy, previously insured individuals with medication needs. Conclusions: Applying sequential predictive modeling and cluster analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery. PMID:27563684
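The second stage, clustering the predicted high-cost members into actionable subgroups, can be illustrated with a tiny 1-D k-means on an invented morbidity score; the real study clustered on many BHQ-derived features.

```python
# Minimal 1-D k-means: assign each score to the nearest centroid, then move
# each centroid to the mean of its group; repeat until stable.
def kmeans_1d(xs, k=2, iters=20):
    cents = [min(xs), max(xs)]  # simple initialisation for k=2
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            nearest = min(range(k), key=lambda c: abs(x - cents[c]))
            groups[nearest].append(x)
        cents = [sum(g) / len(g) if g else cents[i] for i, g in enumerate(groups)]
    return cents, groups

# invented morbidity scores for members flagged as likely high-cost
scores = [0.1, 0.2, 0.15, 0.9, 0.8, 0.95]
cents, groups = kmeans_1d(scores)
```

The resulting subgroups (here a low-score and a high-score cluster) are the kind of segments to which different outreach interventions could be targeted.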

  15. A kinetic theory based numerical study of core collapse supernova dynamics

    NASA Astrophysics Data System (ADS)

    Strother, Terrance T.

    The explosion mechanism of core collapse supernovae remains an unsolved problem in astrophysics after many decades of theoretical and numerical study. The complex nature of this problem forces its consideration to rely heavily upon numerical simulations. Current state-of-the-art core collapse supernova simulations typically make use of hydrodynamic codes for the modeling of baryon dynamics coupled to a Boltzmann transport simulation for the neutrinos and other leptons. The results generated by such numerical simulations have given rise to the widely accepted notion that neutrino heating and convection are crucial for the explosion mechanism. However the precise roles that some factors such as neutrino production and propagation, rotation, three-dimensional effects, the equation of state for asymmetric nuclear matter, general relativity, instabilities, magnetic fields, as well as others play in the explosion mechanism remain to be fully determined. In this work, we review some of the current methods used to simulate core collapse supernovae and discuss the various scenarios that have been developed by numerical studies. Unlike most of the numerical simulations of core collapse supernovae, we employ a kinetic theory based approach that allows us to explicitly model the propagation of neutrinos and a full ensemble of nuclei. Both of these are significant advantages. The ability to explicitly model the propagation of neutrinos puts their treatment on equal footing with the modeling of baryon dynamics. No simplifying assumptions about the nature of neutrino-matter interactions need to be made and consequently our code is capable of producing output about the flow of neutrinos that most other simulations are inherently incapable of. Furthermore, neutrino flavor oscillations are readily incorporated with our approach. The ability to model the propagation of a full ensemble of nuclei is superior to the standard tracking of free baryons, alpha particles, and a

  16. The Evolution of Library Instruction Delivery in the Chemistry Curriculum Informed by Mixed Assessment Methods

    ERIC Educational Resources Information Center

    Mandernach, Meris A.; Shorish, Yasmeen; Reisner, Barbara A.

    2014-01-01

    As information continues to evolve over time, the information literacy expectations for chemistry students also change. This article examines transformations to an undergraduate chemistry course that focuses on chemical literature and information literacy and is co-taught by a chemistry professor and a chemistry librarian. This article also…

  17. Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data

    NASA Astrophysics Data System (ADS)

    Khan, Shiraj; Bandyopadhyay, Sharba; Ganguly, Auroop R.; Saigal, Sunil; Erickson, David J., III; Protopopescu, Vladimir; Ostrouchov, George

    2007-08-01

    Commonly used dependence measures, such as linear correlation, cross-correlogram, or Kendall’s τ, cannot capture the complete dependence structure in data unless the structure is restricted to linear, periodic, or monotonic. Mutual information (MI) has been frequently utilized for capturing the complete dependence structure including nonlinear dependence. Recently, several methods have been proposed for the MI estimation, such as kernel density estimators (KDEs), k-nearest neighbors (KNNs), Edgeworth approximation of differential entropy, and adaptive partitioning of the XY plane. However, outstanding gaps in the current literature have precluded the ability to effectively automate these methods, which, in turn, have caused limited adoptions by the application communities. This study attempts to address a key gap in the literature—specifically, the evaluation of the above methods to choose the best method, particularly in terms of their robustness for short and noisy data, based on comparisons with the theoretical MI estimates, which can be computed analytically, as well as with linear correlation and Kendall’s τ. Here we consider smaller data sizes, such as 50, 100, and 1000, and within this study we characterize 50 and 100 data points as very short and 1000 as short. We consider a broader class of functions, specifically linear, quadratic, periodic, and chaotic, contaminated with artificial noise with varying noise-to-signal ratios. Our results indicate KDEs as the best choice for very short data at relatively high noise-to-signal levels whereas the performance of KNNs is the best for very short data at relatively low noise levels as well as for short data consistently across noise levels. In addition, the optimal smoothing parameter of a Gaussian kernel appears to be the best choice for KDEs while three nearest neighbors appear optimal for KNNs. Thus, in situations where the approximate data sizes are known in advance and exploratory data analysis and
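The estimators compared here all approximate the same plug-in quantity; the crudest member of the family, a fixed-grid 2-D histogram estimate (a simple cousin of the adaptive-partitioning method), is enough to show why MI separates dependent from independent data where linear measures may not. A stdlib-only sketch on synthetic data:

```python
import math
import random
from collections import Counter

def mi_binned(xs, ys, bins=8):
    # Plug-in MI estimate (in nats) from a fixed-grid 2-D histogram.
    n = len(xs)
    def digitize(vs):
        lo, hi = min(vs), max(vs)
        return [min(bins - 1, int((v - lo) / (hi - lo) * bins)) for v in vs]
    bx, by = digitize(xs), digitize(ys)
    cx, cy, cxy = Counter(bx), Counter(by), Counter(zip(bx, by))
    mi = 0.0
    for (i, j), c in cxy.items():
        pxy = c / n
        mi += pxy * math.log(pxy * n * n / (cx[i] * cy[j]))
    return mi

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(2000)]
y_dep = [x + 0.1 * random.gauss(0, 1) for x in xs]  # strong nonlinear-capable dependence
y_ind = [random.gauss(0, 1) for _ in range(2000)]   # independent of xs

mi_dep = mi_binned(xs, y_dep)
mi_ind = mi_binned(xs, y_ind)
```

Fixed-grid binning carries a positive bias that grows with the number of cells relative to the sample size, which is precisely the small-sample regime where the paper finds KDE and KNN estimators preferable.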

  18. Determining the effectiveness of the usability problem inspector: a theory-based model and tool for finding usability problems.

    PubMed

    Andre, Terence S; Hartson, H Rex; Williges, Robert C

    2003-01-01

    Despite the increased focus on usability and on the processes and methods used to increase usability, a substantial amount of software is unusable and poorly designed. Much of this is attributable to the lack of cost-effective usability evaluation tools that provide an interaction-based framework for identifying problems. We developed the user action framework and a corresponding evaluation tool, the usability problem inspector (UPI), to help organize usability concepts and issues into a knowledge base. We conducted a comprehensive comparison study to determine if our theory-based framework and tool could be effectively used to find important usability problems in an interface design, relative to two other established inspection methods (heuristic evaluation and cognitive walkthrough). Results showed that the UPI scored higher than heuristic evaluation in terms of thoroughness, validity, and effectiveness and was consistent with cognitive walkthrough for these same measures. We also discuss other potential advantages of the UPI over heuristic evaluation and cognitive walkthrough when applied in practice. Potential applications of this work include a cost-effective alternative or supplement to lab-based formative usability evaluation during any stage of development.

  19. Control theory based airfoil design for potential flow and a finite volume discretization

    NASA Technical Reports Server (NTRS)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented in which both target speed distributions and minimum drag are used as objective functions.
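    The economy of the control-theoretic (adjoint) gradient can be shown on a small linear-algebra analogue rather than the paper's potential-flow discretization: for an objective J = (1/2) u^T Q u constrained by A(alpha) u = f, one forward solve plus one adjoint solve yields the whole gradient, independent of the number of design parameters. All matrices and names below are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def objective_and_grad(alpha, A0, Bs, f, Q):
    """J(alpha) = 0.5 u^T Q u with A(alpha) u = f, A(alpha) = A0 + sum_i alpha_i B_i.
    Adjoint trick: dJ/dalpha_i = -lambda^T B_i u, where A^T lambda = Q u.
    One state solve + one adjoint solve gives every gradient component."""
    A = A0 + sum(a * B for a, B in zip(alpha, Bs))
    u = np.linalg.solve(A, f)            # forward (state) solve
    lam = np.linalg.solve(A.T, Q @ u)    # adjoint solve
    J = 0.5 * u @ Q @ u
    grad = np.array([-(lam @ (B @ u)) for B in Bs])
    return J, grad
```

    A finite-difference check confirms the adjoint gradient, which is the property that makes such gradients "computationally inexpensive" relative to perturbing each design variable separately.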

  20. Theory-Based Interventions in Physical Activity: A Systematic Review of Literature in Iran

    PubMed Central

    Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya

    2015-01-01

    Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Using models and theories of physical activity, we decided to systematically study the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to educational methods, almost all studies used a combination of methods. The most widely used integrative educational method was group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, needs assessment in using models, epidemiology and methodology consultation, addressing maintenance of physical activity, using other theories and models such as social marketing and social-cognitive theory, and other educational methods such as empirical and complementary ones are suggested. PMID:25948454

  1. Optical mosaic method for orthogonally crossed gratings by utilizing information about both main periodic directions simultaneously

    NASA Astrophysics Data System (ADS)

    Zhou, Hengyan; Zeng, Lijiang

    2017-02-01

    We present a method to make optical mosaics of orthogonally crossed gratings by utilizing information about both main periodic directions simultaneously. The whole mosaic system for orthogonally crossed gratings is set up based on the dual Lloyd's mirror interferometer, which fabricates orthogonally crossed gratings in a single exposure. The interference fringes formed by the diffractions of the exposure beams from the exposed grating areas are used as the reference to fine-tune the position and attitude of the exposure beams relative to the substrates during consecutive exposures. A procedure to mosaic both main periodic directions simultaneously is proposed, based on the presupposition that the angle between the two sets of main lattice lines is constant during the mosaic. Experimentally, we made a 2×1 mosaic crossed grating with a period of 574 nm over a (30+30) mm × 35 mm area. The peak-valley errors of the (-1, 0)- and (0, -1)-order diffraction wavefronts over the whole mosaic grating area are 0.104λ and 0.163λ, respectively.

  2. Survey on the estimation of mutual information methods as a measure of dependency versus correlation analysis

    NASA Astrophysics Data System (ADS)

    Gencaga, D.; Malakar, N. K.; Lary, D. J.

    2014-12-01

    In this survey, we present and compare different approaches to estimating Mutual Information (MI) from data to analyse general dependencies between variables of interest in a system. We demonstrate the performance difference of MI versus correlation analysis, the latter being optimal only in the case of linear dependencies. First, we use a piece-wise constant Bayesian methodology with a general Dirichlet prior. In this estimation method, we use a two-stage approach: we approximate the probability distribution first and then calculate the marginal and joint entropies. Here, we demonstrate the performance of this Bayesian approach versus the others for computing the dependency between different variables. We also compare these with linear correlation analysis. Finally, we apply MI and correlation analysis to the identification of the bias in the determination of the aerosol optical depth (AOD) by the satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and the ground-based AErosol RObotic NETwork (AERONET). Here, we observe that the AOD measurements by these two instruments can differ for the same location. The reason for this bias is explored by quantifying the dependencies between the bias and 15 other variables, including cloud cover and surface reflectivity.
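    The survey's central contrast, that correlation misses nonlinear dependence while MI does not, can be shown with the simplest plug-in (histogram) MI estimate. This toy illustration is not the Bayesian estimator of the survey; the bin count and noise level are arbitrary choices.

```python
import numpy as np

def binned_mi(x, y, bins=16):
    """Plug-in MI estimate (nats) from a 2-D histogram: the simplest
    partitioning-style estimator, shown here only for intuition."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                       # skip empty cells (0 log 0 = 0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 5000)
y = x**2 + 0.05 * rng.normal(size=5000)   # quadratic: dependent, uncorrelated
print(np.corrcoef(x, y)[0, 1])  # near zero: correlation misses the dependence
print(binned_mi(x, y))          # clearly positive: MI detects it
```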

  3. Similarity landscapes: An improved method for scientific visualization of information from protein and DNA database searches

    SciTech Connect

    Dogget, N.; Myers, G.; Wills, C.J.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The authors have used computer simulations and examination of a variety of databases to answer a wide range of evolutionary questions. The authors have found that there is a clear distinction in the evolution of HIV-1 and HIV-2, with the former and more virulent virus evolving more rapidly at a functional level. The authors have discovered highly non-random patterns in the evolution of HIV-1 that can be attributed to a variety of selective pressures. In the course of examining microsatellite DNA (short repeat regions) in microorganisms, the authors have found clear differences between prokaryotes and eukaryotes in their distribution, differences that can be tied to different selective pressures. They have developed a new method (topiary pruning) for enhancing the phylogenetic information contained in DNA sequences. Most recently, the authors have discovered effects in complex rainforest ecosystems that indicate strong frequency-dependent interactions between host species and their parasites, leading to the maintenance of ecosystem variability.

  4. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, M.A.

    1997-01-07

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.

  5. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, Michael A.

    1997-01-01

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.
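    The digit-timed anti-collision scheme can be sketched as a toy simulation that abstracts away the RF layer entirely: each tag times its reply by a prefix of its unique decimal serial number, and colliding tags extend the prefix by one digit each round, so uniqueness guarantees every tag is eventually read. The slot bookkeeping below is invented for illustration and is not the patented protocol.

```python
import random

def read_tags(serials):
    """Toy model of digit-timed anti-collision: slot = serial-number prefix,
    prefix length grows by one digit after every collision round."""
    read, depth, pending = [], 1, list(serials)
    while pending:
        slots = {}
        for s in pending:
            slots.setdefault(s[:depth], []).append(s)  # time slot by prefix
        pending = []
        for slot in sorted(slots):
            tags = slots[slot]
            if len(tags) == 1:
                read.append(tags[0])   # clear response, verified by reader
            else:
                pending.extend(tags)   # collision: retry with a longer prefix
        depth += 1                     # worst case: the entire serial number
    return read
```

    With unique serials the loop always terminates, mirroring the patent's argument that "the final possibility will be successful because a clear time-slot channel will be available."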

  6. Methods for measuring the impact of health information technologies on clinicians' patterns of work and communication.

    PubMed

    Westbrook, Johanna I; Ampt, Amanda; Williamson, Margaret; Nguyen, Ken; Kearney, Leanne

    2007-01-01

    Evidence regarding how health information technologies influence clinical work patterns and support efficient practices is limited. Traditional paper-based data collection methods are unable to capture clinical work complexity and communication patterns. Our objective was to design and test an electronic data collection tool for work measurement studies that would allow efficient, accurate and reliable data collection, and capture work complexity. We developed software on a personal digital assistant (PDA) which captures details of nurses' work: what task, with whom, and with what; multi-tasking; interruptions; and event duration. During field-testing over seven months across four hospital wards, fifty-two nurses were observed for 250 hours. Inter-rater reliability scores were maintained at over 85%. Only 1% of tasks did not match the classification developed. Over 40% of nurses' time was spent in direct care or professional communication, with 11.8% in multi-tasking. Nurses were interrupted approximately every 49 minutes. One quarter of interruptions occurred while nurses were preparing or administering medications. This approach produces data which provides greater insights into patterns of clinicians' work than has previously been possible.

  7. Methods to evaluate health information systems in healthcare settings: a literature review.

    PubMed

    Rahimi, Bahlol; Vimarlund, Vivian

    2007-10-01

    Although information technology (IT)-based applications in healthcare have existed for more than three decades, evaluating the outputs and outcomes of the use of IT-based systems in medical informatics is still a challenge for decision makers, as well as for those who want to measure the effects of ICT in healthcare settings. The aim of this paper is to review published articles in the area of evaluation of IT-based systems, in order to gain knowledge about the methodologies used and the findings obtained from evaluations of IT-based systems applied in healthcare settings. The literature review includes studies of IT-based systems between 2003 and 2005. The findings show that economic and organizational aspects dominate evaluation studies in this area. However, the results focus mostly on positive outputs such as user satisfaction, financial benefits and improved organizational work. This review shows that there is no standard framework for evaluating the effects and outputs of the implementation and use of IT in healthcare settings, and that to date no studies have explored the impact of IT on the healthcare system's productivity and effectiveness.

  8. New method adaptive to geospatial information acquisition and share based on grid

    NASA Astrophysics Data System (ADS)

    Fu, Yingchun; Yuan, Xiuxiao

    2005-11-01

    As is well known, it is difficult and time-consuming to acquire and share multi-source geospatial information in a grid computing environment, especially for data on different geo-reference benchmarks. Although middleware for data format transformation has been applied in many grid applications and GIS software systems, it remains difficult to perform on-demand spatial data assembly across various geo-reference benchmarks because of the computational complexity of rigorous coordinate transformation models. To address the problem, an efficient hierarchical quadtree structure, referred to as multi-level grids, is designed and coded to express the multi-scale global geo-space. A geospatial object located in a certain grid of the multi-level grids may be expressed as an increment value that is relative to the grid central point and is constant across different geo-reference benchmarks. A mediator responsible for geo-reference transformation functions over the multi-level grids has been developed and aligned with grid services. With the help of the mediator, maps or query result sets from individual sources with different geo-references can be merged into a uniform composite result. Instead of requiring complex data pre-processing prior to spatial integration, the introduced method is readily integrated with grid-enabled services.
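    The cell-plus-increment idea can be sketched as a hypothetical quadtree encoding over the global lon/lat extent: a point becomes a cell code at some level plus its offset from the cell centre (the "increment value"). The abstract does not specify the actual coding scheme, so every name and convention below is an illustrative assumption.

```python
def encode(lon, lat, level):
    """Quadtree cell code over [-180,180] x [-90,90] plus the offset of the
    point from the cell centre. Illustrative sketch of a multi-level grid."""
    x0, x1, y0, y1 = -180.0, 180.0, -90.0, 90.0
    code = ""
    for _ in range(level):
        xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
        q = (2 if lat >= ym else 0) + (1 if lon >= xm else 0)  # quadrant 0-3
        code += str(q)
        x0, x1 = (xm, x1) if lon >= xm else (x0, xm)
        y0, y1 = (ym, y1) if lat >= ym else (y0, ym)
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    return code, (lon - cx, lat - cy)      # cell code + increment from centre

def cell_centre(code):
    """Recover the cell centre from a quadtree code (inverse of encode)."""
    x0, x1, y0, y1 = -180.0, 180.0, -90.0, 90.0
    for q in code:
        xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
        q = int(q)
        x0, x1 = (xm, x1) if q & 1 else (x0, xm)
        y0, y1 = (ym, y1) if q & 2 else (y0, ym)
    return (x0 + x1) / 2, (y0 + y1) / 2
```

    Cell centre plus increment reproduces the original coordinates exactly, which is what lets the increment stay constant while only the cell centres are transformed between geo-reference benchmarks.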

  9. The role of local observations as evidence to inform effective mitigation methods for flood risk management

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; O'Donnell, Greg; Owen, Gareth

    2014-05-01

    This poster presents a case study that highlights two crucial aspects of a catchment-based flood management project that were used to encourage uptake of an effective flood management strategy: (1) the role of detailed local-scale observations and (2) a modelling method informed by these observations. Within a 6 km2 study catchment (Belford, UK), a number of Runoff Attenuation Features (RAFs) have been constructed (including ponds, wetlands and woody debris structures) to address flooding issues in the downstream village. The storage capacity of the RAFs is typically small (200 to 500 m3), hence there was skepticism as to whether they would work during large flood events. Monitoring was performed using a dense network of water level recorders installed both within the RAFs and within the stream network. Using adjacent upstream and downstream water levels in the stream network and observations within the actual ponds, a detailed understanding of the local performance of the RAFs was gained. However, despite understanding the local impacts of the features, the impact on the downstream hydrograph at the catchment scale could still not be ascertained with any certainty. The local observations revealed that the RAFs typically filled on the rising limb of the hydrograph; hence there was no available storage at the time of arrival of a large flow peak. However, it was also clear that an impact on the rising limb of the hydrograph was being observed. This knowledge of the functioning of individual features was used to create a catchment model, in which a network of RAFs could then be configured to examine the aggregated impacts. This Pond Network Model (PNM) was based on the observed local physical relationships and allowed a user-specified sequence of ponds to be configured into a cascade structure. It was found that there was a minimum number of RAFs needed before an impact on peak flow was achieved for a large flood event. The number of RAFs required in the

  10. An Approach to Reducing Information Loss and Achieving Diversity of Sensitive Attributes in k-anonymity Methods.

    PubMed

    Yoo, Sunyong; Shin, Moonshik; Lee, Doheon

    2012-11-13

    Electronic Health Records (EHRs) enable the sharing of patients' medical data. Since EHRs include patients' private data, access by researchers is restricted. Therefore k-anonymity is necessary to keep patients' private data safe without damaging useful medical information. However, k-anonymity cannot prevent sensitive attribute disclosure. An alternative, l-diversity, has been proposed as a solution to this problem and is defined as: each Q-block (ie, each set of rows corresponding to the same value for identifiers) contains at least l well-represented values for each sensitive attribute. While l-diversity protects against sensitive attribute disclosure, it is limited in that it focuses only on diversifying sensitive attributes. The aim of the study is to develop a k-anonymity method that not only minimizes information loss but also achieves diversity of the sensitive attribute. This paper proposes a new privacy protection method that uses conditional entropy and mutual information. This method considers both information loss as well as diversity of sensitive attributes. Conditional entropy can measure the information loss by generalization, and mutual information is used to achieve the diversity of sensitive attributes. This method can offer appropriate Q-blocks for generalization. We used the adult database from the UCI Machine Learning Repository and found that the proposed method can greatly reduce information loss compared with a recent l-diversity study. It can also achieve the diversity of sensitive attributes by counting the number of Q-blocks that have leaks of diversity. This study provides a privacy protection method that can improve data utility and protect against sensitive attribute disclosure. The method is viable and should be of interest for further privacy protection in EHR applications.
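    The two quantities the method combines can be illustrated on a toy table: the conditional entropy of the sensitive attribute given the Q-block measures how well-diversified each block is (zero means full disclosure). This is an illustration of the entropy quantities only, not the paper's generalization algorithm; the example data are invented.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a list of categorical values."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def conditional_entropy(blocks):
    """H(sensitive | Q-block): size-weighted average within-block entropy.
    Note I(block; sensitive) = H(sensitive) - H(sensitive | block), so high
    conditional entropy means the Q-block reveals little about the attribute."""
    n = sum(len(b) for b in blocks)
    return sum(len(b) / n * entropy(b) for b in blocks)

# Each inner list is the sensitive attribute within one Q-block (toy data).
diverse = [["flu", "hiv", "cold"], ["flu", "cold", "hiv"]]
uniform = [["flu", "flu", "flu"], ["hiv", "hiv", "hiv"]]
print(conditional_entropy(diverse))  # high: attacker learns little per block
print(conditional_entropy(uniform))  # zero: each block discloses the attribute
```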

  11. An Approach to Reducing Information Loss and Achieving Diversity of Sensitive Attributes in k-anonymity Methods

    PubMed Central

    Yoo, Sunyong; Shin, Moonshik

    2012-01-01

    Electronic Health Records (EHRs) enable the sharing of patients’ medical data. Since EHRs include patients’ private data, access by researchers is restricted. Therefore k-anonymity is necessary to keep patients’ private data safe without damaging useful medical information. However, k-anonymity cannot prevent sensitive attribute disclosure. An alternative, l-diversity, has been proposed as a solution to this problem and is defined as: each Q-block (ie, each set of rows corresponding to the same value for identifiers) contains at least l well-represented values for each sensitive attribute. While l-diversity protects against sensitive attribute disclosure, it is limited in that it focuses only on diversifying sensitive attributes. The aim of the study is to develop a k-anonymity method that not only minimizes information loss but also achieves diversity of the sensitive attribute. This paper proposes a new privacy protection method that uses conditional entropy and mutual information. This method considers both information loss as well as diversity of sensitive attributes. Conditional entropy can measure the information loss by generalization, and mutual information is used to achieve the diversity of sensitive attributes. This method can offer appropriate Q-blocks for generalization. We used the adult database from the UCI Machine Learning Repository and found that the proposed method can greatly reduce information loss compared with a recent l-diversity study. It can also achieve the diversity of sensitive attributes by counting the number of Q-blocks that have leaks of diversity. This study provides a privacy protection method that can improve data utility and protect against sensitive attribute disclosure. The method is viable and should be of interest for further privacy protection in EHR applications. PMID:23612074

  12. Using Fisher Information Criteria for Chemical Sensor Selection via Convex Optimization Methods

    DTIC Science & Technology

    2016-11-16

    …parametrized to select the best sensors after an optimization procedure. Due to the positive definite nature of the Fisher information matrix, convex optimization may be used to… Naval Research Laboratory, Washington, DC 20375-5320. Report NRL/MR/6180--16-9711: Using Fisher Information Criteria for Chemical Sensor Selection via Convex Optimization Methods.
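    The underlying criterion can be illustrated by D-optimal sensor selection: choose the subset of measurement rows whose Fisher information matrix has maximal log-determinant. The report uses a convex relaxation of this problem; the exhaustive search below, with an assumed linear-Gaussian measurement model, is only for intuition.

```python
import itertools
import numpy as np

def best_sensors(H, noise_var, k):
    """Brute-force D-optimal selection: pick the k rows of measurement matrix
    H maximizing log det of the Fisher information F = H_S^T H_S / sigma^2
    (linear model, i.i.d. Gaussian noise). Positive definiteness of F is what
    makes the relaxed version a convex problem."""
    best, best_val = None, -np.inf
    for idx in itertools.combinations(range(H.shape[0]), k):
        F = H[list(idx)].T @ H[list(idx)] / noise_var
        sign, logdet = np.linalg.slogdet(F)
        val = logdet if sign > 0 else -np.inf   # singular F carries no rank
        if val > best_val:
            best, best_val = idx, val
    return best, best_val
```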

  13. A gene-based information gain method for detecting gene-gene interactions in case-control studies.

    PubMed

    Li, Jin; Huang, Dongli; Guo, Maozu; Liu, Xiaoyan; Wang, Chunyu; Teng, Zhixia; Zhang, Ruijie; Jiang, Yongshuai; Lv, Hongchao; Wang, Limei

    2015-11-01

    Currently, most methods for detecting gene-gene interactions (GGIs) in genome-wide association studies are divided into SNP-based methods and gene-based methods. Generally, gene-based methods can be more powerful than SNP-based methods. Some gene-based entropy methods can only capture the linear relationship between genes. We therefore proposed a nonparametric gene-based information gain method (GBIGM) that can capture both the linear relationship and the nonlinear correlation between genes. Through simulations with different odds ratios, sample sizes and prevalence rates, GBIGM was shown to be valid and more powerful than the classic KCCU method and an SNP-based entropy method. In the analysis of data from 17 genes on rheumatoid arthritis, GBIGM was more effective than the other two methods as it obtained fewer significant results, which is important for biological verification. Therefore, GBIGM is a suitable and powerful tool for detecting GGIs in case-control studies.
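    A generic information-gain interaction score, in the spirit of such entropy-based methods but not GBIGM itself, measures the extra information the joint genotype carries about the phenotype beyond each gene alone. On a pure-interaction (XOR-like) pattern, each gene alone is uninformative while the pair is fully informative.

```python
import math
from collections import Counter

def H(xs):
    """Shannon entropy (bits) of a sequence of hashable values."""
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def info_gain(x, y):
    """IG(X;Y) = H(X) + H(Y) - H(X,Y), i.e. mutual information in bits."""
    return H(x) + H(y) - H(list(zip(x, y)))

def interaction_gain(g1, g2, y):
    """Synergy score: information the joint genotype (g1,g2) carries about
    phenotype y beyond the two marginal contributions. Illustrative only."""
    joint = list(zip(g1, g2))
    return info_gain(joint, y) - info_gain(g1, y) - info_gain(g2, y)
```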

  14. Frontal information flow and connectivity in psychopathy.

    PubMed

    Yang, Yaling; Raine, Adrian; Joshi, Anand A; Joshi, Shantanu; Chang, Yu-Teng; Schug, Robert A; Wheland, David; Leahy, Richard; Narr, Katherine L

    2012-11-01

    Despite accumulating evidence of structural deficits in individuals with psychopathy, especially in frontal regions, our understanding of systems-level disturbances in cortical networks remains limited. We applied novel graph theory-based methods to assess information flow and connectivity based on cortical thickness measures in 55 individuals with psychopathy and 47 normal controls. Compared with controls, the psychopathy group showed significantly altered interregional connectivity patterns. Furthermore, bilateral superior frontal cortices in the frontal network were identified as information flow control hubs in the psychopathy group in contrast to bilateral inferior frontal and medial orbitofrontal cortices as network hubs of the controls. Frontal information flow and connectivity may have a significant role in the neuropathology of psychopathy.
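    The notion of an information-flow hub used above can be made concrete with a generic graph-theoretic measure: betweenness centrality, which counts how many shortest paths pass through a node. This is a standard sketch (Brandes' algorithm for unweighted graphs), not the study's pipeline, which built its networks from cortical thickness measures.

```python
from collections import deque

def betweenness(adj):
    """Brandes' betweenness centrality for an unweighted, undirected graph
    given as {node: [neighbors]}. High-betweenness nodes concentrate shortest
    paths and act as information-flow hubs."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack = []
        pred = {v: [] for v in adj}                  # shortest-path predecessors
        sigma = {v: 0 for v in adj}; sigma[s] = 1    # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        queue = deque([s])
        while queue:                                 # BFS from source s
            v = queue.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                 # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w] / 2   # undirected: each pair counted twice
    return bc
```

    On a star graph the centre mediates every leaf-to-leaf shortest path, so it emerges as the sole hub, the same logic by which the superior frontal cortices were identified as flow-control hubs.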

  15. Integrating Safety Assessment Methods using the Risk Informed Safety Margins Characterization (RISMC) Approach

    SciTech Connect

    Curtis Smith; Diego Mandelli

    2013-03-01

    Safety is central to the design, licensing, operation, and economics of nuclear power plants (NPPs). As current light water reactor (LWR) NPPs age beyond 60 years, there are possibilities for increased frequency of systems, structures, and components (SSC) degradations or failures that initiate safety-significant events, reduce existing accident mitigation capabilities, or create new failure modes. Plant designers commonly “over-design” portions of NPPs and provide robustness in the form of redundant and diverse engineered safety features to ensure that, even in the case of well-beyond-design-basis scenarios, public health and safety will be protected with a very high degree of assurance. This form of defense-in-depth is a reasoned response to uncertainties and is often referred to generically as “safety margin.” Historically, specific safety margin provisions have been formulated primarily based on engineering judgment backed by a set of conservative engineering calculations. The ability to better characterize and quantify safety margin is important to improved decision making about LWR design, operation, and plant life extension. A systematic approach to the characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development (R&D) in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the RISMC Pathway provides methods and tools that enable mitigation options known as margins management strategies. The purpose of the RISMC Pathway R&D is to support plant decisions for risk-informed

  16. Using Pop Culture to Teach Information Literacy: Methods to Engage a New Generation

    ERIC Educational Resources Information Center

    Behen, Linda D.

    2006-01-01

    Building on the information needs and the learning style preferences of today's high school students, the author builds a case for using pop culture (TV shows, fads, and current technology) to build integrated information skills lessons for students. Chapters include a rationale, a review of the current literature, and examples of units of study…

  17. Storytelling As a Method of Communicating Information About Other Cultures; An Experimental Study.

    ERIC Educational Resources Information Center

    Ferguson, Mavis B.

    A study was devised to test the effectiveness of storytelling in transmitting information about other cultures. A story containing information about marriage customs and burial traditions in India was presented either by a class instructor, on audio tape, or on video tape to the 47 undergraduate students who served as subjects. Analyses of gains…

  18. Methods for Evaluating Costs of Automated Hospital Information Systems. Research Summary Series.

    ERIC Educational Resources Information Center

    Drazen, Erica; Metzger, Jane

    To provide a compendium of methodologies on cost impacts of automated hospital information systems (AHIS), this report sponsored by the National Center for Services Research identifies, reviews, and summarizes ten studies on information systems which manage patient care data. The studies were identified by a literature search and those that…

  19. Emerging Information Literacy and Research-Method Competencies in Urban Community College Psychology Students

    ERIC Educational Resources Information Center

    Wolfe, Kate S.

    2015-01-01

    This article details an assignment developed to teach students at urban community colleges information-literacy skills. This annotated bibliography assignment introduces students to library research skills, helps increase information literacy in beginning college students, and helps psychology students learn research methodology crucial in…

  20. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  1. Consumer Health Information Behavior in Public Libraries: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Yi, Yong Jeong

    2012-01-01

    Previous studies indicated inadequate health literacy of American adults as one of the biggest challenges for consumer health information services provided in public libraries. Little attention, however, has been paid to public users' health literacy and health information behaviors. In order to bridge the research gap, the study aims to…

  2. Accidental Discovery of Information on the User-Defined Social Web: A Mixed-Method Study

    ERIC Educational Resources Information Center

    Lu, Chi-Jung

    2012-01-01

    Frequently interacting with other people or working in an information-rich environment can foster the "accidental discovery of information" (ADI) (Erdelez, 2000; McCay-Peet & Toms, 2010). With the increasing adoption of social web technologies, online user-participation communities and user-generated content have provided users the…

  3. Professional Identity Development among Graduate Library and Information Studies Online Learners: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Croxton, Rebecca A.

    2015-01-01

    This study explores how factors relating to fully online Master of Library and Information Studies (MLIS) students' connectedness with peers and faculty may impact their professional identity development as library and information studies professionals. Participants include students enrolled in a fully online MLIS degree program in the…

  4. The Swedish strategy and method for development of a national healthcare information architecture.

    PubMed

    Rosenälv, Jessica; Lundell, Karl-Henrik

    2012-01-01

    "We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted National e-Health, the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for the development of a National Healthcare Information Architecture is to achieve high-level semantic interoperability for clinical content and clinical contexts. High-level semantic interoperability requires consistently structured clinical data and other types of data, with coherent traceability, to be mapped to reference clinical models. Archetypes, which are formal definitions of clinical and demographic concepts and of some administrative data, were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. The generic clinical process model was made concrete and analyzed. For each decision

  5. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
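The idea of storing a user-supplied semantic description alongside each sub-file can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: the `store_subfiles` function, the `.meta.json` sidecar naming, and the `schema` key are all invented for the example.

```python
import json
import os
import tempfile

def store_subfiles(data: bytes, boundaries: list, semantics: dict, outdir: str) -> list:
    """Split `data` at semantically meaningful byte offsets and store each
    sub-file next to a JSON sidecar describing its contents."""
    paths = []
    edges = [0] + sorted(boundaries) + [len(data)]
    for i, (start, end) in enumerate(zip(edges, edges[1:])):
        sub_path = os.path.join(outdir, f"part_{i}.bin")
        with open(sub_path, "wb") as f:
            f.write(data[start:end])
        # The semantic description travels with the sub-file as a sidecar.
        with open(sub_path + ".meta.json", "w") as f:
            json.dump({"offset": start, "length": end - start, **semantics}, f)
        paths.append(sub_path)
    return paths

with tempfile.TemporaryDirectory() as d:
    parts = store_subfiles(b"headerbody", [6], {"schema": "record-v1"}, d)
    print(len(parts))  # 2 sub-files: "header" and "body"
```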

  6. Graph Theory-Based Analysis of the Lymph Node Fibroblastic Reticular Cell Network.

    PubMed

    Novkovic, Mario; Onder, Lucas; Bocharov, Gennady; Ludewig, Burkhard

    2017-01-01

    Secondary lymphoid organs have developed segregated niches that are able to initiate and maintain effective immune responses. Such global organization requires tight control of diverse cellular components, specifically those that regulate lymphocyte trafficking. Fibroblastic reticular cells (FRCs) form a densely interconnected network in lymph nodes and provide key factors necessary for T cell migration and retention, and foster subsequent interactions between T cells and dendritic cells. Development of integrative systems biology approaches has made it possible to elucidate this multilevel complexity of the immune system. Here, we present a graph theory-based analysis of the FRC network in murine lymph nodes, where generation of the network topology is performed using high-resolution confocal microscopy and 3D reconstruction. This approach facilitates the analysis of physical cell-to-cell connectivity, and estimation of topological robustness and global behavior of the network when it is subjected to perturbation in silico.
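The in silico perturbation analysis described above can be illustrated with a minimal, stdlib-only sketch: build a toy cell network, ablate a fraction of nodes, and measure how the largest connected component shrinks. The ring-with-chord topology and node counts are invented stand-ins for a real FRC network.

```python
import random
from collections import defaultdict, deque

def largest_component(adj: dict) -> int:
    """Size of the largest connected component (breadth-first search)."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        queue, comp = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            comp += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        best = max(best, comp)
    return best

def perturb(adj: dict, fraction: float, rng: random.Random) -> dict:
    """Remove a random fraction of nodes and return the induced subgraph."""
    keep = set(rng.sample(sorted(adj), int(len(adj) * (1 - fraction))))
    return {n: [m for m in nbs if m in keep] for n, nbs in adj.items() if n in keep}

# Toy network: a ring of 12 "cells" plus one cross-link.
adj = defaultdict(list)
for i in range(12):
    adj[i].append((i + 1) % 12)
    adj[(i + 1) % 12].append(i)
adj[0].append(6)
adj[6].append(0)

rng = random.Random(0)
print(largest_component(adj))                      # 12 before perturbation
print(largest_component(perturb(adj, 0.25, rng)))  # smaller after ablation
```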

  7. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-03

    The feasibility of using the cucurbituril host molecule as a probable actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed, and based on binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structures, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in the hexavalent and pentavalent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of the actinyls, whereas fluorination decreases them relative to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation states of the three actinyls.

  8. An open-shell restricted Hartree-Fock perturbation theory based on symmetric spin orbitals

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Jayatilaka, Dylan

    1993-01-01

    A new open-shell perturbation theory is formulated in terms of symmetric spin orbitals. Only one set of spatial orbitals is required, thereby reducing the number of independent coefficients in the perturbed wavefunctions. For second order, the computational cost is shown to be similar to a closed-shell calculation. This formalism is therefore more efficient than the recently developed RMP, ROMP or RMP-MBPT theories. The perturbation theory described herein was designed to have a close correspondence with our recently proposed coupled-cluster theory based on symmetric spin orbitals. The first-order wavefunction contains contributions from only doubly excited determinants. Equilibrium structures and vibrational frequencies determined from second-order perturbation theory are presented for OH, NH, CH, O2, NH2 and CH2.
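For orientation, any such second-order theory specializes the generic Rayleigh-Schrödinger second-order energy; since the first-order wavefunction here contains only doubly excited determinants, the sum runs over those determinants (the working equations in the symmetric-spin-orbital basis differ in detail):

```latex
E^{(2)} \;=\; \sum_{k \neq 0} \frac{\left|\langle \Phi_0 \,|\, \hat{V} \,|\, \Phi_k \rangle\right|^{2}}{E_0^{(0)} - E_k^{(0)}}
```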

  9. A Gene Selection Method for Microarray Data Based on Binary PSO Encoding Gene-to-Class Sensitivity Information.

    PubMed

    Han, Fei; Yang, Chun; Wu, Ya-Qi; Zhu, Jian-Sheng; Ling, Qing-Hua; Song, Yu-Qing; Huang, De-Shuang

    2017-01-01

    Traditional gene selection methods for microarray data mainly considered features' relevance by evaluating their utility for achieving accurate prediction or by exploiting data variance and distribution, and the selected genes were usually poorly explicable. To improve the interpretability of the selected genes as well as prediction accuracy, an improved gene selection method based on binary particle swarm optimization (BPSO) and prior information is proposed in this paper. In the proposed method, BPSO encoding gene-to-class sensitivity (GCS) information is used to perform gene selection. The GCS information, extracted from the samples by an extreme learning machine (ELM), is encoded into the selection process in four aspects: initializing the particles, updating the particles, modifying the maximum velocity, and adaptively adopting the mutation operation. Constrained by the GCS information, the new method can select functional gene subsets that are significantly sensitive to the samples' classes. With the few discriminative genes selected by the proposed method, ELM, K-nearest neighbor and support vector machine classifiers achieve much higher prediction accuracy on five public microarray datasets, which in turn verifies the efficiency and effectiveness of the proposed gene selection method.
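The BPSO machinery underlying the method can be sketched generically. Below is the standard sigmoid-transfer binary PSO update (Kennedy and Eberhart's formulation), not the paper's GCS-augmented variant; the bit vectors and coefficients are toy values.

```python
import math
import random

def bpso_step(pos, vel, pbest, gbest, rng, w=0.7, c1=1.5, c2=1.5, vmax=4.0):
    """One velocity/position update of binary PSO: each bit switches on
    with probability sigmoid(velocity)."""
    new_pos, new_vel = [], []
    for x, v, p, g in zip(pos, vel, pbest, gbest):
        v = w * v + c1 * rng.random() * (p - x) + c2 * rng.random() * (g - x)
        v = max(-vmax, min(vmax, v))       # clamp velocity to [-vmax, vmax]
        prob = 1.0 / (1.0 + math.exp(-v))  # sigmoid transfer function
        new_pos.append(1 if rng.random() < prob else 0)
        new_vel.append(v)
    return new_pos, new_vel

rng = random.Random(42)
pos = [0, 1, 0, 1, 0]      # current gene-selection mask (toy)
vel = [0.0] * 5
pbest = [1, 1, 0, 0, 1]    # particle's best-known selection
gbest = [1, 0, 0, 0, 1]    # swarm's best-known selection
pos, vel = bpso_step(pos, vel, pbest, gbest, rng)
print(pos)  # a new 0/1 gene-selection mask
```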

  10. Scenario-based design: A method for connecting information system design with public health operations and emergency management

    PubMed Central

    Reeder, Blaine; Turner, Anne M

    2011-01-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Methods: Using semi-structured interviews, we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Results: Interview analysis identified twenty-five information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create twenty-five scenarios of use and a public health manager persona. The scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. Conclusion: The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers in routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and for validating an information system design. PMID:21807120

  11. A new discrimination method for the Concealed Information Test using pretest data and within-individual comparisons.

    PubMed

    Matsuda, Izumi; Hirota, Akihisa; Ogawa, Tokihiro; Takasawa, Noriyoshi; Shigemasu, Kazuo

    2006-08-01

    A latent class discrimination method is proposed for analyzing autonomic responses on the Concealed Information Test (CIT). Because there are significant individual differences in autonomic responses, individual response patterns are estimated on a pretest. An appropriate discriminant formula for each individual's response pattern is then applied to the CIT results. The probability that the individual is concealing information is calculated by comparing the discriminant formula value of the crime-related item to those of the non-crime-related items. In an experimental demonstration applying the three methods to the same data set, the discrimination performance of the latent class discrimination method was higher than that of either the logistic regression method or the discriminant analysis method.
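One simplified way to turn per-item discriminant values into a within-individual probability is a softmax comparison of the probe item against the irrelevant alternatives. This sketch is only a stand-in for the paper's latent class method (which also models pretest-estimated response patterns); the item names and scores are hypothetical.

```python
import math

def concealment_probability(scores: dict, probe: str) -> float:
    """Probability-like weight that the probe item stands out from the
    irrelevant alternatives, via a softmax over discriminant values."""
    z = {item: math.exp(s) for item, s in scores.items()}
    return z[probe] / sum(z.values())

# Hypothetical discriminant values for one examinee's CIT items.
scores = {"knife": 2.1, "gun": 0.3, "rope": 0.1, "bat": -0.2}
p = concealment_probability(scores, "knife")
print(round(p, 2))  # 0.71
```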

  12. Strategies and methods for aligning current and best medical practices. The role of information technologies.

    PubMed Central

    Schneider, E C; Eisenberg, J M

    1998-01-01

    Rapid change in American medicine requires that physicians adjust established behaviors and acquire new skills. In this article, we address three questions: What do we know about how to change physicians' practices? How can physicians take advantage of new and evolving information technologies that are likely to have an impact on the future practice of medicine? and What strategic educational interventions will best enable physicians to show competencies in information management and readiness to change practice? We outline four guiding principles for incorporating information systems tools into both medical education and practice, and we make eight recommendations for the development of a new medical school curriculum. This curriculum will produce a future medical practitioner who is capable of using information technologies to systematically measure practice performance, appropriateness, and effectiveness while updating knowledge efficiently. PMID:9614787

  13. [Visualization and analysis of drug information on adverse reactions using data mining method, and its clinical application].

    PubMed

    Kawakami, Junko

    2014-01-01

    Sources of drug information such as package inserts (PIs) and interview forms (IFs), as well as existing drug information databases, provide primarily document-based and numerical information. For this reason, it is not easy to obtain a complete picture of the information concerning many drugs with similar effects or to understand differences among drugs. The visualization of drug information may help provide a large amount of information in a short time, relieve the burden on medical workers, facilitate a comprehensive understanding and comparison of drugs, and contribute to improvements in patients' QOL. At our department, we are developing an approach that converts side-effect information obtained from the PIs of many drugs with similar effects into visual maps reflecting the data structure, through competitive learning with the self-organizing map (SOM) technique of Kohonen, a powerful method for pattern recognition. The aim is to facilitate the grasping of all available information and of differences among drugs, and to anticipate the appearance of side effects; we are also evaluating the possibility of its clinical application. In this paper, this approach is described using the examples of antibiotics, antihypertensive drugs, and diabetes drugs.

  14. Extending Value of Information Methods to Include the Co-Net Benefits of Earth Observations

    NASA Astrophysics Data System (ADS)

    Macauley, M.

    2015-12-01

    The widening relevance of Earth observations information across the spectrum of natural and environmental resources markedly enhances the value of these observations. An example is observations of forest extent, species composition, health, and change; this information can help in assessing carbon sequestration, biodiversity and habitat, watershed management, fuelwood potential, and other ecosystem services, as well as inform the opportunity cost of forest removal for alternative land uses such as agriculture, pasture, or development. These "stacked" indicators, or co-net benefits, add significant value to Earth observations. In part because of reliance on case studies, much previous research about the value of information from Earth observations has assessed individual applications rather than aggregating across applications, thus tending to undervalue the observations. Aggregating across applications is difficult, however, because it requires common units of measurement; controlling for spatial, spectral, and temporal attributes of the observations; and consistent application of value of information techniques. This paper will discuss general principles of co-net benefit aggregation and illustrate its application to attributing value to Earth observations.

  15. Robust coastal region detection method using image segmentation and sensor LOS information for infrared search and track

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Sun, Sun-Gu; Kwon, Soon; Kim, Kyung-Tae

    2013-05-01

    This paper presents a novel coastal region detection method for infrared search and track. Coastal region detection is critical to homeland security and ship defense. Detected coastal region information can be used in the design of target detectors, for example in moving target detection and threshold setting. We detect coastal regions robustly by combining infrared image segmentation with sensor line-of-sight (LOS) information. The K-means-based image segmentation provides initial region information, and the sensor LOS information predicts the approximate horizon location in the image. The evidence of a coastal region is confirmed by contour extraction results. Experimental results on remote and near coasts validate the robustness of the proposed coastal region detector.
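The K-means segmentation step can be illustrated in one dimension by clustering hypothetical per-row mean intensities and reading off the candidate sea/sky transition. The real method works on 2-D imagery and fuses the LOS prior; everything here (row intensities, cluster count) is a toy sketch.

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Plain 1-D k-means: returns (centers, labels)."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return centers, labels

# Hypothetical per-row mean intensities: dark sea rows, then bright sky rows.
rows = [12, 14, 13, 15, 11, 80, 84, 82, 79, 81]
centers, labels = kmeans_1d(rows, k=2)
# The first row whose label differs from row 0's marks the candidate
# transition; a sensor-LOS prior would confirm or reject it.
horizon = next(i for i, l in enumerate(labels) if l != labels[0])
print(horizon)  # 5
```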

  16. 14 CFR 39.21 - Where can I get information about FAA-approved alternative methods of compliance?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Where can I get information about FAA-approved alternative methods of compliance? 39.21 Section 39.21 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS DIRECTIVES § 39.21 Where can I...

  17. Sexual Health Information Seeking Online: A Mixed-Methods Study among Lesbian, Gay, Bisexual, and Transgender Young People

    ERIC Educational Resources Information Center

    Magee, Joshua C.; Bigelow, Louisa; DeHaan, Samantha; Mustanski, Brian S.

    2012-01-01

    The current study used a mixed-methods approach to investigate the positive and negative aspects of Internet use for sexual health information among lesbian, gay, bisexual, and transgender (LGBT) young people. A diverse community sample of 32 LGBT young people (aged 16-24 years) completed qualitative interviews focusing on how, where, and when…

  18. Unpacking (In)formal Learning in an Academic Development Programme: A Mixed-Method Social Network Perspective

    ERIC Educational Resources Information Center

    Rienties, Bart; Hosein, Anesa

    2015-01-01

    How and with whom academics develop and maintain formal and informal networks for reflecting on their teaching practice has received limited attention even though academic development (AD) programmes have become an almost ubiquitous feature of higher education. The primary goal of this mixed-method study is to unpack how 114 academics in an AD…

  19. Following Experts at Work in Their Own Information Spaces: Using Observational Methods To Develop Tools for the Digital Library.

    ERIC Educational Resources Information Center

    Gorman, Paul; Lavelle, Mary; Delcambre, Lois; Maier, David

    2002-01-01

    Offers an overview of the authors' experience using several observational methods to better understand one class of users, expert clinicians treating patients in hospital settings. Shows the evolution of understanding of the users and their information-handling tasks based on observations made in the field by a multidisciplinary research team, and…

  20. Study on the digitized and quantified evaluating method for super information characteristics of herbal preparation by infrared spectrum fingerprints

    PubMed Central

    Li, Lifeng; Li, Yanfei; Song, Aihua

    2014-01-01

    This paper aims to establish the infrared spectrum fingerprint (IRFP) in the absorbing region of 4,000-400 cm-1 and the first-derivative infrared spectrum fingerprint (d-IRFP) of the ginkgo tablet (GT), and to set up a digitized and quantified evaluation method for the super information characteristics of IRFPs of traditional Chinese medicine (TCM), consisting of the IRFP index, information index, fluctuation index, information fluctuation index and the quantified infrared fingerprint method (QIFM). The direct tabletting method was applied during collection of the IRFPs of 14 batches of GTs on a Fourier transform infrared spectrometer. In terms of the digitized features, the QIFM and similarity analysis of the d-IRFPs, samples S4 and S7 were evaluated as suspected outliers, the qualities of S1, S2, S6 and S12 were less good, and the rest were relatively good. The assessment approach makes the expression and processing of superposed information in IRFPs of TCM digitized, simple and effective. Moreover, the application of the QIFM based on IR techniques yields an approach that can test the total chemical contents in the complex system of a TCM rapidly, simply and accurately. Finally, the quantitative and digitized infrared fingerprinting method was established as a novel approach to evaluating the quality of TCM. PMID:25405152

  1. Graph-Based Weakly-Supervised Methods for Information Extraction & Integration

    ERIC Educational Resources Information Center

    Talukdar, Partha Pratim

    2010-01-01

    The variety and complexity of potentially-related data resources available for querying--webpages, databases, data warehouses--has been growing ever more rapidly. There is a growing need to pose integrative queries "across" multiple such sources, exploiting foreign keys and other means of interlinking data to merge information from diverse…

  2. Applying Information-Retrieval Methods to Software Reuse: A Case Study.

    ERIC Educational Resources Information Center

    Stierna, Eric J.; Rowe, Neil C.

    2003-01-01

    Discusses reuse of existing software for new purposes as a key aspect of efficient software engineering by matching formal written requirements used to define the new and the old software. Explores two matching methodologies that use information retrieval techniques and describes test results from a comparison of two military systems. (Author/LRW)

  3. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... environment. The EPA is developing guidelines for the assessment of cumulative risk as defined and..., characterization, and possible quantification of the combined risks to health or the environment from multiple... development of regulations and permits. This notice solicits information and citations pertaining...

  4. User-Centered Perspective of Information Retrieval Research and Analysis Methods.

    ERIC Educational Resources Information Center

    Sugar, William

    1995-01-01

    Reviews information retrieval (IR) studies since 1986 from the user's perspective. Identifies two main approaches that advocate user-centered design theory: (1) the cognitive approach; and (2) the holistic approach. Also explores other approaches--systems thinking/action research and usability techniques that may have potential for IR research and…

  5. Interactive Visualization Systems and Data Integration Methods for Supporting Discovery in Collections of Scientific Information

    DTIC Science & Technology

    2011-05-01

    Visualization, and Bibliometrics: Multimedia Information Retrieval" and throughout the duration of my studies. He was always available with an open... door for discussions. Xia Lin guided me on my work with the ipl2 and provided some of the foundational contributions to the field of Bibliometrics... PageRank

  6. Relevance Thresholds: A Multi-stage Predictive Method of How Users Evaluate Information.

    ERIC Educational Resources Information Center

    Greisdorf, Howard

    2003-01-01

    Examines end-user judgment and evaluation behavior during information retrieval (IR) system interactions and extends previous research surrounding relevance as a key construct for representing the value end-users ascribe to items retrieved from IR systems. The self-reporting worksheet is appended. (Author/AEF)

  7. 78 FR 34427 - 2012 Tax Information for Use In The Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ... Transp., Inc. v. STB, 568 F.3d 236 (D.C. Cir. 2009), and vacated in part on reh'g, CSX Transp., Inc. v. STB, 584 F.3d 1076 (D.C. Cir. 2009). In Annual Submission of Tax Information for Use in the...

  8. Organized Content Technique (OCT): A Method for Presenting Information in Education and Training.

    ERIC Educational Resources Information Center

    Wright, Elizabeth E; Pyatte, Jeff A.

    1983-01-01

    Effective display of organized content is demonstrated through the Organized Content Technique (OCT), which synthesizes, collates, and organizes information into a sound conceptual configuration, including layout, display, typography, descriptors, and style. The result is a content page that has entity and unity and is pleasing and effective with…

  9. A Method for Rating Computer-Based Career Information Delivery Systems.

    ERIC Educational Resources Information Center

    Bloch, Deborah Perlmutter; Kinnison, Joyce Ford

    1989-01-01

    Developed a three-part rating system for computer-based career information delivery systems in the areas of comprehensiveness, accuracy, and effectiveness. Used system to rate five popular computer-based systems (C-LECT, CHOICES, CIS, DISCOVER, and GIS). Four systems were evaluated as being very similar, with CIS receiving highest scores.…

  10. Communication and Research Skills in the Information Systems Curriculum: A Method of Assessment

    ERIC Educational Resources Information Center

    Lazarony, Paul J.; Driscoll, Donna A.

    2010-01-01

    Assessment of learning goals has become the norm in business programs in higher education across the country. This paper offers a methodology for the assessment of both communication skills and research skills within a curriculum of the Bachelor of Science in Information Systems Program. Program level learning goals assessed in this paper are: (1)…

  11. 77 FR 34124 - 2011 Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... Transp., Inc. v. STB, 568 F.3d 236 (DC Cir. 2009), and vacated in part on reh'g, CSX Transp., Inc. v. STB, 584 F.3d 1076 (DC Cir. 2009). In Annual Submission of Tax Information for Use in the Revenue...

  12. A Hierarchy Fuzzy MCDM Method for Studying Electronic Marketing Strategies in the Information Service Industry.

    ERIC Educational Resources Information Center

    Tang, Michael T.; Tzeng, Gwo-Hshiung

    In this paper, the impacts of Electronic Commerce (EC) on the international marketing strategies of information service industries are studied. In seeking to blend humanistic concerns in this research with technological development by addressing challenges for deterministic attitudes, the paper examines critical environmental factors relevant to…

  13. Electronic and Courier Methods of Information Dissemination: A Test of Accuracy.

    ERIC Educational Resources Information Center

    DeWine, Sue; And Others

    As part of a larger endeavor to evaluate the impact of communication technology on organizations, this study assesses the accuracy of information diffusion via electronic-mail and courier-mail systems in two large organizations which have implemented electronic-mail systems in the last three years. Data were obtained through the use of…

  14. Methods of extending signatures and training without ground information. [data processing, pattern recognition

    NASA Technical Reports Server (NTRS)

    Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.

    1975-01-01

    Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large-area wheat surveys. Two methods, ASC and MASC, were developed. Two further methods, Ratio and RADIFF, previously used with aircraft data, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.

  15. Novel classification method for remote sensing images based on information entropy discretization algorithm and vector space model

    NASA Astrophysics Data System (ADS)

    Xie, Li; Li, Guangyao; Xiao, Mang; Peng, Lei

    2016-04-01

    Various kinds of remote sensing image classification algorithms have been developed to adapt to the rapid growth of remote sensing data. Conventional methods typically have restrictions in either classification accuracy or computational efficiency. To overcome these difficulties, a new solution for remote sensing image classification is presented in this study. A discretization algorithm based on information entropy is applied to extract features from the data set, and a vector space model (VSM) is employed as the feature representation algorithm. Because of the simple structure of the feature space, the training rate is accelerated. The performance of the proposed method is compared with that of two other algorithms: the back propagation neural network (BPNN) method and the ant colony optimization (ACO) method. Experimental results confirm that the proposed method is superior to the other algorithms in terms of classification accuracy and computational efficiency.
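The entropy-based discretization idea can be sketched as a search for the cut point that minimizes the weighted entropy of the two induced partitions (the criterion behind classic entropy discretization). This is a minimal stand-in for the paper's algorithm; the reflectance values and class labels are invented.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Cut point minimizing the weighted entropy of the two partitions."""
    pairs = sorted(zip(values, labels))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint between neighbors
        if w < best[0]:
            best = (w, cut)
    return best[1]

# Hypothetical band reflectances with class labels ("water" vs "soil").
vals = [0.10, 0.12, 0.15, 0.40, 0.45, 0.50]
labs = ["water", "water", "water", "soil", "soil", "soil"]
print(best_cut(vals, labs))  # 0.275 splits the two classes cleanly
```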

  16. Physics and computer architecture informed improvements to the Implicit Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Long, Alex Roberts

    The Implicit Monte Carlo (IMC) method has been a standard method for thermal radiative transfer for the past 40 years. In this time, the hydrodynamics methods that are coupled to IMC have evolved and improved, as have the supercomputers used to run large simulations with IMC. Several modern hydrodynamics methods use unstructured non-orthogonal meshes and high-order spatial discretizations, whereas the IMC method has been used primarily with simple Cartesian meshes and always with a first-order spatial discretization. Supercomputers are now made up of compute nodes with a large number of cores, and current IMC parallel methods have significant problems with load imbalance. To utilize many-core systems, algorithms must move beyond simple spatial-decomposition parallelism. To make IMC better suited for large-scale multiphysics simulations in high energy density physics, new spatial discretizations and parallel strategies are needed. Several modifications are made to the IMC method to facilitate running on node-centered, unstructured tetrahedral meshes. These modifications produce results that converge to the expected solution under mesh refinement. A new finite element IMC method is also explored on these meshes, which offers a simulation runtime benefit but does not perform correctly in the diffusion limit. A parallel algorithm that utilizes on-node parallelism and respects memory hierarchies is studied; this method scales almost linearly when using the physical cores on a node and benefits from multiple threads per core. A multi-compute-node algorithm for domain-decomposed IMC that passes mesh data instead of particles is explored as a means of solving load balance issues; it scales better than the particle-passing method on highly scattering problems with short time steps.

  17. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  18. An Information Retrieval Model Based on Vector Space Method by Supervised Learning.

    ERIC Educational Resources Information Center

    Tai, Xiaoying; Ren, Fuji; Kita, Kenji

    2002-01-01

    Proposes a method to improve retrieval performance of the vector space model by using users' relevance feedback. Discusses the use of singular value decomposition and the latent semantic indexing model, and reports the results of two experiments that show the effectiveness of the proposed method. (Author/LRW)
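A classic instance of vector-space relevance feedback is the Rocchio update, which moves the query vector toward relevant documents and away from non-relevant ones. The sketch below is the textbook formula, not necessarily the paper's supervised-learning variant; the term weights are toy values.

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio update: q' = alpha*q + beta*centroid(rel) - gamma*centroid(nonrel)."""
    dims = len(query)

    def centroid(docs):
        if not docs:
            return [0.0] * dims
        return [sum(d[i] for d in docs) / len(docs) for i in range(dims)]

    rel, nonrel = centroid(relevant), centroid(nonrelevant)
    return [alpha * q + beta * r - gamma * n for q, r, n in zip(query, rel, nonrel)]

query = [1.0, 0.0, 0.5]                        # toy term weights
relevant = [[0.8, 0.1, 0.9], [0.6, 0.0, 0.7]]  # judged relevant by the user
nonrelevant = [[0.0, 1.0, 0.1]]                # judged non-relevant
updated = rocchio(query, relevant, nonrelevant)
print([round(w, 3) for w in updated])  # weight on term 2 is pushed down
```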

  19. Bootstrap rank-ordered conditional mutual information (broCMI): A nonlinear input variable selection method for water resources modeling

    NASA Astrophysics Data System (ADS)

    Quilty, John; Adamowski, Jan; Khalil, Bahaa; Rathinasamy, Maheswaran

    2016-03-01

    The input variable selection problem has recently garnered much interest in the time series modeling community, especially within water resources applications, demonstrating that information-theoretic (nonlinear) input variable selection algorithms such as partial mutual information (PMI) selection (PMIS) provide an improved representation of the modeled process when compared to linear alternatives such as partial correlation input selection (PCIS). PMIS is a popular algorithm for water resources modeling problems considering nonlinear input variable selection; however, this method requires the specification of two nonlinear regression models, each with parametric settings that greatly influence the selected input variables. Other attempts to develop input variable selection methods using conditional mutual information (CMI) (an analog to PMI) have been formulated under different parametric assumptions, such as k-nearest-neighbor (KNN) statistics or kernel density estimates (KDE). In this paper, we introduce a new input variable selection method based on CMI that uses a nonparametric multivariate continuous probability estimator based on Edgeworth approximations (EA). We improve the EA method by considering the uncertainty in the input variable selection procedure, introducing a bootstrap resampling procedure that uses rank statistics to order the selected input sets; we name our proposed method bootstrap rank-ordered CMI (broCMI). We demonstrate the superior performance of broCMI when compared to CMI-based alternatives (EA, KDE, and KNN), PMIS, and PCIS input variable selection algorithms on a set of seven synthetic test problems and a real-world urban water demand (UWD) forecasting experiment in Ottawa, Canada.
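A minimal stand-in for the information-theoretic ingredient is a binned (histogram) estimate of mutual information between a candidate input and the target. The paper's broCMI instead uses Edgeworth-approximation CMI with bootstrap rank ordering; the series below are invented toy data.

```python
import math
from collections import Counter

def mutual_information(x, y, bins=4):
    """Histogram estimate of I(X;Y) in bits for two equal-length sequences."""
    def discretize(v):
        lo, hi = min(v), max(v)
        return [min(bins - 1, int((u - lo) / (hi - lo + 1e-12) * bins)) for u in v]
    xd, yd = discretize(x), discretize(y)
    n = len(x)
    px, py, pxy = Counter(xd), Counter(yd), Counter(zip(xd, yd))
    # I(X;Y) = sum p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
    return sum(c / n * math.log2(c / n / (px[a] / n * py[b] / n))
               for (a, b), c in pxy.items())

# Hypothetical candidate inputs: x1 drives the target, x2 is only weakly related.
target = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
x1 = [t * 2 for t in target]                    # informative candidate
x2 = [0.5, 0.1, 0.4, 0.2, 0.5, 0.1, 0.4, 0.2]  # weakly related candidate
print(mutual_information(x1, target) > mutual_information(x2, target))  # True
```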

  20. The cognitive interview method of conducting police interviews: eliciting extensive information and promoting therapeutic jurisprudence.

    PubMed

    Fisher, Ronald P; Geiselman, R Edward

    2010-01-01

    Police officers receive little or no training in conducting interviews with cooperative witnesses, and as a result they conduct interviews poorly, eliciting less information than is available and providing little support to help victims overcome psychological problems that may have arisen from the crime. We analyze the components of a typical police interview that limit the amount of information witnesses communicate and that militate against victims' overcoming psychological problems. We then describe an alternative interviewing protocol, the Cognitive Interview, which enhances witness recollection and also likely contributes to victims' well-being. The component elements of the Cognitive Interview are described, with emphasis on those elements that likely promote better witness recollection and help victims' psychological health.

  1. Method and apparatus for optimizing a train trip using signal information

    DOEpatents

    Kumar, Ajith Kuttannair; Daum, Wolfgang; Otsubo, Tom; Hershey, John Erik; Hess, Gerald James

    2014-06-10

    A system is provided for operating a railway network including a first railway vehicle during a trip along track segments. The system includes a first element for determining travel parameters of the first railway vehicle; a second element for determining travel parameters of a second railway vehicle relative to the track segments to be traversed by the first vehicle during the trip; a processor for receiving information from the first and second elements and for determining a relationship between occupation of a track segment by the second vehicle and later occupation of the same track segment by the first vehicle; and an algorithm embodied within the processor, with access to the information, to create a trip plan that determines a speed trajectory for the first vehicle. The speed trajectory is responsive to the relationship and further in accordance with one or more operational criteria for the first vehicle.
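
    The core constraint (do not enter or leave a segment before the leading vehicle has cleared it, plus a buffer) can be sketched as a simple per-segment speed plan. This is an illustrative toy, not the patented algorithm; all quantities are assumed values.

```python
# Segment lengths (km), line speed limit (km/h), times (h) at which the
# leading train clears each segment, and a required time buffer (h).
# All values are illustrative.
lengths  = [5.0, 8.0, 6.0]
v_max    = 80.0
clear_at = [0.0, 0.25, 0.40]
headway  = 0.05

t = 0.0
speeds = []  # planned constant speed per segment (km/h)
for i, L in enumerate(lengths):
    t = max(t, clear_at[i] + headway)        # wait until the segment is free
    t_exit = t + L / v_max                   # fastest permissible traversal
    if i + 1 < len(lengths):                 # slow down so we do not reach the
        t_exit = max(t_exit, clear_at[i + 1] + headway)  # next segment too early
    speeds.append(L / (t_exit - t))
    t = t_exit
```

    The plan slows the follower in early segments rather than forcing a stop, then releases it to line speed once the leader is far enough ahead.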

  2. [Study of spectrum preprocessing method when applying the characteristic spectrum linear inversion modeling to extract the mineral information].

    PubMed

    Wang, Fei; Lin, Qi-zhong; Wang, Qin-jun; Li, Shuai

    2011-05-01

    Rapid identification of minerals in the field is crucial in remote sensing geology and mineral exploration. Characteristic spectrum linear inversion modeling (CSLM) can extract mineral information quickly in field studies. However, the authors found significant differences among the results of the model when different kinds of spectra of the same sample were used. The present paper mainly studied the continuum-based fast Fourier transform (CFFT) method and the characteristic spectrum linear inversion modeling. On the one hand, the authors obtained the optimal parameter setting of the CFFT method when applying it to rock samples: a CFFT low-pass cutoff of 150 Hz. On the other hand, through evaluation of the CSLM results obtained with different spectra, the authors found that ASD spectra denoised with the CFFT method provided better results when used to extract mineral information in the field.
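
    Generic FFT low-pass denoising of a sampled spectrum can be sketched as follows. The synthetic spectrum and the interpretation of the 150 Hz cutoff as a coefficient index are assumptions made for the example, and the continuum-removal step of CFFT is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024                                   # number of spectral bands
x = np.linspace(0, 1, n)
clean = np.exp(-((x - 0.5) ** 2) / 0.01)   # a smooth absorption-like feature
noisy = clean + 0.05 * rng.normal(size=n)  # measured spectrum with noise

def fft_lowpass(signal, cutoff):
    """Zero all FFT coefficients at or above `cutoff` (index units) and invert."""
    F = np.fft.rfft(signal)
    F[cutoff:] = 0.0
    return np.fft.irfft(F, n=len(signal))

denoised = fft_lowpass(noisy, cutoff=150)
err_noisy    = np.sqrt(np.mean((noisy - clean) ** 2))
err_denoised = np.sqrt(np.mean((denoised - clean) ** 2))
```

    Because the smooth feature lives in the low-frequency coefficients, truncation removes most of the noise power while barely distorting the signal.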

  3. Information Theoretical Methods as Discerning Quantifiers of the Equations of State of Neutron Stars

    NASA Astrophysics Data System (ADS)

    Alvares de Souza, Rodrigo; de Avellar, Marcio G. B.; Horvath, Jorge E.; Paret, D. M.

    2016-04-01

    In this work we use the statistical measures of information entropy, disequilibrium and complexity to discriminate different approaches and parametrizations for different equations of state for quark stars. We confirm the usefulness of such quantities to quantify the role of interactions in such stars. We find that within this approach, a quark matter equation of state such as SU(2) NJL with vectorial coupling and phase transition is slightly favoured and deserves deeper studies.
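
    The three statistical measures can be illustrated on a discrete distribution. The normalized Shannon entropy H, disequilibrium D, and LMC-style complexity C = H·D below are a toy analogue of the quantities applied to the stellar equations of state; the example distributions are assumptions.

```python
import numpy as np

def info_measures(p):
    """Shannon entropy H (normalized to [0, 1]), disequilibrium D (distance
    from equiprobability), and complexity C = H * D for a discrete pmf."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    N = len(p)
    nz = p[p > 0]
    H = -np.sum(nz * np.log(nz)) / np.log(N)
    D = np.sum((p - 1.0 / N) ** 2)
    return H, D, H * D

H_u, D_u, C_u = info_measures([0.25, 0.25, 0.25, 0.25])  # uniform: max H, zero D
H_p, D_p, C_p = info_measures([0.97, 0.01, 0.01, 0.01])  # peaked: low H, high D
```

    A fully ordered (peaked) or fully disordered (uniform) system has zero complexity; intermediate distributions maximize C.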

  4. Track-weighted imaging methods: extracting information from a streamlines tractogram.

    PubMed

    Calamante, Fernando

    2017-02-08

    A whole-brain streamline dataset (a so-called tractogram) generated from diffusion MRI provides a wealth of information regarding structural connectivity in the brain. Besides visualisation strategies, a number of post-processing approaches have been proposed to extract more detailed information from the tractogram. One such approach is based on exploiting the information contained in the tractogram to generate track-weighted (TW) images. In the track-weighted imaging (TWI) approach, a very large number of streamlines are often generated throughout the brain, and an image is then computed based on properties of the streamlines themselves (e.g. based on the number of streamlines in each voxel, or their average length), or based on the values of an associated image (e.g. a diffusion anisotropy map, a T2 map) measured at the coordinates of the streamlines. This review article describes various approaches used to generate TW images and discusses the flexible formalism that TWI provides to generate a range of images with very different contrast, as well as the super-resolution properties of the resulting images. It also explains how this approach provides a powerful means to study structural and functional connectivity simultaneously. Finally, a number of key issues for its practical implementation are discussed.
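
    The streamline-count variant of TWI can be sketched in a few lines. The toy streamlines and grid are assumptions; real implementations handle voxel traversal and interpolation far more carefully.

```python
import numpy as np

# Toy streamlines: each is an (N_points, 3) array of voxel-space coordinates.
streamlines = [
    np.array([[0.2, 0.5, 0.5], [1.3, 0.6, 0.5], [2.4, 0.7, 0.5]]),
    np.array([[0.5, 0.5, 0.5], [0.6, 1.5, 0.5]]),
]
shape = (4, 4, 4)

def tw_count_map(streamlines, shape):
    """Track-weighted map: number of streamlines visiting each voxel."""
    tw = np.zeros(shape)
    for sl in streamlines:
        vox = np.unique(np.floor(sl).astype(int), axis=0)  # each voxel once
        for i, j, k in vox:
            tw[i, j, k] += 1
    return tw

tw = tw_count_map(streamlines, shape)
```

    Replacing the `+= 1` with a value sampled from an associated map (e.g. an anisotropy image) at each streamline point yields the other TW contrasts described in the review.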

  5. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    PubMed Central

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and emergency logistics problems are receiving increasing attention. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome the drawbacks of incomplete information and uncertain travel times, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that vehicles follow only the optimal path from the emergency logistics center to the affected point, and is solved using the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946

  6. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    PubMed

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and emergency logistics problems are receiving increasing attention. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome the drawbacks of incomplete information and uncertain travel times, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that vehicles follow only the optimal path from the emergency logistics center to the affected point, and is solved using the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method.
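
    The grey-theory path-evaluation step can be sketched with a standard grey relational analysis. The candidate paths, criteria, and values below are illustrative assumptions, not the paper's data, and the per-criterion normalization usually applied first is omitted for brevity.

```python
import numpy as np

# Candidate paths described by (travel time, damage risk, congestion);
# all criteria are smaller-is-better. Values are illustrative.
paths = {
    "A": np.array([4.0, 0.2, 0.5]),
    "B": np.array([3.0, 0.6, 0.7]),
    "C": np.array([5.0, 0.1, 0.3]),
}
X = np.array(list(paths.values()))
ref = X.min(axis=0)                      # ideal reference: best value per criterion

# Grey relational coefficients and grades (rho = 0.5 is the customary choice)
rho = 0.5
delta = np.abs(X - ref)
gmin, gmax = delta.min(), delta.max()
coef = (gmin + rho * gmax) / (delta + rho * gmax)
grades = coef.mean(axis=1)               # grey relational grade per path
best = list(paths)[int(np.argmax(grades))]
```

    The path with the highest grey relational grade (closest overall to the ideal reference) is selected as the reliable, optimal route.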

  7. An Internet compendium of analytical methods and spectroscopic information for monomers and additives used in food packaging plastics.

    PubMed

    Gilbert, J; Simoneau, C; Cote, D; Boenke, A

    2000-10-01

    An internet website (http://cpf.jrc.it/smt/) has been produced to disseminate methods of analysis and supporting spectroscopic information on monomers and additives used in food contact materials (principally packaging). The site, which is aimed primarily at assisting food control laboratories in the European Union, contains analytical information on monomers, starting substances and additives used in the manufacture of plastics materials. A searchable index is provided giving PM and CAS numbers for each of 255 substances. For each substance, a data sheet gives regulatory information, chemical structures, physico-chemical information and background information on the use of the substance in particular plastics and on its food packaging applications. Infra-red and mass spectra are provided for the monomers and starting substances (155 compounds) and for the additives (100 compounds); additionally, proton NMR spectra are available for about 50% of the entries. Where analytical methods have been developed for determining these substances as residual amounts in plastics or as trace amounts in food simulants, these methods are also on the website. All information is provided in portable document format (PDF), which means that high quality copies can be readily printed using the freely available Adobe Acrobat Reader software. The website will in future be maintained and updated by the European Commission's Joint Research Centre (JRC) as new substances are authorized for use by the European Commission (DG-ENTR, formerly DGIII). Where analytical laboratories (food control or other) require reference substances, these can be obtained free of charge from a reference collection housed at the JRC and maintained in conjunction with this website compendium.

  8. Increasing Cervical Cancer Awareness and Screening in Jamaica: Effectiveness of a Theory-Based Educational Intervention

    PubMed Central

    Coronado Interis, Evelyn; Anakwenze, Chidinma P.; Aung, Maug; Jolly, Pauline E.

    2015-01-01

    Despite declines in cervical cancer mortality in developed countries, cervical cancer incidence and mortality rates remain high in Jamaica due to low levels of screening. Effective interventions are needed to decrease barriers to preventive behaviors and increase adoption of behaviors and services to improve prospects of survival. We enrolled 225 women attending health facilities in an intervention consisting of a pre-test, educational presentation and post-test. The questionnaires assessed attitudes, knowledge, risk factors, and symptoms of cervical cancer among women. Changes in knowledge and intention to screen were assessed using paired t-tests and tests for correlated proportions. Participants were followed approximately six months post-intervention to determine cervical cancer screening rates. We found statistically significant increases from pre-test to post-test in the percentage of questions correctly answered and in participants’ intention to screen for cervical cancer. The greatest improvements were observed in responses to questions on knowledge, symptoms and prevention, with some items increasing up to 62% from pre-test to post-test. Of the 123 women reached for follow-up, 50 (40.7%) screened for cervical cancer. This theory-based education intervention significantly increased knowledge of and intention to screen for cervical cancer, and may be replicated in similar settings to promote awareness and increase screening rates. PMID:26703641

  9. Increasing Cervical Cancer Awareness and Screening in Jamaica: Effectiveness of a Theory-Based Educational Intervention.

    PubMed

    Coronado Interis, Evelyn; Anakwenze, Chidinma P; Aung, Maug; Jolly, Pauline E

    2015-12-22

    Despite declines in cervical cancer mortality in developed countries, cervical cancer incidence and mortality rates remain high in Jamaica due to low levels of screening. Effective interventions are needed to decrease barriers to preventive behaviors and increase adoption of behaviors and services to improve prospects of survival. We enrolled 225 women attending health facilities in an intervention consisting of a pre-test, educational presentation and post-test. The questionnaires assessed attitudes, knowledge, risk factors, and symptoms of cervical cancer among women. Changes in knowledge and intention to screen were assessed using paired t-tests and tests for correlated proportions. Participants were followed approximately six months post-intervention to determine cervical cancer screening rates. We found statistically significant increases from pre-test to post-test in the percentage of questions correctly answered and in participants' intention to screen for cervical cancer. The greatest improvements were observed in responses to questions on knowledge, symptoms and prevention, with some items increasing up to 62% from pre-test to post-test. Of the 123 women reached for follow-up, 50 (40.7%) screened for cervical cancer. This theory-based education intervention significantly increased knowledge of and intention to screen for cervical cancer, and may be replicated in similar settings to promote awareness and increase screening rates.

  10. The use of theory based semistructured elicitation questionnaires: formative research for CDC's Prevention Marketing Initiative.

    PubMed Central

    Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M

    1996-01-01

    Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153

  11. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  12. Improving the Impact and Implementation of Disaster Education: Programs for Children Through Theory-Based Evaluation.

    PubMed

    Johnson, Victoria A; Ronan, Kevin R; Johnston, David M; Peace, Robin

    2016-11-01

    A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches.

  13. Graph Theory-Based Brain Connectivity for Automatic Classification of Multiple Sclerosis Clinical Courses

    PubMed Central

    Kocevar, Gabriel; Stamile, Claudio; Hannoun, Salem; Cotton, François; Vukusic, Sandra; Durand-Dubief, Françoise; Sappey-Marinier, Dominique

    2016-01-01

    Purpose: In this work, we introduce a method to classify Multiple Sclerosis (MS) patients into four clinical profiles using structural connectivity information. For the first time, we try to solve this question in a fully automated way using a computer-based method. The main goal is to show how the combination of graph-derived metrics with machine learning techniques constitutes a powerful tool for a better characterization and classification of MS clinical profiles. Materials and Methods: Sixty-four MS patients [12 Clinically Isolated Syndrome (CIS), 24 Relapsing Remitting (RR), 24 Secondary Progressive (SP), and 17 Primary Progressive (PP)] along with 26 healthy controls (HC) underwent MR examination. T1 and diffusion tensor imaging (DTI) were used to obtain structural connectivity matrices for each subject. Global graph metrics, such as density and modularity, were estimated and compared between subjects' groups. These metrics were further used to classify patients using a tuned Support Vector Machine (SVM) combined with a Radial Basis Function (RBF) kernel. Results: When comparing MS patients to HC subjects, a greater assortativity, transitivity, and characteristic path length as well as a lower global efficiency were found. Using all graph metrics, the best F-Measures (91.8, 91.8, 75.6, and 70.6%) were obtained for the binary (HC-CIS, CIS-RR, RR-PP) and multi-class (CIS-RR-SP) classification tasks, respectively. When using only one graph metric, the best F-Measures (83.6, 88.9, and 70.7%) were achieved for modularity with the previous binary classification tasks. Conclusion: Based on a simple DTI acquisition associated with structural brain connectivity analysis, this automatic method allowed an accurate classification of different MS patients' clinical profiles. PMID:27826224

  14. Graph Theory-Based Brain Connectivity for Automatic Classification of Multiple Sclerosis Clinical Courses.

    PubMed

    Kocevar, Gabriel; Stamile, Claudio; Hannoun, Salem; Cotton, François; Vukusic, Sandra; Durand-Dubief, Françoise; Sappey-Marinier, Dominique

    2016-01-01

    Purpose: In this work, we introduce a method to classify Multiple Sclerosis (MS) patients into four clinical profiles using structural connectivity information. For the first time, we try to solve this question in a fully automated way using a computer-based method. The main goal is to show how the combination of graph-derived metrics with machine learning techniques constitutes a powerful tool for a better characterization and classification of MS clinical profiles. Materials and Methods: Sixty-four MS patients [12 Clinically Isolated Syndrome (CIS), 24 Relapsing Remitting (RR), 24 Secondary Progressive (SP), and 17 Primary Progressive (PP)] along with 26 healthy controls (HC) underwent MR examination. T1 and diffusion tensor imaging (DTI) were used to obtain structural connectivity matrices for each subject. Global graph metrics, such as density and modularity, were estimated and compared between subjects' groups. These metrics were further used to classify patients using a tuned Support Vector Machine (SVM) combined with a Radial Basis Function (RBF) kernel. Results: When comparing MS patients to HC subjects, a greater assortativity, transitivity, and characteristic path length as well as a lower global efficiency were found. Using all graph metrics, the best F-Measures (91.8, 91.8, 75.6, and 70.6%) were obtained for the binary (HC-CIS, CIS-RR, RR-PP) and multi-class (CIS-RR-SP) classification tasks, respectively. When using only one graph metric, the best F-Measures (83.6, 88.9, and 70.7%) were achieved for modularity with the previous binary classification tasks. Conclusion: Based on a simple DTI acquisition associated with structural brain connectivity analysis, this automatic method allowed an accurate classification of different MS patients' clinical profiles.
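
    The pipeline of graph metrics feeding a classifier can be sketched with synthetic connectomes. A nearest-centroid rule stands in for the paper's RBF-kernel SVM, and the random-graph "groups" are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_graph(n, p):
    """Symmetric adjacency matrix of an Erdos-Renyi-style random graph."""
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1)
    return A + A.T

def density(A):
    n = A.shape[0]
    return A.sum() / (n * (n - 1))

def transitivity(A):
    """Global clustering coefficient: 3 * triangles / connected triples."""
    tri = np.trace(A @ A @ A) / 6.0
    deg = A.sum(axis=1)
    triples = (deg * (deg - 1) / 2).sum()
    return 3.0 * tri / triples if triples else 0.0

# Two synthetic "groups" whose connectomes differ in wiring density
graphs = [(random_graph(30, 0.2), 0) for _ in range(20)] + \
         [(random_graph(30, 0.4), 1) for _ in range(20)]
feats  = np.array([[density(A), transitivity(A)] for A, _ in graphs])
labels = np.array([y for _, y in graphs])

# Nearest-centroid stand-in for the tuned RBF-SVM classifier
centroids = np.array([feats[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((feats[:, None, :] - centroids) ** 2).sum(-1), axis=1)
accuracy = (pred == labels).mean()
```

    With a clear density gap between the groups, even this crude classifier separates them; the paper's contribution is showing that real MS profiles are separable on such graph features.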

  15. A simple method for estimating basin-scale groundwater discharge by vegetation in the basin and range province of Arizona using remote sensing information and geographic information systems

    USGS Publications Warehouse

    Tillman, F.D.; Callegary, J.B.; Nagler, P.L.; Glenn, E.P.

    2012-01-01

    Groundwater is a vital water resource in the arid to semi-arid southwestern United States. Accurate accounting of inflows to and outflows from the groundwater system is necessary to effectively manage this shared resource, including the important outflow component of groundwater discharge by vegetation. A simple method for estimating basin-scale groundwater discharge by vegetation is presented that uses remote sensing data from satellites, geographic information systems (GIS) land cover and stream location information, and a regression equation developed within the Southern Arizona study area relating the Enhanced Vegetation Index from the MODIS sensors on the Terra satellite to measured evapotranspiration. Results computed for 16-day composited satellite passes over the study area during the 2000 through 2007 time period demonstrate a sinusoidal pattern of annual groundwater discharge by vegetation with median values ranging from around 0.3 mm per day in the cooler winter months to around 1.5 mm per day during summer. Maximum estimated annual volume of groundwater discharge by vegetation was between 1.4 and 1.9 billion m3 per year with an annual average of 1.6 billion m3. A simplified accounting of the contribution of precipitation to vegetation greenness was developed whereby monthly precipitation data were subtracted from computed vegetation discharge values, resulting in estimates of minimum groundwater discharge by vegetation. Basin-scale estimates of minimum and maximum groundwater discharge by vegetation produced by this simple method are useful bounding values for groundwater budgets and groundwater flow models, and the method may be applicable to other areas with similar vegetation types.
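
    The accounting in the method can be sketched numerically. The regression coefficients, EVI values, area, and precipitation below are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

# Hypothetical linear regression linking MODIS EVI to evapotranspiration
# (mm/day); coefficients are illustrative, not those fitted in the study.
a, b = 4.0, -0.2
evi = np.array([0.15, 0.30, 0.45, 0.35, 0.20])   # one value per 16-day composite
et = a * evi + b                                  # estimated ET (mm/day)

days_per_period = 16.0
area_m2 = 1.0e9                                   # riparian vegetation area (m^2)
precip_mm = np.array([2.0, 0.0, 5.0, 1.0, 0.0])   # per-period precipitation (mm)

# Maximum estimate: all vegetation ET is drawn from groundwater.
gw_max_mm = et * days_per_period
# Minimum estimate: subtract precipitation's contribution first.
gw_min_mm = np.maximum(gw_max_mm - precip_mm, 0.0)

# Convert mm of water over the area to cubic metres (1 mm over 1 m^2 = 0.001 m^3)
gw_max_m3 = gw_max_mm.sum() * 1e-3 * area_m2
gw_min_m3 = gw_min_mm.sum() * 1e-3 * area_m2
```

    The pair (gw_min_m3, gw_max_m3) brackets groundwater discharge by vegetation, which is exactly the bounding role the abstract describes for groundwater budgets.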

  16. A new method for predicting essential proteins based on dynamic network topology and complex information.

    PubMed

    Luo, Jiawei; Kuang, Ling

    2014-10-01

    Predicting essential proteins is highly significant because organisms cannot survive or develop if even one of these proteins is missing. Improvements in high-throughput technologies have resulted in a large number of available protein-protein interactions. By taking advantage of these interaction data, researchers have proposed many computational methods to identify essential proteins at the network level. Most of these approaches focus on the topology of a static protein interaction network. However, the protein interaction network changes with time and condition. This important inherent dynamics of the protein interaction network is overlooked by previous methods. In this paper, we introduce a new method named CDLC to predict essential proteins by integrating dynamic local average connectivity and in-degree of proteins in complexes. CDLC is applied to the protein interaction network of Saccharomyces cerevisiae. The results show that CDLC outperforms five other methods (Degree Centrality (DC), Local Average Connectivity-based method (LAC), Sum of ECC (SoECC), PeC and Co-Expression Weighted by Clustering coefficient (CoEWC)). In particular, CDLC could improve the prediction precision by more than 45% compared with DC methods. CDLC is also compared with the latest algorithm CEPPK, and a higher precision is achieved by CDLC. CDLC is available as Supplementary materials. The default settings of active threshold and alpha-parameter are 0.8 and 0.1, respectively.
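
    The flavour of combining local connectivity with complex membership can be sketched on a toy network. The scoring below is an illustrative simplification, not the exact CDLC formula, and the network and complexes are assumed data.

```python
# Toy PPI network as adjacency sets, plus a list of protein complexes.
ppi = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"A", "C"},
    "E": {"F"},
    "F": {"E"},
}
complexes = [{"A", "B", "C"}, {"C", "D"}]

def lac(v):
    """Local average connectivity: mean degree of v's neighbours within
    the subgraph induced by those neighbours."""
    nb = ppi[v]
    if not nb:
        return 0.0
    return sum(len(ppi[u] & nb) for u in nb) / len(nb)

def complex_count(v):
    """In how many complexes does v participate?"""
    return sum(v in c for c in complexes)

# Combined score: local connectivity plus complex membership
scores = {v: lac(v) + complex_count(v) for v in ppi}
top = max(scores, key=scores.get)
```

    Proteins that sit in dense neighbourhoods and recur across complexes score highest, which is the intuition CDLC formalizes with dynamic (time-resolved) networks.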

  17. High-order Taylor series expansion methods for error propagation in geographic information systems

    NASA Astrophysics Data System (ADS)

    Xue, Jie; Leung, Yee; Ma, Jiang-Hong

    2015-04-01

    The quality of modeling results in GIS operations depends on how well we can track error propagating from inputs to outputs. Monte Carlo simulation, moment design and Taylor series expansion have been employed to study error propagation over the years. Among them, first-order Taylor series expansion is popular because error propagation can be studied analytically. Because most operations in GIS are nonlinear, first-order Taylor series expansion generally cannot meet practical needs, and higher-order approximation is thus necessary. In this paper, we employ Taylor series expansion methods of different orders to investigate error propagation when the random error vectors are normally and independently or dependently distributed. We also extend these methods to situations involving multi-dimensional output vectors. We apply these methods to length measurement of linear segments, the perimeter of polygons and the intersection of two line segments, all basic GIS operations. Simulation experiments indicate that the fifth-order Taylor series expansion method is the most accurate, compared with the first-order and third-order methods. Compared with the third-order expansion, however, it improves accuracy only slightly, at the expense of substantially increasing the number of partial derivatives that need to be calculated. Striking a balance between accuracy and complexity, the third-order Taylor series expansion method appears to be the more appropriate choice for practical applications.
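
    First-order propagation for the length of a line segment, checked against Monte Carlo, illustrates the baseline that the higher-order expansions refine. The endpoint coordinates and noise level are assumptions made for the example.

```python
import numpy as np

# Segment endpoints and i.i.d. Gaussian noise on every coordinate
p1, p2 = np.array([0.0, 0.0]), np.array([3.0, 4.0])
sigma = 0.1

d = p2 - p1
L = np.linalg.norm(d)                       # true length: 5.0
# Gradient of L with respect to (x1, y1, x2, y2)
J = np.concatenate([-d / L, d / L])
# First-order Taylor propagation: var(L) ~ sigma^2 * J . J = 2 * sigma^2 here
var_taylor = sigma ** 2 * np.dot(J, J)
std_taylor = np.sqrt(var_taylor)

# Monte Carlo reference
rng = np.random.default_rng(0)
N = 200_000
noise = sigma * rng.normal(size=(N, 4))
lengths = np.linalg.norm((p2 + noise[:, 2:]) - (p1 + noise[:, :2]), axis=1)
std_mc = lengths.std()
```

    Because the length function is nearly linear when sigma is small relative to L, the first-order estimate matches the simulation closely; the higher-order terms the paper studies matter as the nonlinearity grows.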

  18. A Formative Evaluation of Healthy Heroes: A Photo Comic Book-Social Cognitive Theory Based Obesity Prevention Program

    ERIC Educational Resources Information Center

    Branscum, Paul; Housley, Alexandra; Bhochhibhoya, Amir; Hayes, Logan

    2016-01-01

    Purpose: Low consumption of fruits and vegetables is often associated with poor diet quality, and childhood obesity. The purpose of this study was to assess the feasibility, and conduct a formative evaluation, of Healthy Heroes, an innovative, social cognitive theory-based program that uses child created photo-comic books to promote fruit and…

  19. Efficacy of theory-based interventions to promote physical activity. A meta-analysis of randomised controlled trials.

    PubMed

    Gourlan, M; Bernard, P; Bortolon, C; Romain, A J; Lareyre, O; Carayol, M; Ninot, G; Boiché, J

    2016-01-01

    Implementing theory-based interventions is an effective way to influence physical activity (PA) behaviour in the population. This meta-analysis aimed to (1) determine the global effect of theory-based randomised controlled trials dedicated to the promotion of PA among adults, (2) measure the actual efficacy of interventions against their theoretical objectives and (3) compare the efficacy of single- versus combined-theory interventions. A systematic search through databases and review articles was carried out. Our results show that theory-based interventions (k = 82) significantly impact the PA behaviour of participants (d = 0.31, 95% CI [0.24, 0.37]). While moderation analyses revealed no efficacy difference between theories, interventions based on a single theory (d = 0.35; 95% CI [0.26, 0.43]) reported a higher impact on PA behaviour than those based on a combination of theories (d = 0.21; 95% CI [0.11, 0.32]). In spite of the global positive effect of theory-based interventions on PA behaviour, further research is required to better identify the specificities, overlaps or complementarities of the components of interventions based on relevant theories.

  20. Effects of a Theory-Based Feedback and Consultation Process on Instruction and Learning in College Classrooms

    ERIC Educational Resources Information Center

    Hampton, Scott E.; Reiser, Robert A.

    2004-01-01

    This study examined how midterm student ratings feedback provided to teaching assistants via a theory-based ratings instrument, combined with consultation on instructional practices, would affect teaching practices, ratings of teaching effectiveness, and student learning and motivation. The student ratings instrument that was employed focused on a…

  1. A Proposal of Product Development Collaboration Method Using User Support Information and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Tanaka, Mitsuru; Kataoka, Masatoshi; Koizumi, Hisao

    As the market changes more rapidly and new products continue to get more complex and multifunctional, product development collaboration with competent partners and leading users is becoming more important for bringing successful new products to market in a timely manner. ECM (engineering chain management) and SCM (supply chain management) are supply-side approaches toward this collaboration. In this paper, we propose a demand-side approach toward product development collaboration with users, based on the information gathered through user support interactions. The approach and methodology proposed here were applied to a real data set, and their effectiveness was verified.

  2. New Term Weighting Formulas for the Vector Space Method in Information Retrieval

    SciTech Connect

    Chisholm, E.; Kolda, T.G.

    1999-03-01

    The goal in information retrieval is to enable users to automatically and accurately find data relevant to their queries. One possible approach to this problem is to use the vector space model, which models documents and queries as vectors in the term space. The components of the vectors are determined by the term weighting scheme, a function of the frequencies of the terms in the document or query as well as throughout the collection. We discuss popular term weighting schemes and present several new schemes that offer improved performance.
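
    The classic tf-idf instance of such a weighting scheme can be sketched as follows; the toy corpus is an assumption, and the paper's new schemes are variations on this template.

```python
import math

docs = [
    "information retrieval with the vector space model",
    "term weighting schemes for information retrieval",
    "the quick brown fox",
]
query = "vector space retrieval"

vocab = sorted({w for d in docs for w in d.split()})
N = len(docs)
df = {w: sum(w in d.split() for d in docs) for w in vocab}  # document frequency

def weight(text):
    """Classic tf-idf term weighting: tf * log(N / df)."""
    tf = {}
    for w in text.split():
        tf[w] = tf.get(w, 0) + 1
    return [tf.get(w, 0) * math.log(N / df[w]) for w in vocab]

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

qv = weight(query)
sims = [cosine(weight(d), qv) for d in docs]
best = max(range(N), key=lambda i: sims[i])
```

    The document sharing the rarest query terms ("vector", "space") ranks first; changing only the `weight` function swaps in alternative schemes without touching the retrieval machinery.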

  3. Evaluation of non-animal methods for assessing skin sensitisation hazard: A Bayesian Value-of-Information analysis.

    PubMed

    Leontaridou, Maria; Gabbert, Silke; Van Ierland, Ekko C; Worth, Andrew P; Landsiedel, Robert

    2016-07-01

    This paper offers a Bayesian Value-of-Information (VOI) analysis for guiding the development of non-animal testing strategies, balancing information gains from testing with the expected social gains and costs from the adoption of regulatory decisions. Testing is assumed to have value if, and only if, the information revealed from testing triggers a welfare-improving decision on the use (or non-use) of a substance. As an illustration, our VOI model is applied to a set of five individual non-animal prediction methods used for skin sensitisation hazard assessment, seven battery combinations of these methods, and 236 sequential 2-test and 3-test strategies. Their expected values are quantified and compared to the expected value of the local lymph node assay (LLNA) as the animal method. We find that battery and sequential combinations of non-animal prediction methods reveal a significantly higher expected value than the LLNA. This holds for the entire range of prior beliefs. Furthermore, our results illustrate that the testing strategy with the highest expected value does not necessarily have to follow the order of key events in the sensitisation adverse outcome pathway (AOP).
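
    The value-of-information logic can be sketched for a single test. All priors, losses, and test accuracies below are illustrative assumptions, not the paper's figures.

```python
# Toy value-of-information calculation for one hazard test.
prior = 0.3          # prior belief the substance is a sensitiser
sens, spec = 0.9, 0.8
cost_test = 1.0
loss_fn = 100.0      # social loss: using a sensitiser (false negative)
loss_fp = 20.0       # social loss: banning a safe substance (false positive)

# Best decision without testing: the action with the lower expected loss
loss_use, loss_ban = prior * loss_fn, (1 - prior) * loss_fp
loss_no_test = min(loss_use, loss_ban)

# With testing: act on the outcome (positive -> ban, negative -> use)
p_fn = prior * (1 - sens)            # sensitiser, test negative -> used anyway
p_fp = (1 - prior) * (1 - spec)      # safe, test positive -> banned anyway
loss_test = p_fn * loss_fn + p_fp * loss_fp + cost_test

voi = loss_no_test - loss_test       # testing has value iff voi > 0
```

    Repeating this calculation across the prior range, and over batteries and sequences of tests, is exactly how the paper compares non-animal strategies with the LLNA.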

  4. Report on New Methods for Representing and Interacting with Qualitative Geographic Information, Stage 2: Task Group 4 Message-Focused Use Case

    DTIC Science & Technology

    2014-12-17

    New Methods for Representing and Interacting with Qualitative Geographic Information Contract #: W912HZ-12-P-0334 Contract Period: July 1, 2014...Penn State University Report on New Methods for Representing and Interacting with Qualitative Geographic Information, Stage 2: Task Group...

  5. Retrieval practice is an efficient method of enhancing the retention of anatomy and physiology information.

    PubMed

    Dobson, John L

    2013-06-01

    Although a great deal of empirical evidence has indicated that retrieval practice is an effective means of promoting learning and memory, very few studies have investigated the strategy in the context of an actual class. The primary purpose of this study was to determine if a series of very brief retrieval quizzes could significantly improve the retention of previously tested information throughout an anatomy and physiology course. A second purpose was to determine if there were any significant differences between expanding and uniform patterns of retrieval that followed a standardized initial retrieval delay. Anatomy and physiology students were assigned to either a control group or groups that were repeatedly prompted to retrieve a subset of previously tested course information via a series of quizzes that were administered on either an expanding or a uniform schedule. Each retrieval group completed a total of 10 retrieval quizzes, and the series of quizzes required only a total of 2 h to complete. Final retention of the exam subset material was assessed during the last week of the semester. There were no significant differences between the expanding and uniform retrieval groups, but both retained an average of 41% more of the subset material than did the control group (ANOVA, F = 129.8, P = 0.00, ηp² = 0.36). In conclusion, retrieval practice is a highly efficient and effective strategy for enhancing the retention of anatomy and physiology material.

  6. Method and apparatus for optimizing a train trip using signal information

    DOEpatents

    Kumar, Ajith Kuttannair; Daum, Wolfgang; Otsubo, Tom; Hershey, John Erik; Hess, Gerald James

    2013-02-05

    One embodiment of the invention includes a system for operating a railway network comprising a first railway vehicle (400) during a trip along track segments (401/412/420). The system comprises a first element (65) for determining travel parameters of the first railway vehicle (400), a second element (65) for determining travel parameters of a second railway vehicle (418) relative to the track segments to be traversed by the first vehicle during the trip, a processor (62) for receiving information from the first (65) and the second (65) elements and for determining a relationship between occupation of a track segment (401/412/420) by the second vehicle (418) and later occupation of the same track segment by the first vehicle (400), and an algorithm embodied within the processor (62) having access to the information to create a trip plan that determines a speed trajectory for the first vehicle (400), wherein the speed trajectory is responsive to the relationship and further in accordance with one or more operational criteria for the first vehicle (400).

  7. The development of systematic quality control method using laboratory information system and unity program.

    PubMed

    Min, Won-Ki; Lee, Woochang; Park, Hyosoon

    2002-01-01

    Quality control (QC) processes are performed to detect and correct errors in the laboratory. Systematic errors are repeated and affect all subsequent laboratory processes, which makes it necessary for every laboratory to detect and correct them effectively and efficiently. We developed an on-line quality assurance system for the detection and correction of systematic error, and linked it to Unity Plus/Pro (Bio-Rad Laboratories, Irvine, USA), a commercially available quality management system. The laboratory information system, based on the client-server paradigm, was developed using an NCR3600 (NCR, West Columbia, USA) as the server; the server database was Oracle 7.2 (Oracle, Belmont, USA) and the development tool was PowerBuilder (Powersoft, Burlington, UK). Each QC material is registered, given its own identification number, and tested in the same way as a patient sample. The resulting QC data are entered into the Unity Plus/Pro program by an in-house data-entry program or by manual input. By implementing the in-house laboratory information system (LIS) and linking it to Unity Plus/Pro, we could apply Westgard's multi-rule procedure for a higher error detection rate, resulting in more systematic and precise quality assurance of laboratory results, complementary to conventional external quality assessment.
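
    The Westgard multi-rule evaluation mentioned here can be illustrated with a minimal sketch. Only two of the classic rules are shown, using the textbook ±2 SD and ±3 SD control limits rather than any laboratory-specific configuration:

    ```python
    def westgard_flags(values, mean, sd):
        """Flag QC runs with two classic Westgard rules.

        1-3s: a single control value beyond +/-3 SD (random error).
        2-2s: two consecutive values beyond the same +/-2 SD limit
        (systematic error). A full implementation would add 4-1s,
        10-x, R-4s, and so on.
        """
        z = [(v - mean) / sd for v in values]
        flags = []
        for i, zi in enumerate(z):
            if abs(zi) > 3:
                flags.append((i, "1-3s"))
            if i > 0 and z[i - 1] > 2 and zi > 2:
                flags.append((i, "2-2s"))
            if i > 0 and z[i - 1] < -2 and zi < -2:
                flags.append((i, "2-2s"))
        return flags
    ```

    A 2-2s violation suggests a systematic shift (for example, a calibration drift) rather than random error, which is exactly the class of error the abstract's on-line system targets.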

  8. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in simulation of large-scale metabolic networks, species development, and genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as transition of the system functionality due to modification in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze migration of specific functions in biosystems which undergo structure transition induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poietic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight into processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  9. Precision calibration method for binocular vision measurement systems based on arbitrary translations and 3D-connection information

    NASA Astrophysics Data System (ADS)

    Yang, Jinghao; Jia, Zhenyuan; Liu, Wei; Fan, Chaonan; Xu, Pengtao; Wang, Fuji; Liu, Yang

    2016-10-01

    Binocular vision systems play an important role in computer vision, and high-precision system calibration is a necessary and indispensable process. In this paper, an improved calibration method for binocular stereo vision measurement systems based on arbitrary translations and 3D-connection information is proposed. First, a new method for calibrating the intrinsic parameters of a binocular vision system based on two translations with an arbitrary angle difference is presented, which reduces the effect of the deviation of the motion actuator on calibration accuracy. This method is simpler and more accurate than existing active-vision calibration methods and can provide a better initial value for the determination of extrinsic parameters. Second, a 3D-connection calibration and optimization method is developed that links the information of the calibration target in different positions, further improving the accuracy of the system calibration. Calibration experiments show that the calibration error can be reduced to 0.09%, outperforming traditional methods in the experiments of this study.
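
    Once a binocular system is calibrated, 3-D measurement reduces to triangulating corresponding image points through the two projection matrices. A minimal linear (DLT) triangulation sketch, independent of the paper's specific calibration procedure:

    ```python
    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one 3-D point from two views.

        P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel
        coordinates of the same point in each image. Returns the 3-D
        point minimising the algebraic error.
        """
        # Each view contributes two homogeneous linear constraints on X.
        A = np.vstack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]               # null-space vector = homogeneous solution
        return X[:3] / X[3]      # de-homogenise
    ```

    For noise-free correspondences this recovers the point exactly; with noisy data the residual of the reprojection is the quantity a calibration method such as the paper's aims to reduce.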

  10. Pornographic information of Internet views detection method based on the connected areas

    NASA Astrophysics Data System (ADS)

    Wang, Huibai; Fan, Ajie

    2017-01-01

    Nowadays, online pornographic video broadcasting and downloading are very popular. In view of the widespread phenomenon of Internet pornography, this paper proposes a new method of pornographic video detection based on connected areas. First, the video is decoded into a series of static images and skin color is detected in the extracted key frames. If the skin-color area reaches a certain threshold, the AdaBoost algorithm is used to detect the human face. Finally, the connectivity between the face region and the large skin-color area is judged to decide whether a sensitive area is present. The experimental results show that the method can effectively filter out non-pornographic videos of people wearing little clothing. The method improves detection efficiency and reduces the workload of manual review.
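
    The first and last stages of such a pipeline, skin-colour thresholding and connected-area analysis, can be sketched in a few lines. The YCbCr box thresholds below are a widely used heuristic, not necessarily the paper's values, and the AdaBoost face-detection stage is omitted:

    ```python
    import numpy as np
    from collections import deque

    def skin_mask(rgb):
        """Per-pixel skin classifier using a common YCbCr box rule.

        The Cb/Cr ranges (77-127, 133-173) are a standard heuristic.
        rgb: float array of shape (H, W, 3) with values in 0..255.
        """
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

    def largest_connected_area(mask):
        """Size of the largest 4-connected region of True pixels (BFS)."""
        seen = np.zeros_like(mask, dtype=bool)
        best = 0
        h, w = mask.shape
        for sy in range(h):
            for sx in range(w):
                if mask[sy, sx] and not seen[sy, sx]:
                    size, q = 0, deque([(sy, sx)])
                    seen[sy, sx] = True
                    while q:
                        y, x = q.popleft()
                        size += 1
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                            if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                                seen[ny, nx] = True
                                q.append((ny, nx))
                    best = max(best, size)
        return best
    ```

    A frame would then be flagged for further (face-connectivity) analysis only when the largest connected skin region exceeds a threshold fraction of the frame.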

  11. Methods for semi-automated indexing for high precision information retrieval

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  12. Methods for Semi-automated Indexing for High Precision Information Retrieval

    PubMed Central

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    Objective. To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Design. Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. Participants. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Measurements. Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. Results. Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Summary. Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy. PMID:12386114

  13. Empirical studies on informal patient payments for health care services: a systematic and critical review of research methods and instruments

    PubMed Central

    2010-01-01

    Background Empirical evidence demonstrates that informal patient payments are an important feature of many health care systems. However, the study of these payments is a challenging task because of their potentially illegal and sensitive nature. The aim of this paper is to provide a systematic review and analysis of key methodological difficulties in measuring informal patient payments. Methods The systematic review was based on the following eligibility criteria: English language publications that reported on empirical studies measuring informal patient payments. There were no limitations with regard to the year of publication. The content of the publications was analysed qualitatively and the results were organised in the form of tables. Data sources were Econlit, Econpapers, Medline, PubMed, ScienceDirect, SocINDEX. Results Informal payments for health care services are most often investigated in studies involving patients or the general public, but providers and officials are also sample units in some studies. The majority of the studies apply a single mode of data collection that involves either face-to-face interviews or group discussions. One of the main methodological difficulties reported in the publications concerns the inability of some respondents to distinguish between official and unofficial payments. Another complication is associated with the refusal of some respondents to answer questions on informal patient payments. We do not exclude the possibility that we have missed studies that reported in non-English language journals as well as very recent studies that are not yet published. Conclusions Given the recent evidence from research on survey methods, a self-administered questionnaire during a face-to-face interview could be a suitable mode of collecting sensitive data, such as data on informal patient payments. PMID:20849658

  14. Designing Health Websites Based on Users’ Web-Based Information-Seeking Behaviors: A Mixed-Method Observational Study

    PubMed Central

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon

    2016-01-01

    Background Laypeople increasingly use the Internet as a source of health information, but finding and discovering the right information remains problematic. These issues are partially due to the mismatch between the design of consumer health websites and the needs of health information seekers, particularly the lack of support for “exploring” health information. Objective The aim of this research was to create a design for consumer health websites by supporting different health information–seeking behaviors. We created a website called Better Health Explorer with the new design. Through the evaluation of this new design, we derive design implications for future implementations. Methods Better Health Explorer was designed using a user-centered approach. The design was implemented and assessed through a laboratory-based observational study. Participants tried to use Better Health Explorer and another live health website. Both websites contained the same content. A mixed-method approach was adopted to analyze multiple types of data collected in the experiment, including screen recordings, activity logs, Web browsing histories, and audiotaped interviews. Results Overall, 31 participants took part in the observational study. Our new design showed a positive result for improving the experience of health information seeking, by providing a wide range of information and an engaging environment. The results showed better knowledge acquisition, a higher number of page reads, and more query reformulations in both focused and exploratory search tasks. In addition, participants spent more time to discover health information with our design in exploratory search tasks, indicating higher engagement with the website. Finally, we identify 4 design considerations for designing consumer health websites and health information–seeking apps: (1) providing a dynamic information scope; (2) supporting serendipity; (3) considering trust implications; and (4) enhancing interactivity

  15. Complementary medicine use among cancer patients receiving radiotherapy and chemotherapy: methods, sources of information and the need for counselling.

    PubMed

    Pihlak, R; Liivand, R; Trelin, O; Neissar, H; Peterson, I; Kivistik, S; Lilo, K; Jaal, J

    2014-03-01

    Complementary medicine (CM) use is common among cancer patients. However, little is known about CM products that are utilised during radiotherapy and/or chemotherapy. Out of 62 cancer patients who completed a specialised survey, 35 (56%) consumed some type of CM during active anti-cancer therapy. Cancer patients reported the use of herbal teas (52%), vitamins and other dietary supplements (45%), vegetables and juices (39%), special diets (19%), herbal medicines, including Chinese medicines (19%) and 'immunomodulators' (3%). Most patients (86%) consumed CM products every day. However, nearly 47% of CM users did not admit this to their oncologists. The majority of CM users (85%) were convinced that supplementary products increase the efficacy of standard anti-cancer therapy and prolong their survival. Information about CM was mainly obtained through internet sources (36%), books and brochures (25%). Although most CM users (82%) trusted the received information, 73% of them admitted that additional information about CM methods would be necessary. Patients would like to receive additional information through a specialised consultation (60%), but also from brochures (44%) and the internet (20%). Adequate counselling of patients is of paramount importance, since some CM methods may cause significant side effects and decrease the efficacy of radiotherapy and/or chemotherapy.

  16. A new method of CCD dark current correction via extracting the dark Information from scientific images

    NASA Astrophysics Data System (ADS)

    Ma, Bin; Shang, Zhaohui; Hu, Yi; Liu, Qiang; Wang, Lifan; Wei, Peng

    2014-07-01

    We have developed a new method to correct dark current at relatively high temperatures for Charge-Coupled Device (CCD) images when dark frames cannot be obtained on the telescope. For images taken with the Antarctic Survey Telescopes (AST3) in 2012, due to the low cooling efficiency, the median CCD temperature was -46°C, resulting in a high dark current level of about 3e-/pix/sec, even comparable to the sky brightness (10e-/pix/sec). If not corrected, the nonuniformity of the dark current could even outweigh the photon noise of the sky background. However, dark frames could not be obtained during the observing season because the camera was operated in frame-transfer mode without a shutter, and the telescope was unattended in winter. Here we present an alternative, but simple and effective method to derive the dark current frame from the scientific images. Then we can scale this dark frame to the temperature at which the scientific images were taken, and apply the dark frame corrections to the scientific images. We have applied this method to the AST3 data, and demonstrated that it can reduce the noise to a level roughly as low as the photon noise of the sky brightness, solving the high noise problem and improving the photometric precision. This method will also be helpful for other projects that suffer from similar issues.
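
    The scaling step described here, adjusting a derived dark frame to the temperature of each science frame, can be sketched using the common rule of thumb that CCD dark current roughly doubles every ~6°C. The doubling interval and all numbers below are illustrative, not the AST3 calibration:

    ```python
    import numpy as np

    def scale_dark(dark_ref, t_ref, t_sci, doubling_deg=6.0):
        """Scale a dark-current-rate frame from t_ref to t_sci (deg C).

        Assumes dark current roughly doubles every `doubling_deg`
        degrees; the interval is an illustrative parameter.
        """
        return dark_ref * 2.0 ** ((t_sci - t_ref) / doubling_deg)

    def correct_science(image, dark_ref, t_ref, t_sci, exptime):
        """Subtract the temperature-scaled dark (dark_ref in e-/pix/s)
        from a science frame, given its exposure time in seconds."""
        return image - scale_dark(dark_ref, t_ref, t_sci) * exptime
    ```

    In practice the reference dark frame itself would be the one derived from the science images, and the scaling law would be fitted per-detector rather than taken from the rule of thumb.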

  17. Information Acquisition and Representation Methods for Real-Time Asset Management

    DTIC Science & Technology

    2011-02-01

    Method for Learning Abatement Potential of Emissions Reduction Technologies,” Winter Page 14 Simulation Conference, 2009. M. D. Rossetti , R. R...with Correlated Knowledge-Gradients,” Winter Simulation Conference, M. D. Rossetti , R. R. Hill, B. Johansson, A. Dunkin, and R. G. Ingalls, eds, 2009

  18. Background information for the Leaching environmental Assessment Framework (LEAF) test methods

    EPA Science Inventory

    The U.S. Environmental Protection Agency Office of Resource Conservation and Recovery has initiated the review and validation process for four leaching tests under consideration for inclusion into SW-846: Method 1313 "Liquid-Solid Partitioning as a Function of Extract pH for Co...

  19. 75 FR 8817 - Annual Submission of Tax Information for Use in the Revenue Shortfall Allocation Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ...The Surface Transportation Board (Board) is amending 49 CFR part 1135 to add a rule that requires the Association of American Railroads (AAR) to annually update each Class I railroad's weighted average State tax rate for use in the Revenue Shortfall Allocation Method (RSAM). RSAM is one of three benchmarks that together are used to determine the reasonableness of a challenged rate under the...

  20. 77 FR 24684 - Proposed Information Collection; Comment Request; 2013-2015 American Community Survey Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... magnitude requires that the ACS continue research, testing and evaluations aimed at improving data quality... rates. Testing may also include methods that might increase data quality. At this time, plans are in... data quality or testing new questions that have an urgent need to be included on the ACS. During...

  1. Understanding the Effects of Different Study Methods on Retention of Information and Transfer of Learning

    ERIC Educational Resources Information Center

    Egan, Rylan G.

    2012-01-01

    Introduction: The following study investigates relationships between spaced practice (re-studying after a delay) and transfer of learning. Specifically, the impact on learners' ability to transfer learning after participating in spaced model-building or unstructured study of narrated text. Method: Subjects were randomly assigned either to a…

  2. A Neighborhood Notion of Emergent Literacy: One Mixed Methods Inquiry to Inform Community Learning

    ERIC Educational Resources Information Center

    Hoffman, Emily Brown; Whittingham, Colleen E.

    2017-01-01

    Using a convergent parallel mixed methods design, this study considered the early literacy and language environments actualized by childcare providers and parents of young children (ages 3-5) living in one large urban community in the United States of America. Both childcare providers and parents responded to questionnaires and participated in…

  3. Clustering Methods; Part IV of Scientific Report No. ISR-18, Information Storage and Retrieval...

    ERIC Educational Resources Information Center

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Two papers are included as Part Four of this report on Salton's Magical Automatic Retriever of Texts (SMART) project report. The first paper: "A Controlled Single Pass Classification Algorithm with Application to Multilevel Clustering" by D. B. Johnson and J. M. Laferente presents a single pass clustering method which compares favorably…

  4. Investigating the Nature of Method Factors through Multiple Informants: Evidence for a Specific Factor?

    ERIC Educational Resources Information Center

    Alessandri, Guido; Vecchione, Michele; Tisak, John; Barbaranelli, Claudio

    2011-01-01

    When a self-report instrument includes a balanced number of positively and negatively worded items, factor analysts often use method factors to aid model fitting. The nature of these factors, often referred to as acquiescence, is still debated. Relying upon previous results (Alessandri et al., 2010; DiStefano & Motl, 2006, 2008; Rauch, Schweizer,…

  5. 78 FR 68076 - Request for Information on Alternative Skin Sensitization Test Methods and Testing Strategies and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... niceatm@niehs.nih.gov are preferred. NICEATM, National Institute of Environmental Health Sciences, P.O... direct contact with a skin allergen, is an important public health challenge. ACD frequently develops in... chemico or in vitro methods (the direct peptide reactivity assay, human cell line activation...

  6. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
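
    The "variable number of bits" resolution idea can be illustrated with a small sketch that quantises an array to several bit depths. This shows only the resolution-reduction step, not the patented storage layout or the semantic-information handling:

    ```python
    import numpy as np

    def make_replicas(data, bit_depths=(16, 8, 4)):
        """Generate reduced-resolution replicas of a float array.

        Each replica linearly quantises the data to the given number
        of bits over the data's range; fewer bits means a smaller
        replica with larger reconstruction error.
        """
        lo, hi = float(data.min()), float(data.max())
        span = (hi - lo) or 1.0
        replicas = {}
        for bits in bit_depths:
            levels = 2 ** bits - 1
            q = np.round((data - lo) / span * levels)
            replicas[bits] = q / levels * span + lo  # dequantised view
        return replicas
    ```

    A low-bit replica could serve a quick-look analysis while the full-resolution file is fetched from slower storage, which is the kind of trade-off multi-resolution replication enables.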

  7. Collecting baseline information for national morbidity alleviation programs: different methods to estimate lymphatic filariasis morbidity prevalence.

    PubMed

    Mathieu, Els; Amann, Josef; Eigege, Abel; Richards, Frank; Sodahlon, Yao

    2008-01-01

    The lymphatic filariasis elimination program aims not only to stop transmission, but also to alleviate morbidity. Although geographically limited morbidity projects exist, few have been implemented nationally. For advocacy and planning, the program coordinators need prevalence estimates that are currently rarely available. This article compares several approaches to estimate morbidity prevalence: (1) data routinely collected during mapping or sentinel site activities; (2) data collected during drug coverage surveys; and (3) alternative surveys. Data were collected in Plateau and Nasarawa States in Nigeria and in 6 districts in Togo. In both settings, we found that questionnaires seem to underestimate the morbidity prevalence compared with existing information collected through clinical examination. We suggest that program managers use the latter for advocacy and planning, but if not available, questionnaires to estimate morbidity prevalence can be added to existing surveys. Even though such data will most likely underestimate the real burden of disease, they can be useful in resource-limited settings.

  8. The personalized reminder information and social management system (PRISM) trial: rationale, methods and baseline characteristics.

    PubMed

    Czaja, Sara J; Boot, Walter R; Charness, Neil; A Rogers, Wendy; Sharit, Joseph; Fisk, Arthur D; Lee, Chin Chin; Nair, Sankaran N

    2015-01-01

    Technology holds promise in terms of providing support to older adults. To date, there have been limited robust systematic efforts to evaluate the psychosocial benefits of technology for older people and identify factors that influence both the usability and uptake of technology systems. In response to these issues, we developed the Personal Reminder Information and Social Management System (PRISM), a software application designed for older adults to support social connectivity, memory, knowledge about topics, leisure activities and access to resources. This trial is evaluating the impact of access to the PRISM system on outcomes such as social isolation, social support and connectivity. This paper reports on the approach used to design the PRISM system, study design, methodology and baseline data for the trial. The trial is a multi-site randomized field trial. PRISM is being compared to a Binder condition where participants received a binder that contained content similar to that found on PRISM. The sample includes 300 older adults, aged 65-98 years, who lived alone and were at risk of being isolated. The primary outcome measures for the trial include indices of social isolation and support and well-being. Secondary outcome measures include indices of computer proficiency, technology uptake and attitudes towards technology. Follow-up assessments occurred at 6 and 12 months post-randomization. The results of this study will yield important information about the potential value of technology for older adults. The study also demonstrates how a user-centered iterative design approach can be incorporated into the design and evaluation of an intervention protocol.

  9. The Personalized Reminder Information and Social Management System (PRISM) Trial: Rationale, Methods and Baseline Characteristics

    PubMed Central

    Czaja, Sara J.; Boot, Walter R.; Charness, Neil; Rogers, Wendy; Sharit, Joseph; Fisk, Arthur D.; Lee, Chin Chin; Nair, Sankaran N.

    2014-01-01

    Technology holds promise in terms of providing support to older adults. To date, there have been limited robust systematic efforts to evaluate the psychosocial benefits of technology for older people and identify factors that influence both the usability and uptake of technology systems. In response to these issues, we developed the Personal Reminder Information and Social Management System (PRISM), a software application designed for older adults to support social connectivity, memory, knowledge about topics, leisure activities and access to resources. This trial is evaluating the impact of access to the PRISM system on outcomes such as social isolation, social support and connectivity. This paper reports on the approach used to design the PRISM system, study design, methodology and baseline data for the trial. The trial is a multi-site randomized field trial. PRISM is being compared to a Binder condition where participants received a binder that contained content similar to that found on PRISM. The sample includes 300 older adults, aged 65-98 years, who lived alone and were at risk of being isolated. The primary outcome measures for the trial include indices of social isolation and support and well-being. Secondary outcome measures include indices of computer proficiency, technology uptake and attitudes towards technology. Follow-up assessments occurred at 6 and 12 months post-randomization. The results of this study will yield important information about the potential value of technology for older adults. The study also demonstrates how a user-centered iterative design approach can be incorporated into the design and evaluation of an intervention protocol. PMID:25460342

  10. Methods, systems, and apparatus for storage, transfer and/or control of information via matter wave dynamics

    NASA Technical Reports Server (NTRS)

    Vestergaard Hau, Lene (Inventor)

    2012-01-01

    Methods, systems and apparatus for generating atomic traps, and for storing, controlling and transferring information between first and second spatially separated phase-coherent objects, or using a single phase-coherent object. For plural objects, both phase-coherent objects have a macroscopic occupation of a particular quantum state by identical bosons or identical BCS-paired fermions. The information may be optical information, and the phase-coherent object(s) may be Bose-Einstein condensates, superfluids, or superconductors. The information is stored in the first phase-coherent object at a first storage time and recovered from the second phase-coherent object, or the same first phase-coherent object, at a second revival time. In one example, an integrated silicon wafer-based optical buffer includes an electrolytic atom source to provide the phase-coherent object(s), a nanoscale atomic trap for the phase-coherent object(s), and semiconductor-based optical sources to cool the phase-coherent object(s) and provide coupling fields for storage and transfer of optical information.

  11. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  12. A simple ligation-based method to increase the information density in sequencing reactions used to deconvolute nucleic acid selections

    PubMed Central

    Childs-Disney, Jessica L.; Disney, Matthew D.

    2008-01-01

    Herein, a method is described to increase the information density of sequencing experiments used to deconvolute nucleic acid selections. The method is facile and should be applicable to any selection experiment. A critical feature of this method is the use of biotinylated primers to amplify and encode a BamHI restriction site on both ends of a PCR product. After amplification, the PCR reaction is captured onto streptavidin resin, washed, and digested directly on the resin. Resin-based digestion affords clean product that is devoid of partially digested products and unincorporated PCR primers. The product's complementary ends are annealed and ligated together with T4 DNA ligase. Analysis of ligation products shows formation of concatemers of different length and little detectable monomer. Sequencing results produced data that routinely contained three to four copies of the library. This method allows for more efficient formulation of structure-activity relationships since multiple active sequences are identified from a single clone. PMID:18065718

  13. Systematic review of the types of methods and approaches used to assess the effectiveness of healthcare information websites.

    PubMed

    Tieman, Jennifer; Bradley, Sandra L

    2013-01-01

    The aim of this systematic review was to identify types of approaches and methods used to evaluate the effectiveness of healthcare information websites. Simple usage data may not be sufficient to assess whether desired healthcare outcomes were achieved or to determine the relative effectiveness of different web resources on the same health topic. To establish the state of the knowledge base on assessment methods used to determine the effectiveness of healthcare websites, a structured search of the literature was conducted in Ovid Medline, resulting in the retrieval of 1611 articles, of which 240 met the inclusion criteria for the present review. The present review found that diverse evaluation methods were used to measure the effectiveness of healthcare websites. These evaluation methods were used during development, before release and after release. Economic assessment was rare and most evaluations looked at content issues, such as readability scores. Several studies did try to assess the usefulness of websites, but few studies looked at behaviour change or knowledge transfer following engagement with the designated health website. To assess the effectiveness of the knowledge transfer of healthcare information through the online environment, multiple methods may need to be used to evaluate healthcare websites and may need to be undertaken at all stages of the website development process.

  14. A multiple information fusion method for predicting subcellular locations of two different types of bacterial protein simultaneously.

    PubMed

    Chen, Jing; Xu, Huimin; He, Ping-An; Dai, Qi; Yao, Yuhua

    2016-01-01

Subcellular localization prediction of bacterial proteins is an important component of bioinformatics, with great importance for drug design and other applications. Many computational tools for predicting protein subcellular localization have been developed in recent decades. In this study, we first introduce three kinds of protein sequence encoding schemes: physicochemical-based, evolutionary-based, and GO-based. The original and consensus sequences were combined with physicochemical properties, and information from different rows and columns of the position-specific scoring matrix was considered simultaneously to capture more essential information. Computational methods based on gene ontology (GO) have been demonstrated to be superior to methods based on other features. Principal component analysis (PCA) is then applied for feature selection, and the reduced vectors are input to a support vector machine (SVM) to predict protein subcellular localization. The proposed method achieves a prediction accuracy of 98.28% and 97.87% on stringent Gram-positive (Gpos) and Gram-negative (Gneg) datasets with the jackknife test, respectively. Finally, we calculate the "absolute true overall accuracy (ATOA)", which is stricter than overall accuracy; the ATOA obtained with the proposed method reaches 97.32% and 93.06% for Gpos and Gneg. From both the rationality of the testing procedure and the success rates of the test results, the current method can improve the prediction quality of protein subcellular localization.
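As a minimal sketch of the PCA-plus-classifier stage described above (not the authors' exact pipeline: the feature vectors are synthetic stand-ins for the physicochemical/PSSM/GO encodings, and a nearest-centroid rule stands in for the SVM, for which scikit-learn's `SVC` could be substituted), the dimensionality reduction and prediction steps might look like:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project feature vectors onto their top principal components.

    X: (n_samples, n_features) array of per-protein feature vectors.
    """
    Xc = X - X.mean(axis=0)                 # center the features
    # SVD of the centered data matrix gives the principal axes in Vt
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T         # reduced representation

def nearest_centroid_predict(X_train, y_train, X_test):
    """Toy stand-in for the SVM stage: assign each vector to the
    class whose centroid is nearest in the reduced feature space."""
    labels = sorted(set(y_train))
    y_arr = np.array(y_train)
    centroids = {c: X_train[y_arr == c].mean(axis=0) for c in labels}
    return [min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))
            for x in X_test]

# Tiny synthetic example: two well-separated "localization" classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (10, 6)), rng.normal(1.0, 0.1, (10, 6))])
y = ["cytoplasm"] * 10 + ["membrane"] * 10
Z = pca_reduce(X, 2)
pred = nearest_centroid_predict(Z, y, Z)
print(sum(p == t for p, t in zip(pred, y)) / len(y))  # training accuracy
```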

  15. Novel Method for Calculating a Nonsubjective Informative Prior for a Bayesian Model in Toxicology Screening: A Theoretical Framework.

    PubMed

    Woldegebriel, Michael

    2015-11-17

In toxicology screening (forensic, food safety), several analytical errors (e.g., retention time shift, lack of repeatability in m/z scans) make it challenging to confidently identify or confirm a compound. Due to these uncertainties, a probabilistic approach is currently preferred. If a probabilistic approach is followed, the only statistical method capable of estimating the probability that the compound of interest (COI) is present or absent in a given sample is Bayesian statistics. Bayes' theorem combines prior information (prior probability) with data (likelihood) to give an optimal probability (posterior probability) reflecting the presence or absence of the COI. In this work, a novel method for calculating an informative prior probability for a Bayesian model in targeted toxicology screening is introduced. In contrast to earlier proposals making use of literature citation rates and the prior knowledge of the analyst, this method presents a thorough and nonsubjective approach. The formulation treats the probability calculation as a clustering and random-draw problem that incorporates a few analytical method parameters, meticulously estimated to reflect the sensitivity and specificity of the system. The practicality of the method has been demonstrated and validated using real data and simulated analytical techniques.
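For a binary present/absent hypothesis, the prior/likelihood/posterior combination the abstract describes reduces to a one-line application of Bayes' theorem. A hedged sketch with invented numbers (the prior of 0.1 and the two likelihoods are illustrative, not taken from the paper):

```python
def posterior_presence(prior, p_data_given_present, p_data_given_absent):
    """Bayes' theorem for a binary 'compound present/absent' hypothesis.

    prior: prior probability that the compound of interest is present
    p_data_given_present: likelihood of the observed data if it is present
    p_data_given_absent: likelihood of the same data if it is absent
    """
    # Total probability of the data (the evidence / normalizing constant)
    evidence = (prior * p_data_given_present
                + (1.0 - prior) * p_data_given_absent)
    return prior * p_data_given_present / evidence

# Hypothetical numbers: a weakly informative prior of 0.1 combined with
# data that is 20x more likely under "present" than under "absent".
post = posterior_presence(0.1, 0.8, 0.04)
print(round(post, 3))
```

Even a modest prior is pulled strongly toward "present" when the likelihood ratio favors it, which is why the paper's focus on estimating a nonsubjective prior matters for borderline detections.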

  16. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely with the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design, using this model, we decide event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Finally, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issue process and estimate its effect.
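A hypothetical sketch of the idea of event occurrence conditions: each event fires only when its condition over already-occurred events holds, so the activity sequence is derived from the conditions rather than hand-specified. The `BusinessEvent` class and the credit-card event names are invented for illustration and are not the authors' formalism:

```python
class BusinessEvent:
    def __init__(self, name, condition=lambda done: True):
        self.name = name
        self.condition = condition   # occurrence condition over finished events

    def can_occur(self, done):
        return self.name not in done and self.condition(done)

# Credit-card-issue-style example (event names invented for illustration).
events = [
    BusinessEvent("application_received"),
    BusinessEvent("credit_checked", lambda d: "application_received" in d),
    BusinessEvent("card_issued", lambda d: "credit_checked" in d),
    BusinessEvent("customer_notified", lambda d: "card_issued" in d),
]

done, order = set(), []
while len(done) < len(events):
    fired = [e for e in events if e.can_occur(done)]
    if not fired:
        raise RuntimeError("occurrence conditions deadlock")
    for e in fired:
        done.add(e.name)
        order.append(e.name)
print(order)
```

Running the loop recovers the activity sequence implied by the conditions; changing a condition (e.g., letting notification depend only on the credit check) reorders the process without redrawing any diagram.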

  17. How informative is your kinetic model?: using resampling methods for model invalidation

    PubMed Central

    2014-01-01

Background Kinetic models can present mechanistic descriptions of molecular processes within a cell. They can be used to predict the dynamics of metabolite production, signal transduction or transcription of genes. Although there has been tremendous effort in constructing kinetic models for different biological systems, not much effort has been put into their validation. In this study, we introduce the concept of resampling methods for the analysis of kinetic models and present a statistical model invalidation approach. Results We based our invalidation approach on the evaluation of a kinetic model's predictive power through cross validation and forecast analysis. As a reference point for this evaluation, we used the predictive power of an unsupervised data analysis method that does not make use of any biochemical knowledge, namely Smooth Principal Components Analysis (SPCA), on the same test sets. Through a simulation study, we showed that overly simple mechanistic descriptions can be invalidated using our SPCA-based comparative approach until a high amount of noise is present in the experimental data. We also applied our approach to an eicosanoid production model developed for humans and concluded that the model could not be invalidated using the available data, despite the simplicity of its reaction kinetics formulation. Furthermore, we analysed the high osmolarity glycerol (HOG) pathway in yeast to question the validity of an existing model, as another realistic demonstration of our method. Conclusions With this study, we have presented the potential of two resampling methods, cross validation and forecast analysis, for the analysis of kinetic models' validity. Our approach is easy to grasp and to implement, is applicable to any ordinary differential equation (ODE) type biological model, and does not suffer from the computational difficulties that seem to be a common problem for approaches proposed for similar purposes. Matlab files
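A minimal sketch of the cross-validation idea, under stated assumptions: a one-parameter first-order decay model x' = -kx stands in for a kinetic model, and a data-only mean predictor stands in for the biochemistry-free reference method (the paper itself uses Smooth Principal Components Analysis). If the mechanistic model's held-out prediction error beats the data-only baseline, the model is not invalidated:

```python
import math
import random

# Synthetic 'metabolite' time course from first-order decay x' = -k x.
k_true, x0 = 0.5, 2.0
times = [0.2 * i for i in range(10)]
random.seed(1)
data = [x0 * math.exp(-k_true * t) + random.gauss(0, 0.02) for t in times]

def fit_k(ts, xs):
    """Least-squares fit of k on log-transformed data (model: x0*exp(-k t)),
    i.e. regression of log(x0/x) on t through the origin."""
    num = sum(t * (math.log(x0) - math.log(x)) for t, x in zip(ts, xs))
    den = sum(t * t for t in ts)
    return num / den

# Leave-one-out cross validation: fit on n-1 points, predict the held-out one.
sq_err_model, sq_err_base = 0.0, 0.0
for i in range(len(times)):
    ts = [t for j, t in enumerate(times) if j != i]
    xs = [x for j, x in enumerate(data) if j != i]
    k_hat = fit_k(ts, xs)
    pred = x0 * math.exp(-k_hat * times[i])
    sq_err_model += (pred - data[i]) ** 2
    sq_err_base += (sum(xs) / len(xs) - data[i]) ** 2  # data-only baseline

print(sq_err_model < sq_err_base)  # mechanistic model is not invalidated
```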

  18. Methods and means of information-analytical assessment of asteroid and comet hazard

    NASA Astrophysics Data System (ADS)

    Kulagin, V. P.; Shustov, B. M.; Kuznetsov, Yu. M.; Kaperko, A. F.; Bober, S. A.; Obolyaeva, N. M.; Naroenkov, S. A.; Shuvalov, V. V.; Svettsov, V. V.; Popova, O. P.; Glazachev, D. O.

    2016-12-01

This paper contains a description of methods and software tools for the creation of an information-analytical system for monitoring hazardous space objects. The paper presents the structure of the system and a description of its functional components, which enable rapid assessment of the NEO hazard and forecasting of the effects of dangerous celestial bodies colliding with the Earth. The results of the system's operation regarding the modeling of the motion of space objects are also included in this work.

  19. Methods and Systems for Representing, Using and Displaying Time-Varying Information on the Semantic Web

    DTIC Science & Technology

    2013-11-26

(Abstract not available in this record; the excerpt contains only fragments of cited works, including "…Tag Multilingual Texts Through Observation," Proceedings of the Second Conference on Empirical Methods in Natural Language Processing; "…Typed Dependency Parses from Phrase Structure Parses," Proceedings of the 5th International Conference on Language Resources and Evaluation, pp. 1-6; and Dowding, J., et al., "Gemini: A Natural Language System for Spoken-Language Understanding.")

  20. Methods for providing probe position and temperature information on MR images during interventional procedures.

    PubMed

    Patel, K C; Duerk, J L; Zhang, Q; Chung, Y C; Williams, M; Kaczynski, K; Wendt, M; Lewin, J S

    1998-10-01

    Interventional magnetic resonance imaging (MRI) can be defined as the use of MR images for guiding and monitoring interventional procedures (e.g., biopsy, drainage) or minimally invasive therapy (e.g., thermal ablation). This work describes the development of a prototype graphical user interface and the appropriate software methods to accurately overlay a representation of a rigid interventional device [e.g., biopsy needle, radio-frequency (RF) probe] onto an MR image given only the probe's spatial position and orientation as determined from a three-dimensional (3-D) localizer used for interactive scan plane definition. This permits 1) "virtual tip tracking," where the probe tip location is displayed on the image without the use of separate receiver coils or a "road map" image data set, and, 2) "extending" the probe to predict its path if it were directly moved forward toward the target tissue. Further, this paper describes the design and implementation of a method to facilitate the monitoring of thermal ablation procedures by displaying and overlaying temperature maps from temperature sensitive MR acquisitions. These methods provide rapid graphical updates of probe position and temperature changes to aid the physician during the actual interventional MRI procedures without altering the usual operation of the MR imager.
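The "virtual tip tracking" overlay described above amounts to projecting the probe tip, and a point extended along the probe axis, into the scan plane's coordinate frame. A hedged sketch (the plane geometry, the `row_dir`/`col_dir` basis vectors and the 30 mm extension are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def to_image_coords(p, origin, row_dir, col_dir):
    """Project a 3-D point into a scan plane's (row, col) coordinates.

    origin: a reference corner of the image plane; row_dir/col_dir:
    orthonormal unit vectors along the image axes (hypothetical stand-ins
    for the geometry returned by the 3-D localizer).
    """
    d = np.asarray(p, float) - origin
    return np.array([d @ row_dir, d @ col_dir])

def virtual_probe(tip, direction, origin, row_dir, col_dir, extend_mm=30.0):
    """Overlay coordinates for the probe tip and its 'extended' path."""
    u = np.asarray(direction, float)
    u = u / np.linalg.norm(u)                  # unit vector along the probe
    tip2d = to_image_coords(tip, origin, row_dir, col_dir)
    ext2d = to_image_coords(np.asarray(tip, float) + extend_mm * u,
                            origin, row_dir, col_dir)
    return tip2d, ext2d

# Axial plane through the origin: rows along x, columns along y.
origin = np.zeros(3)
row_dir, col_dir = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
tip2d, ext2d = virtual_probe(tip=[10.0, 20.0, 0.0],
                             direction=[0.0, 1.0, 0.0],
                             origin=origin, row_dir=row_dir, col_dir=col_dir)
print(tip2d, ext2d)   # tip at (10, 20); extension 30 mm further along y
```

Drawing a line between the two projected points gives the predicted path of the probe if it were advanced toward the target tissue, with no separate receiver coils or road-map image set required.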